WO2019064916A1 - Simulateur de robot - Google Patents

Simulateur de robot

Info

Publication number
WO2019064916A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2018/028992
Other languages
English (en)
Japanese (ja)
Inventor
小菅 昌克
ナット タン ドアン
常田 晴弘
吉田 昌弘
Original Assignee
日本電産株式会社
Application filed by 日本電産株式会社 filed Critical 日本電産株式会社
Priority to JP2019544368A
Publication of WO2019064916A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B 19/406 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by monitoring or safety
    • G05B 19/4069 Simulating machining process on screen

Definitions

  • the present invention relates to a robot simulator.
  • Patent Document 1 describes simulating the operation of a robot in which a grinding tool is attached to an arm.
  • Japanese Patent No. 4961447
  • work element means the minimum unit of work performed by the robot in a series of work such as “take” or “place” an object.
  • the present invention aims to support examination of each work element when performing simulation on a series of work of a robot.
  • a robot simulator includes: a program execution unit that executes a robot program including a task that is information related to a work element of the robot; a state information calculation unit that calculates, based on the execution result by the program execution unit, state information that is information indicating the state of the robot according to the passage of time; and a display control unit that causes a display device to display, based on the state information obtained by the state information calculation unit, a first timing chart indicating the state of the robot, in which a period corresponding to the work element of the task can be identified on the time axis.
  • FIG. 1 is a view showing the overall configuration of a robot simulator according to the first embodiment.
  • FIG. 2 is a diagram showing a hardware configuration of each device included in the robot simulator of the first embodiment.
  • FIG. 3 is a view showing a three-dimensional CAD window according to an example of the first embodiment.
  • FIG. 4 is a view showing a teaching window according to an example of the first embodiment.
  • FIG. 5 is a diagram for conceptually explaining a job.
  • FIG. 6 is a diagram for conceptually explaining a task.
  • FIG. 7 is a view showing transition of a window for teaching according to an example of the first embodiment.
  • FIG. 8 is a diagram showing an example of the task list.
  • FIG. 9 is a view showing an example of a timing chart displayed by the robot simulator according to the first embodiment.
  • FIG. 10 is a view showing a three-dimensional CAD window according to an example of the first embodiment.
  • FIG. 11 is a functional block diagram of the robot simulator according to the first embodiment.
  • FIG. 12 is an example of a sequence chart showing processing of the robot simulator according to the first embodiment.
  • FIG. 13 is a view showing the overall configuration of a robot simulator according to the second embodiment.
  • FIG. 14 is a diagram showing a hardware configuration of each device included in the robot simulator of the second embodiment.
  • FIG. 15 is a view showing an example of a timing chart displayed by the robot simulator according to the second embodiment.
  • FIG. 16 is a functional block diagram of a robot simulator according to the second embodiment.
  • FIG. 17 is a view showing an example of a timing chart displayed by the robot simulator according to the third embodiment.
  • FIG. 18 is a view showing an example of a timing chart displayed by the robot simulator according to the third embodiment.
  • “work element” means the minimum unit of work performed by the robot in a series of work, such as “take” or “place” an object.
  • “object” means an object that is the target of the robot's work, and is not limited to an object that the robot grips (for example, a workpiece to be processed); it also includes objects related to the robot's work (for example, a shelf on which an object to be gripped by the robot is placed).
  • the robot simulator according to the first embodiment simulates the operation of an actual robot without connecting it to the actual robot, and displays the state of the robot on a timing chart.
  • FIG. 1 is a view showing the overall configuration of a robot simulator 1 of the present embodiment.
  • FIG. 2 is a diagram showing a hardware configuration of each device included in the robot simulator 1 of the present embodiment.
  • the robot simulator 1 includes an information processing device 2 and a robot control device 3.
  • the information processing device 2 and the robot control device 3 are communicably connected by, for example, a communication network cable EC.
  • the information processing apparatus 2 is an apparatus for teaching an operation to a robot installed in a factory line.
  • the information processing device 2 is provided to perform off-line teaching by the operator, and is disposed, for example, at a position away from the factory where the robot is installed (for example, a work place of the operator).
  • the robot control device 3 executes a robot program transmitted from the information processing device 2.
  • the robot control device 3 is not connected to a robot; however, when connected to a robot, it can transmit control signals according to the execution result of the robot program to operate the robot. Therefore, the robot control device 3 is preferably disposed in the vicinity of the actual robot.
  • the information processing device 2 includes a control unit 21, a storage 22, an input device 23, a display device 24, and a communication interface unit 25.
  • the control unit 21 includes a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM).
  • the ROM stores a three-dimensional CAD application program and teaching software.
  • the CPU loads the three-dimensional CAD application software (hereinafter referred to as “CAD software” as appropriate) and the teaching software from the ROM into the RAM and executes them.
  • the teaching software and the CAD software execute processing in cooperation via an API (Application Program Interface).
  • the control unit 21 continuously displays images in frame units in order to perform moving image reproduction by CAD software.
  • the storage 22 is a large-capacity storage device such as a hard disk drive (HDD) or a solid state drive (SSD), and is configured to be sequentially accessible by the CPU of the control unit 21. As described later, the storage 22 stores a task database 221, a hierarchical list database 222, a three-dimensional model database 223, and an execution log database 224. The storage 22 stores data of a three-dimensional model that is referred to when executing CAD software. In the example of the present embodiment, the storage 22 stores data of a three-dimensional model of a robot and an object (for example, a pen, a cap, a product, a pen tray, a cap tray, a product tray described later). The storage 22 stores execution log data acquired from the robot control device 3.
  • the execution log data includes robot program execution results and robot state data to be described later.
  • the robot state data is used to create a timing chart to be described later.
  • the robot state data may also be used to reproduce the motion of the robot in a virtual space as an animation by the three-dimensional CAD.
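The execution-log structure described above (the program execution result plus time-series robot state data sampled at a reference interval such as 1 ms) can be sketched as follows. This is a minimal illustration; all class and field names are assumptions, not the patent's actual format.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class RobotStateSample:
    t_ms: int                      # elapsed time from job start (e.g. 1 ms interval)
    joint_angles_deg: List[float]  # angular displacement of each joint of the arm
    joint_velocities: List[float]  # angular velocity of each joint
    hand_position: Tuple[float, float, float]  # position of the hand (world frame)

@dataclass
class ExecutionLog:
    program_result: str                        # robot program execution result
    samples: List[RobotStateSample] = field(default_factory=list)

    def series(self, extractor: Callable[[RobotStateSample], float]):
        """Extract one plottable (time, value) series for a timing chart."""
        return [(s.t_ms, extractor(s)) for s in self.samples]

log = ExecutionLog(program_result="OK")
for t in range(3):
    log.samples.append(RobotStateSample(t, [0.1 * t], [0.1], (0.0, 0.0, 0.1 * t)))

# One curve of the timing chart, e.g. the Z position of the hand.
z_series = log.series(lambda s: s.hand_position[2])
```

Animation playback would iterate the same samples and pose the 3D models accordingly.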
  • the input device 23 is a device for receiving an operation input by an operator, and includes a pointing device.
  • the display device 24 is a device for displaying execution results of teaching software and CAD software, and includes a display drive circuit and a display panel.
  • the communication interface unit 25 includes a communication circuit for performing network communication with the robot control device 3.
  • the robot control device 3 includes a control unit 31, a storage 32, and a communication interface unit 33.
  • the control unit 31 includes a CPU, a ROM, a RAM, and a control circuit.
  • the control unit 31 executes a robot program received from the information processing device 2 and outputs execution log data.
  • the execution log data includes the execution result of the robot program and robot state data of the robot that executes the work described in the robot program.
  • the storage 32 stores data of a model of the robot manipulator (a robot body including the arm 51 and the hand 52).
  • based on the execution result of the robot program, the control unit 31 calculates, as robot state data, physical quantities according to the passage of time during execution of the job (for example, angular displacement and angular velocity of the joint angles of the arm 51, the position of the hand 52, and the like) of each unit constituting the manipulator.
  • the data of the physical quantity is included in the robot state data.
  • the storage 32 is a mass storage device such as an HDD or an SSD, and is configured to be sequentially accessible by the CPU of the control unit 31.
  • the storage 32 stores a robot program and execution log data in addition to data of a model of a robot manipulator.
  • the communication interface unit 33 includes a communication circuit for performing network communication with the information processing device 2.
  • CAD software and teaching software are executed by the information processing apparatus 2.
  • the execution result of CAD software is displayed in a CAD window, and the execution result of teaching software is displayed in a teaching window.
  • the operator causes the information processing apparatus 2 to display both the CAD window and the teaching window, or to display them while switching between the two windows.
  • FIG. 3 shows a CAD window W1 according to an example of the present embodiment.
  • FIG. 3 shows an image (hereinafter referred to as a “CAD image” as appropriate) in which the robot R, the pen tray 11, the cap tray 12, the jig 13, and the product tray 14 are arranged on the table T in a virtual space.
  • the robot R performs a series of operations for assembling a product (a finished product with the cap fitted on the pen) by fitting the cap on the pen.
  • a pen group P composed of a plurality of pens is disposed on the pen tray 11, and a cap group C composed of a plurality of caps is disposed on the cap tray 12.
  • the jig 13 is a member for the robot R to temporarily arrange a pen and fit a cap.
  • the product tray 14 is a member for placing a product.
  • each pen included in the pen group P, each cap included in the cap group C, the pen tray 11, the cap tray 12, the jig 13, and the product tray 14 are examples of objects that are work targets of the robot R. A product in which a cap is fitted on a pen is also an example of an object.
  • FIG. 4 shows a teaching window W2 according to an example of the present embodiment. Displayed in the teaching window W2 are the robot R included in the CAD image of FIG. 3 and a hierarchical list showing the hierarchical relationships of the objects.
  • the teaching software can create hierarchical lists in conjunction with CAD software.
  • a tree structured data format is prepared by the teaching software as a hierarchical list.
  • with both the CAD window and the teaching window displayed, the operator selects the robot R or an object in the CAD image with the pointing device and drags it to a desired node of the tree-structured data format.
  • the hierarchical list can be completed by sequentially performing this operation on all objects required to teach the robot R.
  • as the name of each node displayed in the hierarchical list, the name of the original three-dimensional model may be applied as it is, or the name may be changed later.
  • although FIG. 3 exemplifies a CAD image in which there is one robot, when two or more robots exist, all of them can be registered in the hierarchical list.
  • nodes 61 to 63 mean the following contents.
  • the hand sequence depends on the object held by the hand 52 of the robot R, and is therefore set for each object to be held. If a task (described later) is created while no hand sequence is set, a program based on the task cannot be executed, so a warning may be displayed on the display device 24.
  • the components of the object include a jig (JIG), a pen tray (PenTray), a cap tray (CapTray), a product tray (ProductTray), pens (Pen1, Pen2, ..., Pen12), caps (Cap1, Cap2, ..., Cap12), and products (PenProduct1, PenProduct2, ..., PenProduct12).
  • Node 64 (PenProduct): jig in a state of holding product (PenProduct)
  • Node 65 (PenJ): jig in a state of holding pen (Pen)
  • Node 66 (CapJ): jig in a state of holding a cap (Cap)
  • below the node of the pen tray (PenTray), nodes corresponding to the pens Pen1, Pen2, ..., Pen12 are provided. Below the node of the cap tray (CapTray), nodes corresponding to the caps Cap1, Cap2, ..., Cap12 are provided. Below the node of the product tray (ProductTray), nodes corresponding to the products PenProduct1, PenProduct2, ..., PenProduct12 are provided.
  • the robot R (Robot_R in FIG. 4) and the objects in the hierarchical list (the jig (JIG), the pen tray (PenTray), the cap tray (CapTray), the product tray (ProductTray), the pens Pen1, Pen2, ..., Pen12, the caps Cap1, Cap2, ..., Cap12, and the products PenProduct1, PenProduct2, ..., PenProduct12) are associated with the data of the corresponding three-dimensional models. Therefore, even if the three-dimensional model of the robot R or of an object in the hierarchical list is changed after the hierarchical list is created, it is not necessary to register it in the hierarchical list again.
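The association described above can be sketched as follows: a hierarchical-list node keeps a reference to its three-dimensional model record rather than a copy, so a later change to the model is visible through the node without re-registration. Class and attribute names are assumptions for illustration only.

```python
# Hypothetical sketch of the hierarchical list / 3D-model association.
class Model3D:
    """One record in the three-dimensional model database."""
    def __init__(self, name, mesh="mesh-v1"):
        self.name = name
        self.mesh = mesh

class ListNode:
    """One node of the tree-structured hierarchical list."""
    def __init__(self, display_name, model, children=None):
        self.display_name = display_name  # may be renamed later by the operator
        self.model = model                # reference into the 3D model database
        self.children = children or []

pen1_model = Model3D("Pen1")
pen_tray = ListNode("PenTray", Model3D("PenTray"),
                    children=[ListNode("Pen1", pen1_model)])

# Editing the model in the database requires no change to the list node.
pen1_model.mesh = "mesh-v2"
```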
  • FIG. 5 is a diagram for conceptually explaining a job.
  • FIG. 6 is a diagram for conceptually explaining a task.
  • a job is a series of tasks performed by the robot R.
  • a task is information on a work element, which is a minimum unit of work performed by the robot R in a series of work. Accordingly, as shown in FIG. 5, the period corresponding to the job (JOB) includes the periods of a plurality of work elements corresponding to a plurality of tasks T1, T2, ..., Tn-1, Tn, and the movements of the robot R between work elements (that is, “movements between tasks”). As shown in FIG. 5, in the present embodiment, a plurality of tasks are defined for the job performed by the robot R.
  • each task includes a plurality of motions (each meaning a “movement of the robot R”) M1 to Mn.
  • the tasks may include hand sequences (HS) in addition to motion.
  • the hand sequence is a series of processes for gripping an object by the hand 52 of the robot R. Between adjacent motions, a waiting time WT due to an interlock or the like described later may be set.
  • in FIG. 6, a line with an arrow conceptually shows the trajectory of the hand 52 of the robot R.
  • the trajectory includes approach points AP1 and AP2 which are passing points before reaching the target point TP of the work element, the target point TP, and departure points DP1 and DP2 which are passing points after reaching the target point TP.
  • the target point TP indicates the position of the target object of the work element.
  • the motion of the robot R before reaching the approach point AP1 corresponds to a movement between work elements (that is, the movement between the previous work element and the work element shown in FIG. 6).
  • the movement of the robot R after the departure point DP2 corresponds to the movement between work elements (that is, the movement between the work element shown in FIG. 6 and the next work element).
  • the motion of the robot R from the approach point AP1 to the departure point DP2 corresponds to one work element and one task, and the movement of the robot R between adjacent points in the work element corresponds to one motion. That is, the task may include, in addition to the information on the work element, information on at least one of the target point TP, the approach point AP, and the departure point DP.
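The point/motion structure of a task described above can be sketched as follows. This is a minimal illustration under assumed names, not the patent's data format: the task holds its approach points, target point, and departure points, and each movement between adjacent points is one motion.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]

@dataclass
class Task:
    name: str
    approach_points: List[Point]    # AP1, AP2, ...
    target_point: Point             # TP
    departure_points: List[Point]   # DP1, DP2, ...

    def motions(self) -> List[Tuple[Point, Point]]:
        """Adjacent-point pairs; each pair corresponds to one motion."""
        pts = self.approach_points + [self.target_point] + self.departure_points
        return list(zip(pts, pts[1:]))

# Illustrative trajectory: descend through AP1/AP2 to TP, then retreat via DP1/DP2.
task = Task("Pickup_Pen1_From_PenTray",
            approach_points=[(0, 0, 50), (0, 0, 20)],
            target_point=(0, 0, 0),
            departure_points=[(0, 0, 20), (0, 0, 50)])
```

Here a hand sequence would be executed at the target point, between the third and fourth motions.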
  • FIG. 6 illustrates a case where an interlock is set at the approach points AP1 and AP2.
  • the interlock is processing for making the robot R wait at the target point TP, the approach point AP, and/or the departure point DP until a predetermined signal is input, in order to avoid interference with another robot or the like.
  • the task may include information on the interlock setting including the point of setting the interlock and the timeout value of the waiting time due to the interlock.
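The interlock wait described above can be sketched as follows, with time simulated as discrete steps rather than a real clock; the function and its return values are illustrative assumptions.

```python
def interlock_wait(signal_at_step, timeout_steps):
    """Wait at an interlocked point until a signal arrives or the timeout elapses.

    signal_at_step: step index at which the predetermined signal is input,
                    or None if no signal ever arrives.
    Returns ('ok', steps_waited) or ('timeout_error', timeout_steps).
    """
    for step in range(timeout_steps):
        if signal_at_step is not None and step >= signal_at_step:
            return ("ok", step)
        # in a real controller the robot would idle here for one tick
    return ("timeout_error", timeout_steps)
```

On a timeout the simulator would then apply the configured error handling (see the interlock-setting item below, which allows changing the timeout value and the on-error operation).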
  • FIG. 7 is a view showing transition of a window for teaching according to an example of the present embodiment.
  • the operator points at one of the hand nodes 61 to 63 of the robot R (Robot_R) included in the robot area RA.
  • a teaching window W3 for task creation in FIG. 7 is displayed.
  • the teaching window W3 is a screen for performing detailed setting of tasks, and displays items of task name (Name), type of work element (Function), and target of work element (Target).
  • in the item of the type of work element (Function), any work element can be selected from a pull-down menu consisting of a plurality of preset candidates (for example, Pick up, Place, and the like).
  • when an object to be the work target is selected with the pointing device from the object area PA of the hierarchical list and a left-click operation is performed on the target (Target) item of the teaching window W3, that object is input as the target.
  • the name (Name) of the task is automatically determined and displayed based on the data.
  • the operator points at the node 61 in the robot area RA, right-clicks with the pointing device, and selects “Create task”.
  • “PenTray” is input in the item of the target (Target) of the teaching window W3.
  • "Pick up” is selected from candidates for a plurality of types of work elements.
  • the teaching window W3 shown in FIG. 7 is displayed.
  • a task with the name "Pickup_Pen1_From_PenTray” is created as information related to the work element "take pen 1 from pen tray".
  • the operator can intuitively create the task by designating, on the hierarchical list, the target of the work element (here, the pen “Pen1”) and the start point (for example, the pen tray) or the end point of the work element.
  • the name of the task is automatically created so as to include the work content of the work element (for example, “Pickup”), the work target (for example, the pen “Pen1”), and the object that is the start point (for example, the pen tray) or the end point of the work element; therefore, the contents of the task can be immediately understood from its name.
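The naming rule suggested by the example “Pickup_Pen1_From_PenTray” can be sketched as follows. The exact rule used by the teaching software is not disclosed, so this is an assumption covering only “From”-style names built from the work content, the work target, and the start-point object.

```python
def make_task_name(function, target, origin):
    """Build a task name like 'Pickup_Pen1_From_PenTray' (assumed rule).

    function: work content, e.g. 'Pick up' (spaces removed in the name)
    target:   work target, e.g. 'Pen1'
    origin:   start-point object, e.g. 'PenTray'
    """
    return "_".join([function.replace(" ", ""), target, "From_" + origin])

name = make_task_name("Pick up", "Pen1", "PenTray")
```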
  • an approach point AP, a target point TP, and a departure point DP included in the task may be automatically created.
  • the operator may operate the button b1 (“detail setting”) of the teaching window W3 to set one or more approach points and / or departure points.
  • the center of gravity of the object is set as the target point TP, and the approach point AP and the departure point DP may be set on a predetermined coordinate axis (for example, the Z axis) of the local coordinate system of the object whose origin is the center of gravity.
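The default placement described above can be sketched as follows, with illustrative offsets along the Z axis of the object's local coordinate system; the offset values are assumptions.

```python
def default_points(center_of_gravity,
                   ap_offsets=(50.0, 20.0),   # illustrative AP heights above TP
                   dp_offsets=(20.0, 50.0)):  # illustrative DP heights above TP
    """Derive TP from the object's center of gravity and place APs/DPs on its Z axis."""
    cx, cy, cz = center_of_gravity
    tp = (cx, cy, cz)
    aps = [(cx, cy, cz + off) for off in ap_offsets]
    dps = [(cx, cy, cz + off) for off in dp_offsets]
    return aps, tp, dps

aps, tp, dps = default_points((10.0, 5.0, 0.0))
```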
  • motion parameters are parameters relating to the motion of the hand 52 of the robot R between adjacent approach points AP, between the approach point AP and the target point TP, and between the target point TP and the departure point DP included in the task. Such parameters include, for example, moving speed, acceleration, acceleration time, deceleration time, posture of the robot R, and the like.
  • by selecting “motion parameter setting”, the motion parameters can be changed from their default values.
  • by selecting “interlock setting”, it is possible to change from the default values the timeout value of the interlock waiting time and the setting of the operation to be performed when the waiting time exceeds the timeout value and an error is determined.
  • a button b2 (“Create") is provided in the teaching window W3 of FIG. By operating the button b2 ("Create”), the task set by the teaching window W3 is registered in a task list described later.
  • a program for executing the work element corresponding to the task shown in FIG. 6 consists of the following functions, and each function is described by a program for causing the robot R to execute one motion. Note that the first move (AP1) may be separately defined as a movement between work elements.
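The function sequence described above can be sketched as follows, recording each call instead of driving a robot. The helpers move / move_line / hand_sequence mirror the Move, Move (Line), and HandSequence motions named in the timing-chart description; the Python names and the exact call order are illustrative assumptions.

```python
calls = []  # execution trace, standing in for commands sent to the robot

def move(point):        calls.append(("Move", point))        # arc movement
def move_line(point):   calls.append(("Move(Line)", point))  # linear movement
def hand_sequence():    calls.append(("HandSequence", None)) # gripping process

def run_task(ap1, ap2, tp, dp1, dp2):
    """One work element of FIG. 6 as a sequence of motion functions."""
    move(ap1)        # may instead be defined as movement between work elements
    move(ap2)
    move_line(tp)    # approach the target point
    hand_sequence()  # grip the object at TP
    move_line(dp1)   # retreat from the target point
    move(dp2)

run_task("AP1", "AP2", "TP", "DP1", "DP2")
```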
  • the task list is information on a list of a plurality of tasks corresponding to each of a plurality of work elements included in a job performed by the robot R.
  • the tasks created for the specific job by the teaching window W3 are sequentially registered in the task list corresponding to the job.
  • the order of the plurality of tasks included in the task list indicates the execution order of the plurality of work elements respectively corresponding to the plurality of tasks.
  • the teaching window W4 in FIG. 8 displays an example of a task list corresponding to a job (“Pen Assembly”) which is a series of operations “attach a cap to a pen and assemble a product”.
  • An example of this task list includes tasks corresponding to the following six work elements (i) to (vi).
  • the teaching window W3 for creating a task shows a case where a task having a name shown in parentheses is created.
  • by selecting any task in the task list with the pointing device and performing a drag operation, the operator can arrange the tasks in an arbitrary order in the task list.
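The drag reordering described above amounts to removing a task from one index of the task list and inserting it at another, which also redefines the execution order of the corresponding work elements. A minimal sketch:

```python
def move_task(task_list, from_index, to_index):
    """Return a copy of the task list with one task moved to a new index."""
    tasks = list(task_list)
    task = tasks.pop(from_index)
    tasks.insert(to_index, task)
    return tasks

# Illustrative task names; move the third task up by one position.
reordered = move_task(["Pickup_Pen1", "Place_Pen1", "Pickup_Cap1"], 2, 1)
```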
  • a robot program is created in the information processing device 2, and the robot program is executed in the robot control device 3 as a simulation.
  • since the robot control device 3 is not connected to a real robot, the control unit 31 of the robot control device 3 calculates, as robot state data, physical quantities according to the passage of time (for example, angular displacement and angular velocity of the joint angles of the arm 51) of each unit constituting the manipulator, based on the data of the model of the manipulator of the robot R recorded in the storage 32.
  • the execution log data including the job execution result and the robot state data is transmitted from the robot control device 3 to the information processing device 2.
  • the information processing device 2 creates and displays a timing chart based on the robot state data.
  • the timing chart created by the information processing device 2 is also referred to as an "offline timing chart".
  • the offline timing chart is an example of a first timing chart.
  • FIG. 9 shows an example of a timing chart displayed on the display device 24 of the information processing device 2.
  • the timing chart illustrated in FIG. 9 corresponds to the task named “Pickup_Pen1_From_PenTray” in the job (“Pen Assembly”) illustrated in the teaching window W4 in FIG. 8.
  • the timing chart of FIG. 9 includes a part A that displays graphs and a part B that displays the task execution status, and the time from the job start time is displayed at the top.
  • cursors CU1 to CU3 are provided, and each cursor can be moved left and right with the pointing device.
  • the cursor CU1 indicates the time from the start time of the job, and the cursors CU2 and CU3 indicate times relative to the time indicated by the cursor CU1 taken as 0.
  • part A of FIG. 9 displays waveforms (graphs) in which the following robot state data of the robot R are plotted at each reference time interval (for example, 1 ms).
  • the position of the robot R is, for example, the position of the tip flange of the manipulator.
  • Command value of the position of the X coordinate (world coordinate system) of the robot R (“posX”)
  • Command value of the position of the Y coordinate (world coordinate system) of the robot R (“posY”)
  • Command value of the position of the Z coordinate (world coordinate system) of the robot R (“posZ”)
  • Angular velocity of the joint angles of the arm 51 (“Angular Velocity”)
  • the graphs displayed in part A of FIG. 9 show only a part of the plurality of types of robot state data.
  • the operator can select the type of robot state data to be displayed from among the plurality of types of robot state data.
  • in part B, the task name (“Task Function”) and the motion state (“Motion status”) are displayed.
  • the task name (“Task Function”) is “Pickup_Pen1_From_PenTray” as described above, and the period (start time and end time) of the task included in the job can be seen from FIG. 9.
  • the motion status ("Motion status") corresponds to the motion included in the task. That is, in the motion state ("Motion status"), the period (start time and end time) of the following three types of motion (here, including the hand sequence) can be known.
  • the start time and the end time of each motion match the execution start time and the execution end time of the function corresponding to each motion.
  • the start time of each task matches the start time of the first motion among the plurality of motions included in the task, and the end time of each task matches the end time of the last motion among those motions.
  • Move ... arc movement. It is mainly used for movement between approach points AP and between departure points DP.
  • Move (Line) ... linear movement. It is mainly used for movement from the approach point AP to the target point TP and from the target point TP to the departure point DP.
  • HandSequence () ... hand sequence processing
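The relation between task and motion periods described above can be sketched as a small helper that derives a task's period (start and end time) from the periods of its motions; times are milliseconds from job start, and the values are illustrative.

```python
def task_period(motion_periods):
    """Derive a task's (start, end) from its motions' (start_ms, end_ms) pairs.

    The task starts with its first motion and ends with its last motion.
    motion_periods must be in execution order.
    """
    starts = [s for s, _ in motion_periods]
    ends = [e for _, e in motion_periods]
    return (starts[0], ends[-1])

# e.g. Move, Move(Line), HandSequence periods of one task
period = task_period([(100, 250), (250, 400), (400, 520)])
```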
  • the cause of the error may be displayed in the status column of the teaching window W4 of FIG. 8.
  • causes of the error include, for example, overspeed, inability to reach the target point, reaching a singular point, and the like.
  • the information processing apparatus 2 operates the three-dimensional models of the robot R and the object in the virtual space based on the robot state data, and displays, as a simulation output, an animation of the motion of the robot R corresponding to the task. Thereby, the operator can confirm the result of the simulation of the job in the CAD window.
  • FIG. 10 shows an example of a CAD window W5 for reproducing a moving image of a result of simulation of a job.
  • buttons b4 to b8 are provided as operation objects by the operator.
  • the button b4 is an operation button for operating the robot R and the object in the virtual space so that the robot state data changes in the same manner as in the timing chart shown in FIG. 9 (that is, at normal speed).
  • the button b5 is an operation button for operating the robot R and the object in the virtual space so that the robot state data changes at twice the normal speed.
  • the button b6 is an operation button for operating the robot R and the object in the virtual space so that the robot state data changes at a speed lower than the normal speed.
  • the button b7 is an operation button for temporarily stopping the robot R operating in the virtual space and the object.
  • the button b8 is an operation button for operating the robot R and the object in the virtual space by changing the robot state data (that is, performing reverse reproduction) so that the progress of time is in the reverse direction.
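The playback controls above can be sketched as a playback rate applied to the index of the displayed robot-state sample, clamped to the recorded range (1 for normal speed, 2 for double speed, a fraction for slow, 0 for pause, negative for reverse). The per-tick model is an assumption for illustration.

```python
def advance(index, rate, n_samples):
    """Advance the displayed sample index by one UI tick at the given rate.

    rate > 0 plays forward, rate < 0 plays in reverse, rate == 0 pauses.
    The index is clamped to the recorded sample range.
    """
    new_index = index + rate
    return max(0, min(n_samples - 1, int(new_index)))

i = 10
i = advance(i, 2, 100)    # e.g. button b5: double speed
i = advance(i, -1, 100)   # e.g. button b8: reverse reproduction
```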
  • FIG. 11 is a functional block diagram of the robot simulator 1 according to the embodiment.
  • the robot simulator 1 includes a display control unit 101, a task creation unit 102, a program creation unit 103, a program execution unit 104, a state information calculation unit 105, a simulation unit 107, and an interlock setting unit 108.
  • the robot simulator 1 further includes a task database 221, a hierarchical list database 222, a three-dimensional model database 223, and an execution log database 224.
  • the processing of each of these functional units is performed by the CPU included in the control unit 21 executing the teaching software and/or the CAD software.
  • the display control unit 101 controls the display device 24 to display the execution results of the teaching software and the CAD software.
  • the control unit 21 of the information processing device 2 generates image data including the output of teaching software and CAD software, buffers it, and transmits it to the display device 24.
  • the display device 24 drives the display drive circuit to display an image on the display panel.
  • the task creation unit 102 has a function of creating a task that is information related to work elements of the robot with respect to the object based on the operation input of the operator.
  • when the control unit 21 of the information processing device 2 receives the operation input of the operator from the input device 23, the task is created as a file including information on the type (Function) and the target (Target) of the work element, and is recorded in the storage 22.
  • the control unit 21 determines the task name in accordance with a predetermined rule, based on the type (Function) of the work element and the target (Target) of the work element.
  • the storage 22 stores a task database 221 including tasks created by the task creating unit 102.
  • the control unit 21 of the information processing device 2 sequentially creates tasks corresponding to a plurality of work elements included in a job performed by the robot, and thereby creates a plurality of tasks associated with the job as a task list.
  • each task is recorded in a state associated with a specific job.
  • the display control unit 101 refers to the task database 221 of the storage 22 and displays a task list, which is a list of a plurality of tasks, on the display device 24.
  • the names of the tasks in the displayed task list are configured so that the work contents of the work elements of the robot can be recognized, allowing the operator to intuitively understand a series of work contents.
  • the display control unit 101 displays hierarchical data in which the components of the robot and the components of the object are hierarchically described on the display device 24.
  • the control unit 21 of the information processing device 2 creates a hierarchical list by linking the teaching software and the CAD software. That is, the control unit 21 creates the hierarchical list upon receiving an operation input for dragging a robot and an object, selected in a CAD image with the pointing device of the input device 23, to a desired node of the tree-structured data.
  • the control unit 21 records the created hierarchical list in the hierarchical list database 222 of the storage 22. In the hierarchical list, each node corresponding to a component of the robot or a component of the object in the tree-structured data is associated with the corresponding three-dimensional model data recorded in the three-dimensional model database 223.
  • the task creating unit 102 has a function of creating a task based on an operation input of an operator specifying a component of a robot of hierarchical data and a component of an object.
  • the control unit 21 of the information processing device 2 creates a task based on an operation input of the operator selecting, from the hierarchical list, a node indicating a hand of the robot corresponding to an operation of gripping a specific target and a node corresponding to that target. By using the hierarchical list, the operator can create tasks intuitively according to the contents of the work elements.
  • the program creation unit 103 has a function of creating a program for causing the robot 5 to execute a work element corresponding to the task based on the task created by the task creation unit 102.
  • the control unit 21 of the information processing device 2 creates a function in which a program for causing the robot to execute the work element corresponding to the task is written, based on the type of work element included in the task, the target of the work element, the approach point, the departure point, the motion parameters, and the interlock settings.
  • the program creation unit 103 automatically creates a program by referring to the information included in the task and rewriting the coordinate positions and the like into the format of a predetermined program, stored in the storage 22, according to the type of the work element.
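The stored program formats are not disclosed in the text; the following sketch only illustrates the idea of rewriting a task's coordinates and target into a per-type template. The template strings and field names are hypothetical:

```python
# Hypothetical program templates keyed by work-element type; the real
# formats are held in the storage 22 and are not disclosed in the text.
TEMPLATES = {
    "Pick":  "move_to({approach}); grasp({target}); move_to({depart})",
    "Place": "move_to({approach}); release({target}); move_to({depart})",
}

def create_program(task: dict) -> str:
    """Rewrite the coordinates and target held by a task into the
    predetermined program format for its work-element type."""
    template = TEMPLATES[task["function"]]
    return template.format(approach=task["approach_point"],
                           target=task["target"],
                           depart=task["departure_point"])
```

A "Pick" task whose target is `pen` would thus be expanded into a small program that moves to the approach point, grasps the target, and moves to the departure point.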
  • the program execution unit 104 has a function of executing a robot program including a task that is information on work elements of the robot.
  • the control unit 21 of the information processing device 2 transmits the robot program (program for simulation) created based on at least one task to the robot control device 3 via the communication interface unit 25.
  • one job includes a plurality of work elements described by the task and movement between the work elements.
  • One task includes a plurality of motions and an optional hand sequence, and a program for executing a work element corresponding to the task is defined by a plurality of functions. Movement between work elements of the robot is defined by a function described by a program including predetermined default motion parameters.
  • the control unit 31 of the robot control device 3 executes a robot program for one job by sequentially executing a plurality of functions corresponding to each motion and hand sequence.
  • the state information calculation unit 105 has a function of calculating robot state data (an example of state information), which is information indicating the state of the robot according to the passage of time, based on the execution result by the program execution unit 104. Then, based on the robot state data obtained by the state information calculation unit 105, the display control unit 101 displays on the display device 24 a timing chart (that is, the off-line timing chart) showing the state of the robot so that the period corresponding to each work element of the task can be identified on the time axis.
  • the control unit 31 of the robot control device 3 executes the robot program received from the information processing device 2 by sequentially executing the programs created based on each task. Then, the control unit 31 calculates robot state data over the entire job as the execution result of the robot program, and sequentially records the data in the storage 32.
  • the robot state data is data of physical quantities (for example, arm joint angles, arm speed and acceleration, hand position, etc.) indicating the state of the robot according to the elapsed time, calculated based on the model of the manipulator 50.
  • when the execution of the robot program for the entire job is completed, the control unit 31 reads robot state data from the storage 32 for each predetermined reference time (for example, 1 ms) along the time axis of the progress of the job. The control unit 31 transmits execution log data, including the read robot state data and the execution result of the robot program, to the information processing apparatus 2 via the communication interface unit 33.
  • when acquiring the execution log data including the robot state data from the robot control device 3, the control unit 21 of the information processing device 2 records the execution log data in the execution log database 224.
  • the control unit 21 generates an image including an off-line timing chart of a predetermined format based on the acquired robot state data for each reference time, and displays the image on the display device 24.
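As an illustration of reading the log back per reference time, the sketch below resamples irregularly recorded state data onto a fixed grid (here 1 ms). The zero-order hold between records is an assumption, since the text only states that data are read out for each reference time:

```python
def resample(log, dt_ms: float = 1.0):
    """Read back recorded robot state data at each reference time,
    holding the most recent sample between records (zero-order hold is
    an illustrative assumption).  `log` is a list of (time_ms, value)."""
    if not log:
        return []
    log = sorted(log)                 # order records by time
    end = log[-1][0]
    out, i, t = [], 0, 0.0
    while t <= end + 1e-9:
        # advance to the latest record not later than t
        while i + 1 < len(log) and log[i + 1][0] <= t:
            i += 1
        out.append((t, log[i][1]))
        t += dt_ms
    return out
```

The resampled sequence gives one state sample per reference time, which is then plotted as the waveform part of the timing chart.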
  • the timing chart is configured so that, in a series of operations of the robot, the period corresponding to a specific work element can be identified.
  • the robot state data may include information of a plurality of types of robots.
  • the display control unit 101 may display the off-line timing chart on the display device 24 based on information of a part of the robot state data selected from the plurality of types by the operation input of the operator. As a result, the operator can display on the off-line timing chart the types of robot state data he or she wishes to check.
  • the method of selecting the type of robot state data is not limited; for example, the display screen of the timing chart is preferably configured to include a menu that allows the type of robot state data to be selected by an operation of the operator.
  • the storage 22 (an example of a storage unit) stores a three-dimensional model database 223 including a three-dimensional model of a robot in a virtual space and information of a three-dimensional model of an object.
  • the simulation unit 107 has a function of operating a three-dimensional model in a virtual space and displaying it on the display device 24 based on the robot state data obtained by the state information calculation unit 105.
  • the control unit 21 of the information processing device 2 reads the execution log data recorded in the execution log database 224 and, based on the robot state data included in the execution log data, operates the three-dimensional models of the robot and the object in the virtual space and displays them on the display device 24.
  • the operator can visually confirm the operation of the robot for each task, so it becomes easy to re-examine the setting values of each task (for example, the approach point, departure point, and motion parameters) and the arrangement of the objects (for example, the arrangement of the cap tray 12 and the like in FIG. 3).
  • the simulation unit 107 may display the operation of the three-dimensional model on the display device 24 faster or slower than the speed based on the robot state data, or with the progress of time reversed. Thereby, the operator can reproduce the simulation result at a desired speed or by reverse reproduction.
  • the control unit 21 of the information processing device 2, in accordance with operations by the operator on the buttons of the CAD window, reproduces the three-dimensional models of the robot and the object as a moving image at a speed higher or lower than the normal speed, or reproduces the three-dimensional models in reverse, and displays them on the display device 24. For example, since robot state data for each 1 ms reference time is recorded in the execution log database 224, the control unit 21 creates reproduction frames according to the operation of the operator.
  • the interlock setting unit 108 has a function of setting, based on the operation input of the operator, a time-out value of the waiting time for making the robot wait at at least one of the target point, approach point, and departure point set for the task.
  • based on the operation input of the operator received by the input device 23 (for example, the operation input in the teaching window W3 of FIG. 7), the control unit 21 of the information processing device 2 accesses the storage 22 and rewrites the timeout value of the interlock waiting time set at the specific approach point or departure point included in a task in the task database 221.
  • the interlock setting unit 108 allows the operator to easily set the timeout value of the standby time due to the interlock for each work element when performing off-line teaching of the robot.
  • the simulation unit 107 may operate the three-dimensional model in the virtual space with the standby time due to the interlock invalidated. That is, when the simulation program is executed, the three-dimensional model operates without waiting for the interlock. As a result, the operator can check the execution result of the program, paying attention to the movement of the robot, without the robot stopping in the virtual space.
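A minimal sketch of an interlock wait with a per-work-element timeout, plus a flag that invalidates the wait during simulation, might look like this (the polling loop and function names are implementation assumptions):

```python
import time

def wait_interlock(signal_ready, timeout_s: float, ignore: bool = False) -> bool:
    """Wait at an approach/departure point until an interlock signal
    is ready, up to the timeout set for the work element.  When
    `ignore` is True (simulation mode), the wait is skipped so the
    virtual robot never stops.  Returns False on timeout."""
    if ignore:
        return True                   # interlock invalidated in simulation
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if signal_ready():            # e.g. an external-system I/O signal
            return True
        time.sleep(0.001)             # poll at roughly the 1 ms reference time
    return False
```

In simulation the `ignore` flag lets the three-dimensional model run through, while on the actual machine the timeout value rewritten in the task database bounds the wait.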
  • FIG. 12 is an example of a sequence chart showing job execution processing of the robot simulator according to the present embodiment.
  • the program creation unit 103 creates a robot program for causing the robot R to execute a job.
  • the robot program includes a plurality of functions in which programs for causing the robot R to execute the work elements corresponding to the tasks in the task list corresponding to the job are described.
  • the created robot program is transmitted from the information processing device 2 to the robot control device 3 (step S14) and executed by the robot control device 3. That is, the program execution unit 104 executes the robot program in task units (step S16).
  • the state information calculation unit 105 calculates robot state data, which is information indicating the state of the robot R according to the passage of time, based on the execution result by the program execution unit 104, and records execution log data including the robot state data in the storage 32 (step S18).
  • the processes of steps S16 and S18 are performed until all tasks included in the task list are completed.
  • the robot control device 3 transmits the execution log data to the information processing device 2 (step S22).
  • the execution log data includes robot state data for each predetermined reference time (for example, 1 ms) along the time axis of the progress of the job, and the execution result of the robot program.
  • the information processing device 2 records the received execution log data in the execution log database 224.
  • the display control unit 101 displays the timing chart (off-line timing chart) on the display device 24 based on the robot state data included in the execution log data received in step S22 (that is, the robot state data obtained by the state information calculation unit 105) (step S24).
  • the simulation unit 107 operates the three-dimensional model in the virtual space based on the robot state data and displays it on the display device 24 (step S26).
  • as described above, the robot simulator 1 calculates robot state data based on the execution result of the robot program corresponding to the job and, based on the robot state data, displays the off-line timing chart so that the period corresponding to each work element of the task can be identified on the time axis. Therefore, when displaying the result of simulation of a series of operations of the robot, the period corresponding to a specific work element can be identified, which supports examination of each work element.
  • FIG. 13 is a view showing the overall configuration of a robot simulator 1A of the present embodiment.
  • FIG. 14 is a diagram showing a hardware configuration of each device included in the robot simulator 1A of the present embodiment.
  • components common to those of the first embodiment are denoted by the same reference symbols.
  • the robot simulator 1A includes an information processing device 2 and a robot control device 3A.
  • the information processing device 2 and the robot control device 3A are communicably connected by, for example, a communication network cable EC.
  • the robot control device 3A is common to the robot control device 3 of the first embodiment in that it executes the robot program transmitted from the information processing device 2, but differs from the robot control device 3 in that it is connected to the robot R by a cable WC.
  • the robot control device 3A is disposed in the vicinity of the actual robot.
  • the robot control device 3A includes a control unit 31A, a storage 32, and a communication interface unit 33.
  • the control unit 31A of the robot control device 3A executes a robot program to generate a control signal for the robot R.
  • the control signal for the robot R is a signal (for example, a pulse signal) indicating a command value for the robot R.
  • the command value for the robot R is, for example, a target value such as the position, velocity, or acceleration of the robot R.
  • the control unit 31A acquires robot state data, which is data indicating the state of the robot R, from the robot R.
  • the present embodiment differs from the first embodiment in that the robot state data handled by the control unit 31A is acquired from the robot R instead of being calculated.
  • the robot state data of this embodiment includes input / output signals to / from the robot R.
  • the control unit 31A sequentially records robot state data acquired from the robot R in the storage 32.
  • the control unit 31A reads robot state data for each predetermined reference time (for example, 1 ms) along the time axis of the progress of the job from the storage 32, and transmits execution log data including the read robot state data and the execution result of the robot program to the information processing device 2.
  • the robot R includes a manipulator 50 as a robot body, a control device 55, a sensor group 56, and an input / output unit 57.
  • the manipulator 50 is a main body of the robot R, and includes an arm 51 and a hand 52, and a plurality of motors for controlling the positions and angles of the arm 51 and the hand 52.
  • the control device 55 includes a servo mechanism that controls a motor provided in the manipulator 50 so as to follow a command value received from the robot control device 3A.
  • the sensor group 56 may include a plurality of sensors that detect the position, velocity, and acceleration of the manipulator 50, and the angle, angular velocity, and the like of each joint.
  • the robot state data transmitted from the control device 55 to the robot control device 3A may be a physical quantity of the manipulator 50 detected by any of the sensors of the sensor group 56, or a physical quantity that can be estimated from a control current value or the like for the motors. For example, when an angular velocity sensor for the joint angle of the arm 51 is provided, the data of the angular velocity of the joint angle of the arm 51 may be a detection value of the angular velocity sensor, or may be a control current value in the servo mechanism that controls the motors of the manipulator 50.
  • the input / output unit 57 is an input / output interface with a system outside the robot R.
  • examples of the input signal include an interlock signal from an external system and a signal for correcting the position of the object when the robot R is disposed in a line, as described later.
  • (Timing Chart of this Embodiment) In the present embodiment, the robot R of the actual machine is connected to the robot control device 3A, and a timing chart is created in the information processing device 2 based on the robot state data acquired from the robot R.
  • the timing chart created in the present embodiment is also referred to as an “online timing chart” as appropriate.
  • the on-line timing chart is an example of a second timing chart.
  • FIG. 15 shows an example of a timing chart of the present embodiment.
  • the timing chart of FIG. 15 includes, in addition to part A displaying a graph and part B displaying the task execution status, a part C displaying input/output signals.
  • the robot state data included in the online timing chart may include the states of input/output signals of the robot. It is preferable that the input/output signals displayed on the online timing chart can be selected by the operation input of the operator from among the input/output signals provided to the robot R.
  • in part A, waveforms (graphs) are displayed in which changes at each reference time (for example, 1 ms) of the following robot state data acquired from the robot R are plotted:
  • Acquired value of the position of the X coordinate (world coordinate system) of the robot R ("posX")
  • Acquired value of the position of the Y coordinate (world coordinate system) of the robot R ("posY")
  • Acquired value of the position of the Z coordinate (world coordinate system) of the robot R ("posZ")
  • Acquired value of the angular velocity of the joint angle of the arm 51 ("Angular velocity")
  • the start time and the end time of each motion coincide with the execution start time and the execution end time of the function corresponding to each motion.
  • the start time of each task matches the start time of the first motion among the plurality of motions included in that task, and the end time of each task matches the end time of the last motion among the plurality of motions included in that task.
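The rule above can be stated directly in code: given each motion's (start, end) times, a task's period on the timing chart is bounded by its first motion's start and its last motion's end. A minimal sketch:

```python
def task_period(motions):
    """A task's period on the timing chart: its start is the start of
    its first motion and its end is the end of its last motion.
    `motions` is a list of (start_time, end_time) pairs."""
    starts = [s for s, _ in motions]
    ends = [e for _, e in motions]
    return (min(starts), max(ends))
```

This is how part B of the chart can derive each task's bar from the per-motion execution records.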
  • in part C of FIG. 15, the state (ON (H level) or OFF (L level)) of each I/O (input/output signal) is displayed.
  • the numbers in [] are predetermined I / O numbers, and the characters following [] indicate the contents of I / O.
  • the input/output signals included in part C may be indeterminate at some times. In the example shown in FIG. 15, I/O of the following contents is shown.
  • Pen set: a camera is provided in the working environment of the actual robot R, and this signal turns ON when the camera recognizes the pen tray. When the pen tray is fixed in place, the signal is always ON; however, when the pen tray is carried on a line, the signal switches from OFF to ON when the camera recognizes the pen in the pen tray.
  • Vision lock on: indicates the correction timing for correcting the command value for the robot R based on the position of the pen tray recognized by the camera installed in the working environment of the actual robot R. The signal turns ON when the correction timing is reached. The correction timing is preset for the task.
  • Hand1 grasp: turns ON when the hand grasps the pen. The timing at which it turns ON differs depending on the hand sequence. [17] Hand1 release: turns ON when the hand releases the pen.
  • for example, if this signal turns ON while the actual robot R is performing a pick-up operation, it can be understood that the hand has failed to grasp the object and has dropped it.
  • from the online timing chart, the period (start time and end time) of each task included in the job of the robot R and the period (start time and end time) of each motion included in each task can be known.
  • the time t3, which is the correction timing of the command value for the robot R, is later than the time t2, which is the start time of the motion corresponding to the command value.
  • FIG. 16 is a functional block diagram of a robot simulator 1A according to the embodiment.
  • the robot simulator 1A includes a display control unit 101, a task creation unit 102, a program creation unit 103, a program execution unit 104, a state information acquisition unit 106, a simulation unit 107, and an interlock setting unit 108.
  • the robot simulator 1A according to the present embodiment differs from the robot simulator 1 according to the first embodiment in that the state information acquisition unit 106 is provided instead of the state information calculation unit 105.
  • the control unit 21 of the information processing device 2 recognizes, for example, an operation input on the button b3 of the teaching window W3 (FIG. 8), and transmits the robot program to the robot control device 3A.
  • the control unit 31A of the robot control device 3A executes the received program for a real machine.
  • the state information acquisition unit 106 has a function of generating a control signal for the robot based on the execution result by the program execution unit 104, and of acquiring from the robot the robot state data, which is information indicating the state, according to the passage of time, of the robot operating according to the control signal.
  • the display control unit 101 displays on the display device 24 a timing chart (that is, the on-line timing chart) showing the state of the robot so that the period corresponding to each work element of the task can be identified on the time axis, based on the robot state data obtained by the state information acquisition unit 106.
  • the control unit 31A of the robot control device 3A executes the robot program received from the information processing device 2 by sequentially executing the programs created based on each task. Then, the control unit 31A generates a control signal including a command value for the robot R every time a function included in the robot program is executed, and transmits the control signal to the robot R.
  • the control device 55 of the robot R controls the motor provided in the manipulator 50 so as to follow the command value received from the robot control device 3A, and transmits robot state data to the robot control device 3A.
  • the robot state data is a physical quantity of the manipulator 50 detected by any of the sensors of the sensor group 56 or an estimated value (for example, a control current value or the like) of the physical quantity of the manipulator 50.
  • the control unit 31A of the robot control device 3A sequentially records the robot state data acquired from the robot R in the storage 32.
  • the control unit 31A reads the robot state data accumulated in the storage 32 for each predetermined reference time (for example, 1 ms), and transmits execution log data including the robot state data read for each reference time to the information processing device 2.
  • the control unit 21 of the information processing device 2 records the received execution log data in the execution log database 224.
  • the control unit 21 generates an image including a timing chart (on-line timing chart) of a predetermined format based on the robot state data included in the execution log data, and displays it on the display device 24. The timing chart makes it possible to identify the period corresponding to a specific work element in a series of operations of the robot.
  • the robot state data may include plural types of information. In that case, the display control unit 101 may display the online timing chart on the display device 24 based on information of a part of the robot state data selected from the plurality of types by the operation input of the operator.
  • the simulation unit 107 may have a function of operating the three-dimensional model in the virtual space and displaying it on the display device 24 based on the robot state data obtained by the state information acquisition unit 106. At this time, the simulation unit 107 may display the motion of the three-dimensional model on the display device 24 faster or slower than the speed based on the robot state data, or with the progress of time reversed.
  • the control unit 21 of the information processing device 2 operates the three-dimensional models of the robot and the object in the virtual space based on the robot state data for each reference time acquired from the robot R, and displays them on the display device 24.
  • the operator can visually confirm the operation of the robot for each task, so it becomes easy to re-examine the setting values of each task (for example, the approach point, departure point, and motion parameters) and the arrangement of the objects (for example, the arrangement of the cap tray 12 and the like in FIG. 3).
  • the process of displaying the online timing chart and reproducing the moving image of the three-dimensional model is substantially in common with the process of the flowchart shown in FIG. 12, but differs in that, in step S18, robot state data acquired from the robot R is recorded.
  • as described above, the robot simulator 1A of this embodiment transmits command values to the robot based on the execution result of the robot program corresponding to the job, and acquires robot state data from the robot. Then, based on the acquired robot state data, the robot simulator 1A displays the online timing chart so that the period corresponding to each work element of the task can be identified on the time axis. Therefore, when displaying the result of a series of operations of the actual robot, the period corresponding to a specific work element can be identified, which supports examination of each work element.
  • the robot simulator of the present embodiment has the functions of the robot simulator 1 of the first embodiment and the functions of the robot simulator 1A of the second embodiment, and displays both the off-line timing chart and the on-line timing chart on the display device 24. That is, in the present embodiment, the display control unit 101 displays the offline timing chart (an example of the first timing chart) and the online timing chart (an example of the second timing chart) on the display device 24 so that the states of the robot in the periods corresponding to the work elements can be compared. With this display, robot state data for the same job with and without the robot connected can be compared and examined for each work element included in the job.
  • in the storage 22, robot state data obtained by executing robot programs created for the same job (both the calculated robot state data and the robot state data acquired from the robot) is recorded. The control unit 21 of the information processing device 2 displays the off-line timing chart based on the calculated robot state data and the on-line timing chart based on the robot state data acquired from the robot on the display device 24 in a comparable manner.
  • the display mode that makes it possible to compare the offline timing chart and the online timing chart is not limited.
  • for example, the offline timing chart is displayed on the upper side of the display screen and the online timing chart is displayed on the lower side of the display screen.
  • the display control unit 101 displays the off-line timing chart and the on-line timing chart in a switchable manner, or superimposes the off-line timing chart and the on-line timing chart on a common time axis and displays them on the display device 24. With this display mode, it becomes easy to compare the off-line timing chart and the on-line timing chart.
  • FIG. 17 shows an example in which the off-line timing chart and the on-line timing chart are superimposed and displayed on a common time axis.
  • the off-line timing chart shown in FIG. 9 is displayed superimposed on the on-line timing chart shown in FIG. 15 on a common time axis.
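One way to overlay the two charts on a common time axis is to merge their sample times, as in this sketch. Returning `None` where one chart has no sample is an assumption about how gaps are handled:

```python
def superimpose(offline, online):
    """Merge an offline (simulated) and an online (actual-machine)
    timing-chart series onto a common time axis so they can be
    overlaid.  Both inputs are lists of (time, value); missing samples
    come back as None."""
    times = sorted(set(t for t, _ in offline) | set(t for t, _ in online))
    off, on = dict(offline), dict(online)
    return [(t, off.get(t), on.get(t)) for t in times]
```

Each merged row pairs the simulated and measured values at one instant, which is what the superimposed display of FIG. 17 amounts to.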
  • FIG. 18 shows an example in which the off-line timing chart and the on-line timing chart are displayed switchably.
  • a tab named “Result #1 (robot not connected)” corresponding to the offline timing chart and a tab named “Result #2 (robot connected)” corresponding to the online timing chart are provided switchably.
  • FIG. 18 shows the case where the tab named “Result # 1 (robot not connected)” is selected, and has the same display content as the off-line timing chart of FIG.
  • the display control unit 101 may display an off-line timing chart or an on-line timing chart on the display device 24 so that the states of the robot in different work elements can be compared. Thereby, among the plurality of work elements included in the robot's job, the state of the robot can be compared between different work elements.
  • based on the operation input of the operator selecting two tasks among the plurality of tasks included in the job, the control unit 21 of the information processing device 2 creates a timing chart in which the periods of the two selected tasks are arranged in parallel. The timing chart is created such that the start times of the two selected task periods coincide.
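Making the start times of two selected task periods coincide is a simple time shift, as in the sketch below (representing each task's state data as (time, value) pairs is an illustrative assumption):

```python
def align_tasks(a, b):
    """Shift two tasks' state-data segments so their start times
    coincide, for side-by-side comparison of different work elements
    on one chart.  Each input is a non-empty list of (time, value)."""
    t0a, t0b = a[0][0], b[0][0]
    return ([(t - t0a, v) for t, v in a],
            [(t - t0b, v) for t, v in b])
```

After the shift, both segments start at time zero, so their waveforms line up on a common relative time axis.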
  • the simulation unit 107 may operate the three-dimensional model based on the robot state data in a partial period designated by the operator's operation input, among the periods displayed in the off-line timing chart or the on-line timing chart, and display it on the display device 24. As a result, the operator can confirm the operation of the robot in the period he or she wants to focus on within the entire period of the job.
  • the offline timing chart or the online timing chart is provided with a button b4 (an operation button for operating the robot and the object at a normal speed) as illustrated in FIG.
  • the control unit 21 of the information processing device 2 reproduces the moving image corresponding to the off-line timing chart or the on-line timing chart based on the operation input of the button b4. At this time, by operating the cursors CU1 to CU3 provided on the timing chart, it is possible to designate a period to be a reproduction target of a moving image as a part of the periods displayed on the timing chart.
  • the display control unit 101 may display the on-line timing chart on the display device 24 so that the states of the robots in different execution results can be compared even with the same work element. Thereby, it is possible to grasp the variation of the state of the robot when trying the same task a plurality of times.
  • For example, the control unit 21 of the information processing device 2 creates a timing chart in which a plurality of sets of robot state data, acquired by executing the robot program multiple times, are plotted such that the start times of the jobs coincide.
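Overlaying the runs so that the job start times coincide can be sketched as follows; `overlay_runs` and the per-run sample layout are illustrative assumptions, not the disclosed implementation.

```python
def overlay_runs(runs):
    """runs: one sample list per execution of the robot program, each a
    list of (time, value) pairs. Every run is rebased so its job start
    is t = 0, so repeated trials plot on a common time axis and the
    run-to-run variation of the robot state becomes visible."""
    overlaid = []
    for samples in runs:
        t0 = samples[0][0]  # job start time of this run
        overlaid.append([(t - t0, v) for t, v in samples])
    return overlaid
```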
  • The information processing device 2 may provide useful information based on the robot state data acquired from the robot control device 3. For example, by providing statistics on the movement distance and the number of movements between work elements in a job, it can support consideration of where and in which direction objects are supplied to and/or discharged from a line. The ratio of the robot's operating time to its waiting time in the job may also be calculated and presented. This can support reconsidering the work order in the job, or slowing down unnecessarily fast motions.
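The operating-versus-waiting time ratio mentioned above could be computed as in this minimal sketch; the segment representation and function name are assumed for illustration.

```python
def job_time_ratio(segments):
    """segments: (duration_seconds, is_moving) pairs covering the whole
    job. Returns operating time, waiting time, and the operating share,
    the kind of figure that flags long waits or needlessly fast motions."""
    operating = sum(d for d, moving in segments if moving)
    waiting = sum(d for d, moving in segments if not moving)
    total = operating + waiting
    return operating, waiting, (operating / total if total else 0.0)
```

A low operating share points at waiting that a changed work order might remove; a high share achieved only by very fast motions suggests those motions could be slowed without lengthening the job.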
  • The robot simulator according to the embodiment described above need not have all the functions shown in the functional block diagrams of FIG. 11 and FIG. 16, and may have only some of them.
  • In the embodiment described above, the robot simulator is illustrated as comprising two devices, the information processing device and the robot control device; however, the invention is not limited thereto, and the simulator may be configured as a single integrated device.
  • In the embodiment described above, the input device of the information processing apparatus includes a pointing device, but the invention is not limited thereto, and another device may be used. For example, a display panel provided with a touch input function may be used to receive touch input from the operator.

Abstract

The purpose of the invention is to facilitate the examination of individual process elements when a simulation is performed for a series of operations carried out by a robot. To this end, the invention provides a robot simulator comprising: a program execution unit that executes a robot program including tasks, which are pieces of information relating to the robot's process elements; a state information calculation unit that calculates state information, i.e. information indicating the state of the robot over time, based on the result of the execution performed by the program execution unit; and a display control unit that displays, on a display unit, a first timing chart indicating the state of the robot on the basis of the state information obtained by the state information calculation unit, the first timing chart being displayed such that the periods corresponding to the process elements of the tasks can be distinguished on a time axis.
PCT/JP2018/028992 2017-09-26 2018-08-02 Simulateur de robot WO2019064916A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019544368A JPWO2019064916A1 (ja) 2017-09-26 2018-08-02 ロボットシミュレータ

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017184659 2017-09-26
JP2017-184659 2017-09-26

Publications (1)

Publication Number Publication Date
WO2019064916A1 true WO2019064916A1 (fr) 2019-04-04

Family

ID=65902757

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/028992 WO2019064916A1 (fr) 2017-09-26 2018-08-02 Simulateur de robot

Country Status (2)

Country Link
JP (1) JPWO2019064916A1 (fr)
WO (1) WO2019064916A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020059342A1 (fr) * 2018-09-21 2020-03-26 日本電産株式会社 Simulateur de robot
JP2021192945A (ja) * 2020-06-08 2021-12-23 株式会社安川電機 制御システム及び制御方法
US11520571B2 (en) 2019-11-12 2022-12-06 Bright Machines, Inc. Software defined manufacturing/assembly system
WO2022254538A1 (fr) * 2021-05-31 2022-12-08 ファナック株式会社 Dispositif de simulation de robot
WO2023119348A1 (fr) * 2021-12-20 2023-06-29 ファナック株式会社 Dispositif d'aide à l'enseignement de programme
US11826913B2 (en) 2020-07-01 2023-11-28 Kabushiki Kaisha Yaskawa Denki Control system, robot system and control method
WO2023243010A1 (fr) * 2022-06-15 2023-12-21 ヤマハ発動機株式会社 Dispositif d'assistance et système robot

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03188507A (ja) * 1989-12-18 1991-08-16 Fanuc Ltd 加工表示方式
JP2002132311A (ja) * 2000-10-26 2002-05-10 Citizen Watch Co Ltd 加工プログラムのグラフ表示方法及びそのための装置
JP2004148428A (ja) * 2002-10-30 2004-05-27 Honda Motor Co Ltd シミュレーション装置
JP2004174662A (ja) * 2002-11-27 2004-06-24 Fanuc Ltd ロボットの動作状態解析装置
JP2004209641A (ja) * 2002-12-30 2004-07-29 Abb Res Ltd 工業ロボットをプログラミングするための方法およびシステム
JP2009297877A (ja) * 2008-06-17 2009-12-24 Yamatake Corp ロボット教示プログラム作成装置及び作成方法
JP2011022688A (ja) * 2009-07-14 2011-02-03 Fanuc Ltd 工作機械の工具軌跡表示装置
JP2016018539A (ja) * 2014-07-11 2016-02-01 三菱電機株式会社 シミュレーション画像表示装置
JP2017126199A (ja) * 2016-01-14 2017-07-20 ファナック株式会社 ブロック時間表示手段を有する数値制御装置
JP2017138827A (ja) * 2016-02-04 2017-08-10 ファナック株式会社 加工時間予測装置

Also Published As

Publication number Publication date
JPWO2019064916A1 (ja) 2020-10-15

Similar Documents

Publication Publication Date Title
WO2019064916A1 (fr) Simulateur de robot
JP7151713B2 (ja) ロボットシミュレータ
CN110394780B (zh) 机器人的仿真装置
EP3093108B1 (fr) Appareil et procédé de traitement d'informations
US20200117168A1 (en) Runtime Controller for Robotic Manufacturing System
US7194396B2 (en) Simulation device
KR910000873B1 (ko) 조립 로보트의 제어방법과 그 제어 시스템
CN104002297B (zh) 示教系统、示教方法及机器人系统
JP2019171501A (ja) ロボットの干渉判定装置、ロボットの干渉判定方法、プログラム
JP2019171498A (ja) ロボットプログラム実行装置、ロボットプログラム実行方法、プログラム
CN113710432A (zh) 用于确定机器人的轨迹的方法
JP7259860B2 (ja) ロボットの経路決定装置、ロボットの経路決定方法、プログラム
JPWO2019064919A1 (ja) ロボット教示装置
CN109605378B (zh) 运动参数的处理方法、装置和系统及存储介质
CN110000753B (zh) 用户交互方法、控制设备及存储介质
CN109807898B (zh) 运动控制方法、控制设备及存储介质
CN109910004B (zh) 用户交互方法、控制设备及存储介质
US6798416B2 (en) Generating animation data using multiple interpolation procedures
CN113021329B (zh) 一种机器人运动控制方法、装置、可读存储介质及机器人
WO2020059342A1 (fr) Simulateur de robot
CN110000775B (zh) 设备管理方法、控制设备及存储介质
JP7167925B2 (ja) ロボット教示装置
JP7099470B2 (ja) ロボット教示装置
WO2020066947A1 (fr) Dispositif de détermination d'itinéraire de robot, procédé de détermination d'itinéraire de robot, et programme
WO2019064914A1 (fr) Dispositif d'apprentissage de robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18860197; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019544368; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18860197; Country of ref document: EP; Kind code of ref document: A1)