WO2015006199A1 - Multi-viewer for objects that interact with or depend on each other - Google Patents

Multi-viewer for objects that interact with or depend on each other

Info

Publication number
WO2015006199A1
WO2015006199A1 PCT/US2014/045552 US2014045552W
Authority
WO
WIPO (PCT)
Prior art keywords
robot
instructions
objects
viewer
sequence
Prior art date
Application number
PCT/US2014/045552
Other languages
English (en)
Inventor
Gregory F. Rossano
Carlos Martinez
Stephen H. MURPHY
Mikael Hedelind
Thomas A. Fuhlbrigge
Original Assignee
ABB Technology AG
Priority date
Filing date
Publication date
Application filed by ABB Technology AG
Publication of WO2015006199A1


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/23Pc programming
    • G05B2219/23168Display progress of program
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/34Director, elements to supervisory
    • G05B2219/34336Avoid deadlock, lock-up
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/34Director, elements to supervisory
    • G05B2219/34348Coordination of operations, different machines, robots execute different tasks

Definitions

  • Multi-viewer for Interacting or Depending Objects
  • This invention relates to systems that have two or more objects that interact with or depend on each other, and more particularly to visualizing the operation of each object and visually identifying when the objects interact with or depend on each other.
  • Traditionally, the term robot has meant a single mechanical unit that has one arm.
  • As used here, robot is broader and includes a single mechanical unit that has one or more actuated axes or a mobile platform.
  • The at least one other object may, for example, be any object that follows a procedure that requires the object to interact with or depend on a robot.
  • The other object could run or perform a procedure within the system or in a separate system.
  • One of the main challenges for a programmer or sequence planner of such systems is to understand the sequence of operations performed by the objects before, during and after their performance, and when they interact with or depend on each other.
  • A system has an object that is a robot capable of running or performing an associated procedure that causes the robot to perform a sequence of operations.
  • The system also has one or more other objects, each capable of running or performing an associated procedure that causes them to perform a sequence of operations which, when performed, interacts with or depends on the robot as the robot performs its own sequence of operations.
  • The system further has a multi-viewer capable of displaying, either before, during or after the robot and each of the one or more other objects run or perform the associated procedures, the sequence of operations of the robot, the sequence of operations of the one or more other objects, and the interaction between the robot's sequence of operations and the other objects' sequence(s) of operations.
  • Fig. 1 shows a robot system that has two or more robots that interact with each other.
  • Fig. 2 shows a screen from a standard text comparison tool.
  • Fig. 3 shows one example of the user interface for the multi-viewer utility described herein.
  • Fig. 4 shows a program in which three robot arms interact in a system.
  • Fig. 5 is an example of the present user interface that shows where robot arms interact with each other.
  • Fig. 6 shows two pointers in each program viewer, one of which represents the last known position of the robot and the other of which shows the instruction that is being or will be executed.
  • Fig. 7 shows a flowchart for the procedure to identify the coordination points between two or more robot programs.
  • Fig. 8 shows a flowchart for validating the content of the programs that have been loaded at 702 of the flowchart of Fig. 7.
  • Fig. 9 shows a flowchart for the steps that are performed to draw the viewer.
  • Fig. 10 shows an embodiment of the data structure for the coordination points.
  • Fig. 11 shows a system that has a robot arm that interacts with a robot that is a mobile platform.
  • Fig. 12 shows a system that has a robot arm that interacts with a communication module.
  • Fig. 13 shows an example of a multi-viewer editor that includes a performance profiling tool for the two robot arms whose programs are shown in the editor of Fig. 3.
  • Fig. 14 shows a Gantt chart with the profiling data shown in Fig. 13.
  • Fig. 15 shows a flowchart for identifying the candidate steps.
  • One example of a robot system that has two or more robots interacting with each other is shown in Fig. 1.
  • The arm of robot 100 picks up an engine block from a pallet 400 and orients the engine block so that robot 200 can perform the piston insertion.
  • The arm of robot 200 has a gripper 201 that robot 200 uses to pick up a piston subassembly.
  • An example of a robot system that has two or more robots working in coordination on separate workpieces in the same work cell is the FlexArc 250R MultiMove Cell available from ABB. In this system, two robots each have one arm that holds an arc welding tool. Each of the robots has an associated fixture for holding a workpiece to be welded by that robot.
  • The second robot performs an arc welding operation on its workpiece while a workpiece previously welded by the first robot is unloaded from the first robot's fixture and a new workpiece to be arc welded by the first robot is loaded onto that fixture.
  • The system then indexes and the first robot begins to arc weld its associated workpiece.
  • The workpiece that was welded by the second robot is unloaded from the fixture for that robot and a new workpiece to be welded is loaded onto that fixture.
  • The system indexes to begin a new cycle of arc welding by both robots after the first robot has finished arc welding its workpiece.
  • The preferred approach to coordinating objects is to use a semaphore or flag implemented by shared signals or data.
  • Some systems provide built-in functionality which automates and facilitates the interaction of two or more robot arms.
  • An example of this built-in functionality is the ABB MultiMove® software, which provides an ABB RAPID® instruction to coordinate two or more robot arms during their operation.
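  • The patent text does not include program code for these coordination mechanisms. The following is a minimal sketch in Python rather than ABB RAPID, with two concurrent tasks standing in for two robot programs; the names and the use of an event and a barrier are illustrative assumptions, not the MultiMove® implementation.

```python
import threading

# A shared flag ("semaphore") one program sets and the other waits on,
# plus a synchronization point both programs must reach before either
# proceeds -- the two coordination patterns described above.
part_ready = threading.Event()
sync_point = threading.Barrier(2)

def robot_a():
    print("A: load part")
    part_ready.set()         # condition-type coordination point
    sync_point.wait()        # synchronization-type coordination point
    print("A: next operation")

def robot_b():
    part_ready.wait()        # proceed only after A has loaded the part
    print("B: process part")
    sync_point.wait()        # both programs must be here at the same time
    print("B: next operation")

threads = [threading.Thread(target=f) for f in (robot_a, robot_b)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```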
  • Fig. 2 shows a screen from Araxis Merge, a standard text comparison tool available from Araxis Ltd.
  • The standard comparison tool compares the text of two programs, identified as "a1" and "a2".
  • This tool provides a connection between both editors, with the goal of identifying differences between two or more texts.
  • However, this tool does not have the capability of parsing programs and identifying when the programs interact with each other.
  • This standard text comparison tool also does not provide its user with a sense of flow and state in the operation of the two programs.
  • Described herein is a multi-viewer editor, that is, a utility that serves as a visual aid in which programs or procedures are displayed side by side, similar to text comparison tools but with functionality and features that standard text comparison tools do not have.
  • The multi-viewer utility promotes understanding, by a programmer of systems where robot arms interact with or depend on each other, of the sequence of each robot arm and of the arms' interactions before, during and after their performance, by displaying each robot arm's program and graphically showing where the robot arms interact with each other.
  • The multi-viewer utility also promotes understanding, by a programmer or a sequence creator, of a robot and an object, such as a mobile platform, a communication object, or another object such as a human, that will interact with the robot before, during or after the performance by the robot and the object.
  • The showing of the interaction can be, but is not limited to, graphics and/or text such as, for example and without limitation, lines, color or font matching, icons, symbols and the like. Different types of communication points could also be displayed by the same utility.
  • One example of the user interface for this multi-viewer utility for two arms is shown in Fig. 3. The two arms are identified as the left and right arms.
  • The multi-viewer utility 30 has three columns 32, 34 and 36. Two of the three columns, 32 for the left arm and 36 for the right arm, are robot program viewers. The two columns 32 and 36 of the program viewer are separated from each other by one column 34 of a coordination viewer.
  • The multi-viewer utility 30 parses each of the programs for the two robots and identifies in the coordination viewer column 34 the different coordination points, among the two robot arms in this example.
  • Each coordination point is represented inside the robot program in different ways: some are represented by a single instruction, while others are represented in a data structure. There are different types of coordination points, such as WaitForArm or dependencies of robot arm tasks, which can be a single instruction or a group of instructions.
  • Column 34 has an icon that is unique for each type of coordination point.
  • The multi-viewer tool 30 displays the coordination points and the robot programs.
  • The viewer makes distinctions between the different types of coordination points by using different graphical representations and/or text.
  • The viewer also adjusts the robot program lines to be aligned to the coordination points.
  • Each of the robot program viewers 32 and 36 shows its respective program in the order the program is executed. A programmer can scroll down to understand the overall sequence of each robot arm.
  • The coordination viewer 34 in the center column shows where the left and right arms interact with each other.
  • Fig. 3 shows a system in which the right arm, whose robot program is shown in program viewer column 36, is picking a PCB and placing it in a position where the left arm, whose robot program is shown in program viewer column 32, can clean a camera lens on the PCB.
  • The left arm also needs to pick a frame before it starts cleaning the lenses.
  • The first coordination point is a condition that is represented by a "stop light" 38. This icon indicates that the right arm can place the PCB only after the left arm has picked the frame.
  • The second coordination point is a synchronization point represented by a "plug" 40. This icon indicates that both robots must be executing that specific instruction at the same time.
  • The utility 30 provides automatic padding to align robot programs based on their coordination points.
  • One example of the padding is the empty white lines shown in Fig. 3, added in the right column 36 of the robot program viewer to align the stop light.
  • Another example is the padding marker 42 shown in Fig. 3, added in the right column 36 of the robot program viewer so that both arms have their WaitForArm instructions aligned.
  • The padding marker 42 can also represent adjusting the padding of coordinated instructions by transforming and compressing the display of non-coordinated instructions such that the coordinated instructions are aligned.
  • The multi-viewer tool shown in Fig. 3 displays each robot program within a robot program viewer that has a viewer column 32 for the left robot arm and a viewer column 36 for the right robot arm, and a column 34 that displays the coordination points by an icon that is unique for each type of coordination point.
  • Fig. 3 can be modified to support systems that use more than two robot arms.
  • One way to support more than two arms is to only show two robots at a time with the user selecting which two robots to view.
  • Fig. 4 shows a program in which three robot arms interact in a system.
  • The user can arrange the columns based on the dependencies of the arms in adjacent columns.
  • Dependencies of arms that are not in adjacent columns can be highlighted graphically with a special icon, a color or text.
  • The lines for coordination points between two non-adjacent arms could be shown selectively by several means, such as showing them only when the user moves the mouse over a coordination icon, when the user presses a button, or by other similar means.
  • A program for a robot arm can also interact with two other objects that are not robot arms. There can be coordination points between the robot arm and the two other objects as well as coordination points between the two other objects.
  • The user interface can also be used to graphically show when possible deadlock scenarios exist (i.e. when both arms are stopped because they are waiting for each other).
  • Fig. 5 shows an example in which possible deadlocks are shown by crossed lines 50 and 52.
  • The viewer can also display runtime information, such as a pointer to the current or next instruction to be executed, a pointer to the motion instruction currently being executed by the robot, the status of the running robot, etc.
  • Some robot systems, such as those using the ABB IRC5® controller, provide a mechanism to track the current instructions. In an ABB system, this is traceable via ABB's PC SDK® software using event handlers. The viewer can use a PC SDK® event handler to update the location of the pointer icons when the robot is running.
  • Fig. 6 shows two pointers in each program viewer.
  • One is a "robot” 60 to represent the motion instruction currently being executed by the robot, and the other is an “arrow” 62 to show the instruction that will be executed next.
  • The pointer shows the sequence while the objects are performing their operations (i.e. at runtime), updating the viewer with information about which step each object has completed, is waiting to complete, and/or is currently performing.
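  • ABB's PC SDK® is a .NET library, and its actual event API is not reproduced here. The sketch below is a schematic Python stand-in, assuming a controller that invokes a registered callback whenever a program pointer changes; every name in it is an assumption made for illustration.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class PointerEvent:
    task: str   # e.g. "left_arm"
    line: int   # program line the pointer now references
    kind: str   # "motion" (robot icon 60) or "next" (arrow icon 62)

class ProgramViewer:
    """Tracks one 'robot' pointer and one 'arrow' pointer per task."""
    def __init__(self) -> None:
        self.pointers: dict = {}   # (task, kind) -> line

    def on_pointer_changed(self, event: PointerEvent) -> None:
        # Event-handler-style update, analogous to subscribing to
        # program-pointer events in a controller SDK.
        self.pointers[(event.task, event.kind)] = event.line
        print(f"{event.task}: {event.kind} pointer -> line {event.line}")

def run_demo(handler: Callable[[PointerEvent], None]) -> None:
    # Hypothetical controller firing events while a program runs.
    handler(PointerEvent("left_arm", 12, "motion"))
    handler(PointerEvent("left_arm", 13, "next"))

viewer = ProgramViewer()
run_demo(viewer.on_pointer_changed)
```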
  • In Fig. 7 there is shown a flowchart 700 for the procedure to identify the coordination points between two or more robot programs.
  • At 702, either the whole program or only selected portions are loaded into the multi-viewer utility.
  • The next two steps, 704 and the optional step 706, discover the coordination points.
  • At 704, the loaded programs are parsed to identify the items to be shown in the viewer: (1) instructions and/or functions and (2) coordination points. These items could be, but are not limited to:
  • At 706 is the optional step of obtaining the metadata.
  • Some of the coordination points could be defined inside the programs (e.g. data structures) or outside the programs (in files saved in the robot memory or in other software devices).
  • The next two steps, 708 and 710, obtain the information for the coordination points.
  • At 708, the question is asked whether more points are needed. If the answer is yes, the flow proceeds to 710, where all dependencies are searched and linked for the needed information. When the searching and linking of all dependencies is complete, the flow returns to 708 to ask again whether more points are needed.
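  • WaitForArm is the only coordination instruction named in the text. A minimal Python sketch of the parsing step 704 and the dependency linking of steps 708 and 710 might look like the following, where the SyncPoint instruction name and the line-oriented program format are assumptions.

```python
import re

# Illustrative coordination instructions; WaitForArm is named in the
# text, the SyncPoint name is assumed for this sketch.
COORD_PATTERN = re.compile(r"^(WaitForArm|SyncPoint)\s+(\w+)")

def find_coordination_points(programs):
    """programs: dict of task name -> list of source lines (step 704).
    Returns (task, line_no, instruction, signal_id) records."""
    points = []
    for task, lines in programs.items():
        for no, line in enumerate(lines, start=1):
            match = COORD_PATTERN.match(line.strip())
            if match:
                points.append((task, no, match.group(1), match.group(2)))
    return points

def link_dependencies(points):
    """Steps 708/710: link points that share a signal id, so each point
    knows the points in other programs it depends on."""
    by_signal = {}
    for point in points:
        by_signal.setdefault(point[3], []).append(point)
    return by_signal

left = ["MoveJ p10", "WaitForArm flag1", "SyncPoint s1", "MoveL p20"]
right = ["MoveL p30", "SyncPoint s1"]
print(link_dependencies(find_coordination_points({"left": left,
                                                  "right": right})))
```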
  • In Fig. 8 there is shown a flowchart 800 for validating the content of the programs that were loaded at 702 of flowchart 700.
  • First, the context of the viewer is defined. This is either the whole program or only selected portions (e.g. the current execution, a predefined routine or function, or just specific lines of the programs).
  • Next, the coordination points' information is obtained.
  • The next five steps, 806, 808, 810, 812 and 814, validate the content of the program(s).
  • If the determination at 812 is that the coordination point can generate a deadlock, that point is flagged as a deadlock at 814. If the determination at 812 is that the coordination point cannot generate a deadlock, the flow 800 proceeds to 806, described above.
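  • The flowchart does not prescribe a particular detection algorithm. One plausible check, sketched below in Python under that assumption, treats each condition-type coordination point as a wait-for edge between two programs and flags two crossed waits, the situation drawn with crossed lines in Fig. 5, as a possible deadlock.

```python
def has_possible_deadlock(waits):
    """waits: (waiting_task, waiting_line, awaited_task, awaited_line)
    tuples.  Two waits can deadlock if each task blocks before reaching
    the line the other task is waiting for ("crossed lines")."""
    for i, (task_a, line_a, task_b, awaited_b) in enumerate(waits):
        for task_c, line_c, task_d, awaited_d in waits[i + 1:]:
            if (task_a, task_b) == (task_d, task_c):   # opposite directions
                # task_c blocks before the line task_a awaits, and
                # task_a blocks before the line task_c awaits.
                if line_c < awaited_b and awaited_d > line_a:
                    return True
    return False

# A waits at its line 5 for B's line 9; B waits at its line 4 for A's
# line 8.  Neither can advance, so this is flagged as a deadlock.
print(has_possible_deadlock([("A", 5, "B", 9), ("B", 4, "A", 8)]))  # True
```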
  • In Fig. 9 there is shown a flowchart 900 for the steps that are performed to draw the viewer.
  • First, the context of the viewer is defined. Either the whole programs of interest or only selected portions, for example the current execution, a predefined routine or function, or specific lines of the programs, can be shown in the viewer.
  • The defined context of the viewer is referred to below as the "programs", which, as described above, can be the programs of interest in their entirety or selected portions of those programs.
  • Next, the text of the programs is obtained along with the coordination point information.
  • At step 906, the content of the programs is validated using the technique described for flowchart 800 shown in Fig. 8. As described above for Fig. 8, validation of the programs includes validating that all of the coordination points are valid and that there is no possible deadlock. Step 906 is optional because the viewer could be used simply to show programs regardless of their validity.
  • The next three steps, 908, 910 and 912, perform the padding to align the coordination points.
  • Padding is added to the display in the viewer to align the points that are dependent and their dependencies. Empty lines can be inserted in either the dependent or the dependency program, based on the need to align the coordination points.
  • Each coordination point type can have a different strategy for adding padding. For example:
  • Task or Program Line dependency will insert padding in the dependent robot program.
  • Synchronization will insert padding in the program which has the lesser number of lines (defined in the context). In this manner, the program with the greater number of lines is kept the same; that is, no padding is inserted in that program.
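  • As an illustration of the padding steps 908, 910 and 912, the following Python sketch inserts empty lines so that each pair of matched coordination points lands on the same display row. The pairing input and program format are assumptions, and for simplicity the sketch always pads the program whose point appears earlier rather than choosing a per-type strategy.

```python
def align_programs(prog_a, prog_b, pairs):
    """prog_a, prog_b: lists of program lines.  pairs: matched
    coordination points as (line_in_a, line_in_b), 1-based, in
    execution order.  Returns padded copies of both programs."""
    a, b = list(prog_a), list(prog_b)
    shift_a = shift_b = 0
    for line_a, line_b in pairs:
        row_a, row_b = line_a + shift_a, line_b + shift_b
        if row_a < row_b:                     # pad the earlier side
            a[row_a - 1:row_a - 1] = [""] * (row_b - row_a)
            shift_a += row_b - row_a
        elif row_b < row_a:
            b[row_b - 1:row_b - 1] = [""] * (row_a - row_b)
            shift_b += row_a - row_b
    total = max(len(a), len(b))               # equalize total rows
    a += [""] * (total - len(a))
    b += [""] * (total - len(b))
    return a, b

left = ["pick frame", "WaitForArm", "clean lens"]
right = ["pick PCB", "move to fixture", "place PCB", "WaitForArm"]
for row_l, row_r in zip(*align_programs(left, right, [(2, 4)])):
    print(f"{row_l:<16}| {row_r}")
```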
  • In Fig. 10 there is shown an embodiment of the data structure for the coordination points.
  • The embodiment shows one example of how the information required to represent a coordination point can be stored.
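  • Fig. 10 itself is not reproduced here, so the following is only one plausible minimal structure for a coordination point, written as a Python dataclass with every field name assumed.

```python
from dataclasses import dataclass, field

@dataclass
class CoordinationPoint:
    kind: str                  # e.g. "condition" or "synchronization"
    icon: str                  # display icon, e.g. "stop_light", "plug"
    task: str                  # program the point belongs to
    line: int                  # line (or first line) of the instruction
    linked: list = field(default_factory=list)  # (task, line) dependencies
    deadlock: bool = False     # flagged during validation (Fig. 8)

pt = CoordinationPoint("condition", "stop_light", "right_arm", 7,
                       linked=[("left_arm", 3)])
print(pt)
```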
  • The program for the multi-viewer described herein may be resident in a robot controller or a separate computing device such as a PC.
  • The program to draw the viewer may take the form of a computer program product on a tangible computer-usable or computer-readable medium having computer-usable program code embodied in the medium.
  • The tangible computer-usable or computer-readable medium may be any tangible medium such as, by way of example but without limitation, a portable computer diskette, a flash drive, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, or a magnetic storage device.
  • The above embodiment could be easily adapted to support procedures for objects other than robot arms.
  • One or more of the procedures could be a sequence of steps to be executed by a mobile platform, communication module, or other object.
  • The sequences can also be for a robot and a human who will interact with the robot.
  • The viewer can be used before the robot and the human each perform their sequence of steps to determine whether a conflict will arise when the sequences are performed.
  • Fig. 11 shows an example of a system that has a robot arm that interacts with a robot that is a mobile platform.
  • The robot arm is mounted on the mobile platform, and the platform with the robot on it moves between two tables identified as the InTable and the OutTable.
  • The robot arm picks a part from the InTable that it will place on the OutTable when the mobile platform is docked at that table.
  • Fig. 11 shows three columns 1102, 1104 and 1106.
  • Column 1102 is a viewer for the program for the robot arm.
  • Column 1106 is a viewer for the operation of the program that controls the motion of the mobile platform.
  • Column 1104 shows the coordination points between the two viewers.
  • Column 1104 shows three "stop lights" 38.
  • The first stop light 38 is after the mobile platform has moved the robot arm to the InTable and has docked with that table. Thereafter the robot arm can perform the steps to pick a part from the InTable.
  • The second stop light 38 is after the robot has picked the part from the InTable.
  • The mobile platform remains stationary until the robot has completed the picking. Only then does the mobile platform move the robot holding the picked part to the OutTable.
  • The third stop light 38 is after the mobile platform has docked with the OutTable.
  • The robot arm can then perform the steps to place the picked part on the OutTable.
  • Fig. 12 shows an example of a system that has a robot arm that interacts with a communication module.
  • The robot arm picks a part from a supply of parts such as a bin.
  • The part has a bar code on it, and the communication module communicates with a bar code reader that reads the bar code on the part and writes that information to a computing device.
  • Fig. 12 shows three columns 1202, 1204 and 1206.
  • Column 1202 is a viewer for the robot program.
  • Column 1206 is a viewer for the operation of the communication module that communicates with the bar code reader.
  • Column 1204 shows the coordination points between the two viewers.
  • The first stop light 38 is after the robot arm has picked the part with the bar code on it and moved to the location of the bar code reader. By the end of that operation the communication module should be fully initialized. After that has happened, the bar code reader can read the bar code on the picked part, process the data and write that data to the computing device.
  • The second stop light 38, which is after the communication module has performed the reading, processing and writing steps, ensures that the robot arm will not move prematurely to the position where it will place the picked part for further processing. Upon completion of the data processing resulting from the bar code reading, the robot arm moves to the position to place the picked part for further processing.
  • In Fig. 13 there is shown one example of a multi-viewer editor that includes a performance profiling tool for the same two robot arms whose programs are shown in the editor of Fig. 3.
  • This editor shows the different timing and instruction statistics for each robot program and the statistics related to coordinated points.
  • An instruction can represent one or more operations to be performed.
  • Time, as used below, includes not only absolute time but also averages, maximums, minimums and other timing information. That is, "time" means time-related information.
  • Fig. 13 shows the time used to execute a single instruction and a group of instructions, as well as the idle time at coordination points, e.g. waiting for a condition to be set by the other system or waiting to start a synchronous operation.
  • The execution time information is displayed in the "left arm" and "right arm" columns. Depending on the program's profiling capabilities, the viewer could display the execution time for each instruction, for a group of instructions, or for the whole operation. Wait times are displayed in the center column, which also shows the coordination points between the robots. Specific colors or images, not shown in Fig. 13, could be used to highlight which operations, groups of operations or waiting times are the longest or shortest, or have other notable time-related attributes.
  • Users can, as shown in Fig. 14, also generate a Gantt chart with the profiling data.
  • The chart shows the sequence of steps of both arms, the categories of operations performed in those steps, and the coordination points, along with the whole cycle time.
  • The Gantt chart of Fig. 14 shows an embodiment with three operation categories, namely motion commands, hand commands and waiting time.
  • The total time for executing all of the commands in the motion and hand command categories and the total waiting time are shown in the figure.
  • The Gantt chart can show the total time for each operation category further broken down to show operation category timing information for each arm individually as well, e.g. showing the total time one arm has spent executing motion commands.
  • The user can also use the profiling data to calculate, percentage-wise, how much time each operation category took inside each step or during the whole cycle.
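  • That percentage calculation is straightforward. A small Python sketch follows, assuming the profiling data arrives as (step, category, seconds) records; the record layout and the sample numbers are illustrative.

```python
from collections import defaultdict

profile = [
    ("pick PCB", "motion", 2.4),
    ("pick PCB", "hand", 0.6),
    ("place PCB", "motion", 1.8),
    ("place PCB", "waiting", 1.2),
]

cycle_time = sum(seconds for _, _, seconds in profile)
by_category = defaultdict(float)
for _, category, seconds in profile:
    by_category[category] += seconds

for category, seconds in sorted(by_category.items()):
    share = 100 * seconds / cycle_time
    print(f"{category:<8} {seconds:4.1f} s  {share:5.1f}% of cycle")
```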
  • To make use of the idle time (i.e. wait times), a method that generates a list of candidate undone operations can be used to identify the operations that can be performed by an idle robot.
  • The final list can be organized by, but is not limited to, the amount of time required to execute an operation (from the performance profile), the order in which the steps must be executed (i.e. order dependencies between operations), and the robots or resources required by the operations.
  • The list can be filtered depending on the input data provided by users. For example, the list can include only independent steps if there is a data structure that stores the precedents and dependencies among steps. This helps the user pick the best feasible step in order to optimize the work load and the cycle time.
  • Fig. 15 shows a flowchart 1500 for identifying the candidate steps. At action 1502, the program(s) are loaded. Either the whole program or only selected portions, e.g. the current execution, a predefined routine or function, or just specific lines of the programs, are loaded. At action 1504, the loaded program(s) are parsed. The goal of the parsing is to identify steps, that is, groups of one or more instructions and/or functions, by looking for: (1) single and grouped instructions and/or functions, and (2) coordination points. These steps could be, but are not limited to:
  • Synchronization - these instructions contain or represent dependencies or coordination points.
  • Each step, that is, single or grouped instructions and/or functions, has the following information:
  • Cycle time, either from the last cycle or an average of the historical cycle times.
  • The flow then proceeds to query 1510, where it is asked if there are more candidate steps that need to be processed. If the answer is yes, the flow proceeds to query 1512, where the next candidate step is obtained and it is asked whether this candidate step is already scheduled. If it is, the flow returns to query 1510 to determine if there are more candidate steps to be processed, and the loop repeats for each remaining candidate. When query 1510 finds no more candidate steps to be scheduled, the answer at that query is no and the flow proceeds directly to action 1516, described below.
  • Action 1516 sorts the list of candidate steps and also has inputs from action 1518, query 1520 and action 1522.
  • Action 1518 is triggered when one of the candidates changes its status. If the program is running, changes in step status will modify the list.
  • Query 1520 asks if the step status is set to "Scheduled". If the answer is no, no action is taken. If the answer is yes, then at action 1522 that step is removed from the list of steps to be sorted by action 1516.
  • Action 1516 sorts the list of steps based on the user preference, either by:
  • Action 1524 filters the list of steps based on the user preference. One example could be to filter the list to contain only the independent steps. Action 1526 shows the filtered list of steps.
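  • Actions 1516, 1522 and 1524 are not given in code. The Python sketch below, with the step fields assumed, drops already-scheduled steps, sorts the remaining candidates by execution time and applies the independent-steps filter mentioned above.

```python
candidates = [
    {"name": "inspect part", "time": 3.0, "depends_on": [], "scheduled": False},
    {"name": "pack part", "time": 1.5, "depends_on": ["inspect part"], "scheduled": False},
    {"name": "clean tool", "time": 2.0, "depends_on": [], "scheduled": True},
]

# Action 1522: remove steps that have already been scheduled.
pending = [s for s in candidates if not s["scheduled"]]

# Action 1516: sort by user preference, here by execution time.
pending.sort(key=lambda s: s["time"])

# Action 1524: one possible filter keeps only the independent steps.
independent = [s for s in pending if not s["depends_on"]]

# Action 1526: show the filtered list of candidate steps.
print([s["name"] for s in independent])
```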
  • Timing information can be based on multiple cycles.
  • The user can choose to view the average, minimum, or maximum times for operations and idle time, as well as standard deviations or other timing and statistical information. Viewing this information helps the user decide how to change the sequence in order to optimize it.
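  • For multi-cycle timing of this kind, Python's standard statistics module is sufficient; a minimal sketch with illustrative cycle times:

```python
import statistics

# Duration (seconds) of one operation measured over several cycles.
cycle_times = [4.1, 3.9, 4.4, 4.0, 4.2]

print(f"avg {statistics.mean(cycle_times):.2f} s  "
      f"min {min(cycle_times):.2f} s  max {max(cycle_times):.2f} s  "
      f"stdev {statistics.stdev(cycle_times):.2f} s")
```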

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Numerical Control (AREA)

Abstract

A multi-viewer for objects that interact with or depend on each other. The objects can be robots, or a robot and one or more other objects. The multi-viewer displays the sequences or steps to be performed by the objects and visually identifies on the display whether those sequences or steps interact with or depend on each other. The multi-viewer can also display the different timing and instruction statistics for each robot program and the statistics related to coordinated points.
PCT/US2014/045552 2013-07-08 2014-07-07 Multi-viewer for objects that interact with or depend on each other WO2015006199A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361843680P 2013-07-08 2013-07-08
US61/843,680 2013-07-08

Publications (1)

Publication Number Publication Date
WO2015006199A1 (fr) 2015-01-15

Family

ID=51220915

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/045552 WO2015006199A1 (fr) Multi-viewer for objects that interact with or depend on each other

Country Status (1)

Country Link
WO (1) WO2015006199A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017050895A1 (fr) * 2015-09-23 2017-03-30 Universität Bayreuth Dispositif de commande de robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0269737A1 (fr) * 1986-02-25 1988-06-08 Fanuc Ltd. Procede d'affichage du temps d'usinage d'un tour a quatre axes d'usinage simultane
EP0642067A1 (fr) * 1993-09-07 1995-03-08 Traub AG Système de programmation orientée dialogue pour une machine-outil à CNC
EP1341066A2 (fr) * 2002-02-28 2003-09-03 Star Micronics Co., Ltd. Machine-outil à commande numérique et sa méthode de modification de programme
US20040030452A1 (en) * 2002-08-06 2004-02-12 Stefan Graf Method and apparatus for the synchronous control of manipulators
EP2546030A2 (fr) * 2011-07-13 2013-01-16 KUKA Laboratories GmbH Commande de robots synchonisée uniforme et détection des blocages dans la synchronisation uniforme




Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 14742443; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 14742443; Country of ref document: EP; Kind code of ref document: A1)