WO2023091406A1 - Robot with smart trajectory recording - Google Patents
- Publication number
- WO2023091406A1 (PCT/US2022/049954)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/42—Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
- G05B19/423—Teaching successive positions by walk-through, i.e. the tool head or end effector being grasped and guided directly, with or without servo-assistance, to follow a path
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/36—Nc in input of data, input key till input tape
- G05B2219/36458—Teach only some points, for playback interpolation between points
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45104—Lasrobot, welding robot
Definitions
- Embodiments of the present invention relate to the use of robots (e.g., collaborative robots or cobots, or more traditional industrial robots) for welding or cutting. More specifically, embodiments of the present invention relate to systems and methods for recording robot path traversals and creating associated motion programs in a more efficient manner.
- the motion of the tool center point (TCP) of a robot is automatically recorded as an operator moves the arm of the robot within the workspace.
- a welding gun/torch is attached to the end of the robot arm (with respect to the TCP) and the robot is calibrated to know where the TCP is located in three-dimensional space with respect to at least one coordinate system (e.g., the coordinate system of the robot and/or of the workspace).
- the operator pushes an actuator (e.g., a button or a switch) and proceeds to move the robot arm in space (e.g., ingress towards a weld joint to be welded, across the weld joint, and/or egress away from the weld joint).
- Pushing the actuator causes the robot to begin recording the position of the TCP (and effectively the tip of the welding gun/torch) in 3D space as the operator moves the robot arm.
- the operator does not have to subsequently push a button or do anything else to cause multiple position points to be recorded along the trajectory that the robot arm takes.
- Multiple position points defining the trajectory are recorded automatically as the operator moves the robot arm, and a motion program for the robot is automatically created. The number of recorded points is based on the distance traveled, in accordance with one embodiment.
- the operator can push the same actuator again (or a different actuator) to stop the recording.
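As a hedged illustration of the distance-based recording described above, the controller might append a new TCP point only after the tool has moved some minimum distance since the last recorded point. This is a minimal sketch, not the patented implementation; the function name and the 5 mm threshold are assumptions.

```python
import math

def record_if_moved(last_point, current_point, trajectory, min_distance=0.005):
    """Append current_point (an (x, y, z) tuple, in meters) to trajectory
    only if the TCP has moved at least min_distance (here 5 mm) since the
    last recorded point; return the reference point for the next sample."""
    if math.dist(last_point, current_point) >= min_distance:
        trajectory.append(current_point)
        return current_point  # new reference point
    return last_point  # too close: keep the old reference point
```

Sampling the TCP pose at a fixed rate and passing each sample through such a filter ties point density to distance traveled rather than to elapsed time.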
- a system may include a “smart” welding torch that attaches to the arm of a robot and which can be moved along a desired welding path to program the desired welding path into a controller of the robot via actuators on the “smart” welding torch.
- the torch can be a “smart” cutting torch for performing cutting operations instead of welding operations.
- a welding system for generating a motion program includes a robot (e.g., a collaborative robot) having an arm and a calibrated tool center point (TCP).
- the welding system also includes a welding tool connected to a distal end of the arm of the robot in a determined relation to the TCP.
- the welding system further includes a programmable robot controller and a servo-mechanism apparatus configured to move the arm of the robot under the command of the programmable robot controller via a motion program.
- the welding system also includes an actuator operatively connected to the programmable robot controller.
- the welding system is configured to allow an operator to activate the actuator and proceed to manually move the arm of the robot in a 3D space from a start point to a destination point, defining an operator path.
- the operator path may be an ingress path toward a work piece, or an egress path away from a work piece.
- Activation of the actuator commands the programmable robot controller to record a plurality of spatial points of the TCP in the 3D space as the operator manually moves the arm of the robot along the operator path. The operator does not have to subsequently activate any actuator to cause the plurality of spatial points to be recorded along the operator path that the arm of the robot takes when manually moved by the operator.
- the programmable robot controller is configured to identify and eliminate extraneous spatial points from the plurality of spatial points as recorded, leaving a subset of the plurality of spatial points as recorded, where the extraneous spatial points are a result of extraneous movements of the arm of the robot by the operator.
- the extraneous spatial points are identified by the robot controller at least in part by the controller analyzing the plurality of spatial points as recorded to determine which spatial points of the plurality of spatial points as recorded are not needed to accomplish moving from the start point to the destination point within the 3D space.
- the programmable robot controller is configured to perform a spatial smoothing operation on the subset of the plurality of spatial points as recorded, resulting in a smoothed trajectory of spatial points, and the programmable robot controller is configured to automatically generate the motion program for the robot corresponding to the smoothed trajectory of spatial points.
- the programmable robot controller is configured to perform a spatial interpolation operation on the subset of the plurality of spatial points as recorded, resulting in an interpolated trajectory of spatial points.
- the programmable robot controller is configured to perform a spatial smoothing operation on the interpolated trajectory of spatial points as recorded, resulting in a smoothed trajectory of spatial points, and the programmable robot controller is configured to automatically generate the motion program for the robot corresponding to the smoothed trajectory of spatial points.
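The description does not pin the spatial smoothing operation to a specific algorithm. One plausible sketch is a moving average over neighboring points with the start and destination held fixed; the function name and window size here are assumptions, not from the patent.

```python
def smooth_trajectory(points, window=3):
    """Moving-average smoothing of a list of (x, y, z) points.
    The first and last points are kept unchanged so that the start
    and destination of the operator path are preserved."""
    if len(points) <= 2:
        return list(points)
    half = window // 2
    smoothed = [points[0]]
    for i in range(1, len(points) - 1):
        # average each interior point with its neighbors inside the window
        neighborhood = points[max(0, i - half):min(len(points), i + half + 1)]
        n = len(neighborhood)
        smoothed.append(tuple(sum(c) / n for c in zip(*neighborhood)))
    smoothed.append(points[-1])
    return smoothed
```

A wider window smooths more aggressively at the cost of rounding off intended corners, which is one reason a production controller might use a more sophisticated filter (e.g., spline fitting) instead.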
- a robotic welding system for generating a motion program.
- the robotic welding system includes a programmable robot controller of a robot (e.g., a collaborative robot) having a computer processor and a computer memory.
- the programmable robot controller is configured to digitally record, in the computer memory, a plurality of spatial points along an operator path in a 3D space taken by a calibrated tool center point (TCP) of the robot as an operator manually moves a robot arm of the robot along the operator path from a start point to a destination point within the 3D space.
- the operator path may be an ingress path toward a work piece, or an egress path away from a work piece.
- the programmable robot controller is also configured to identify and eliminate, from the computer memory, extraneous spatial points from the plurality of spatial points as digitally recorded, leaving a subset of the plurality of spatial points as digitally recorded, where the extraneous spatial points are a result of extraneous movements of the robot arm by the operator.
- the extraneous spatial points are identified by the programmable robot controller at least in part by the programmable robot controller analyzing the plurality of spatial points as digitally recorded to determine which spatial points of the plurality of spatial points as digitally recorded are not needed to accomplish moving from the start point to the destination point within the 3D space.
- the programmable robot controller is configured to perform a spatial smoothing operation on the subset of the plurality of spatial points as recorded, resulting in a smoothed trajectory of spatial points, and the programmable robot controller is configured to automatically generate a motion program for the robot corresponding to the smoothed trajectory of spatial points.
- the programmable robot controller is configured to perform a spatial interpolation operation on the subset of the plurality of spatial points as recorded, resulting in an interpolated trajectory of spatial points.
- the programmable robot controller is configured to perform a spatial smoothing operation on the interpolated trajectory of spatial points as recorded, resulting in a smoothed trajectory of spatial points, and automatically generate a motion program for the robot corresponding to the smoothed trajectory of spatial points.
- FIG. 1 illustrates one embodiment of a welding system having a collaborative robot
- FIG. 2 illustrates one embodiment of the collaborative robot of FIG. 1;
- FIG. 3 illustrates one embodiment of a welding torch/gun attached to a distal end of an arm of the collaborative robot of FIG. 1 and FIG. 2;
- FIG. 4 illustrates another embodiment of a welding torch/gun attached to a distal end of an arm of the collaborative robot of FIG. 1 and FIG. 2;
- FIG. 5 illustrates an example of digitally recorded spatial position points of an operator path formed by an operator moving an arm of the collaborative robot of FIG. 2;
- FIG. 6 illustrates a block diagram of an example embodiment of a controller that can be used, for example, in the welding system of FIG. 1.
- FIG. 1 illustrates one embodiment of a welding system 100 having a collaborative robot 200.
- FIG. 2 illustrates one embodiment of the collaborative robot 200 of FIG. 1.
- the welding system 100 includes a collaborative robot 200, a welding power supply 310, and a programmable robot controller 320.
- the collaborative robot 200 has an arm 210 configured to hold a welding torch (e.g., a welding gun) 220.
- the terms “torch” and “gun” are used herein interchangeably.
- the collaborative robot 200 also includes a servo-mechanism apparatus 230 configured to move the arm 210 of the collaborative robot 200 under the command of the robot controller 320 (via a motion program).
- the welding system 100 includes a wire feeder (not shown) to feed welding wire to the welding torch 220.
- the motion of the calibrated tool center point (TCP 205) of a cobot is recorded as an operator moves the arm of the cobot within the workspace.
- a welding gun/torch 220 is attached to the end of the cobot arm 210 (with respect to the TCP) and the cobot is calibrated to know where the TCP is located in three-dimensional space with respect to a coordinate system (e.g., the coordinate system of the cobot).
- the operator pushes an actuator (e.g., a button or a switch) and proceeds to move the cobot arm in space (e.g., ingress towards a weld joint to be welded, across the weld joint, or egress away from the weld joint).
- the trajectories associated with ingress and egress are known herein as “air move” trajectories, since they are trajectories in the air and not at the weld joint.
- Pushing the actuator causes the cobot to begin recording the position of the TCP (and effectively the tip of the welding gun/torch) in 3D space (e.g., as coordinate points) as the operator moves the cobot arm.
- the welding torch 220 is a “smart” welding torch.
- the term “smart” is used herein to refer to certain programmable capabilities provided by the welding torch/gun 220 which are supported by the robot controller 320.
- the welding torch 220 includes a torch body 226 (e.g., see FIG. 3 herein) configured to be operatively connected to the arm 210 of the collaborative robot 200.
- One actuator device 224 (e.g., see FIG. 4 herein) on the torch body 226 is configured to be activated by a human user to enable the arm 210 of the collaborative robot, with the welding torch 220 connected, to be moved by the human user (operator) along a desired path (e.g., along an ingress path to a weld joint, along an egress path away from a weld joint, or along a weld joint itself along which the collaborative robot 200 is to make a weld).
- Another actuator device 222 (see FIG. 3 and FIG. 4) on the torch body is configured to be activated by the human user to initiate a recording cycle at a start point and to terminate the recording cycle at a destination or end point in three-dimensional (3D) space.
- the actuator devices 222 and 224 are configured to communicate with the robot controller 320 to record the weld points along the desired path. For example, a weld may be made along a desired welding path by the collaborative robot 200 using the welding torch 220 from the start point to the destination point.
- the operator does not have to repetitively push a button (actuator) or do anything else to cause multiple position points to be recorded (e.g., by the cobot controller 320) along the trajectory that the cobot arm takes.
- Multiple position points (e.g., spatial coordinate points) defining the trajectory are recorded automatically as the operator moves the cobot arm, and a motion program for the cobot is automatically created (e.g., by the cobot controller 320).
- the number of recorded points is based on the distance traveled, in accordance with one embodiment.
- the operator can push the same actuator 222 again (or another actuator) to stop the recording. Therefore, for any single weld, no more than two button clicks are required.
- the actuator to start/stop recording may be located on the cobot arm, the cobot body, or the welding torch/gun, in accordance with various embodiments. Other locations within the system are possible as well.
- Post-processing (e.g., spatial and/or temporal filtering) of the recorded position points is performed by the cobot welding system (e.g., by the cobot controller 320), and the motion program is updated accordingly.
- the post-processing results in smoothing the subsequent automatic movement of the cobot along the recorded trajectory as commanded by the motion program. For example, any unwanted jerky, non-uniform motion (e.g., in position and/or orientation) introduced by the operator when moving the cobot arm is vastly reduced, if not totally eliminated. More uniform time spacing between the recorded points is also provided.
- programming of fine motion of the cobot arm is automated during post-processing (e.g., for weaving along the weld joint, or when the welding torch/gun is rounding a corner of a weld).
- FIG. 3 illustrates one embodiment of a “smart” welding torch 220 configured to be used by the collaborative robot 200.
- the “smart” welding torch 220 is configured to be operatively connected to (attached to) the arm 210 of the collaborative robot 200.
- the “smart” welding torch 220 includes a first actuator device 222 (e.g., a momentary push-button device).
- the first actuator device 222 is on the torch body 226 and is configured to be activated by a human user (operator) to initiate a recording cycle along a path, for example, at a start point 227 and to terminate the recording cycle, for example, at a destination point 229 in three-dimensional (3D) space.
- the actuator device may be a momentary push-button device, a switch, or another type of actuator device, in accordance with various embodiments.
- Position points 227, 228, and 229 in three-dimensional space along the path are automatically recorded by the robot controller 320 as the operator moves the welding torch 220 (as attached to the cobot arm 210) along the path trajectory (before actual welding occurs). Again, an actuator does not have to be pushed or switched in order to indicate each position point to be recorded.
- Multiple position points (spatial points) defining the trajectory are recorded automatically as the operator moves the cobot arm, and a motion program for the cobot is automatically created. The number of recorded points is based on the distance traveled, in accordance with one embodiment.
- FIG. 4 illustrates another embodiment of a welding torch/gun 400 attached to a distal end of an arm 210 of the collaborative robot 200 of FIG. 1 and FIG. 2.
- the “smart” welding torch 400 also includes a second actuator device 224 (e.g., configured as a dead man’s switch).
- the second actuator device 224 on the torch body 226 is configured to be activated by a human user to enable the arm 210 of the collaborative robot 200, with the “smart” welding torch 400 connected, to be moved by the human user along a desired path (e.g., an ingress path, an egress path, or a weld path).
- the “smart” welding torch 400 allows the user to safely move the arm 210 of the robot 200 and create path programs (motion programs). When the user releases the second actuator device 224, the robot arm 210 cannot move (the arm is locked).
- the first and second actuator devices 222 and 224 communicate, either directly or indirectly, with the robot controller 320 to accomplish the functionality described herein, in accordance with one embodiment.
- the user holds down the second actuator device 224 to move the arm 210 while establishing start/end locations (to initiate a recording cycle and to terminate the recording cycle using the first actuator device 222) and automatically recording operator path position points (spatial coordinate points) without having to manipulate an actuator device at each recorded point.
- the actuator device 222 may be located elsewhere on the system (e.g., on the cobot arm or on the servo-mechanism apparatus 230).
- FIG. 5 illustrates an example of digitally recorded spatial points 500 (dotted line) of an operator path formed by an operator moving an arm 210 of the collaborative robot 200 of FIG. 2 in a 3D space of a defined coordinate system of the robot 200.
- the operator path of FIG. 5 has a start point 510 (where recording is started) and a destination point 520 where recording is ended.
- the operator path may be an ingress path from the start point 510 to the beginning of a weld joint position at the destination point 520.
- the operator (for whatever reason) moved the robot arm 210 in an extraneous manner, instead of taking a more direct path.
- the portion of the recorded spatial points 500 within the depicted dotted-and-dashed oval 530 of FIG. 5 consists of extraneous spatial points.
- the extraneous spatial points are a result of extraneous movements of the arm 210 of the robot 200 by the operator and are not needed to accomplish moving from the start point 510 to the destination point 520 within the 3D space. For example, maybe the operator decided to re-orient an angular orientation of the torch 220, which resulted in the extraneous spatial points being recorded.
- the programmable robot controller 320 is programmed to identify and eliminate the extraneous spatial points from the recorded spatial points 500, leaving a subset of the recorded spatial points 500.
- the extraneous spatial points are identified by the robot controller 320 at least in part by the controller 320 analyzing the recorded spatial points 500 to determine which of the recorded spatial points are not needed to accomplish moving from the start point to the destination point within the 3D space. Referring to FIG. 5, the robot controller 320 would identify and eliminate the recorded spatial points within the dotted-and-dashed oval 530.
- “identify and eliminate” as used herein generally refers to differentiating the extraneous spatial points from the rest of the spatial points as originally recorded.
- initially identifying the extraneous spatial points may involve computing work space distance relationships and/or work space vector relationships for each recorded spatial point with respect to the start point and the destination point, and/or with respect to those recorded spatial points immediately surrounding or next to each recorded spatial point. Those recorded spatial points having distance relationships and/or vector relationships that are outside of some defined range(s) may be identified as extraneous spatial points. Other techniques of identifying the extraneous spatial points are possible as well, in accordance with other embodiments. Eliminating the extraneous spatial points, as identified, may involve deleting the extraneous spatial points from a computer memory, digitally flagging the extraneous spatial points as being extraneous, or some other technique, in accordance with various embodiments.
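One possible realization of the distance-relationship test described above is to flag any recorded point that deviates too far from the straight chord joining the start and destination points. This is a hypothetical sketch only; the patent deliberately leaves the exact criterion open, and the function name and threshold are assumptions.

```python
import math

def flag_extraneous(points, max_deviation=0.05):
    """Flag recorded spatial points that lie farther than max_deviation
    (same units as the points) from the straight line joining the start
    point and the destination point.  Returns a list of booleans parallel
    to points (True = extraneous candidate)."""
    start, dest = points[0], points[-1]
    d = [b - a for a, b in zip(start, dest)]          # chord direction
    d_len = math.sqrt(sum(c * c for c in d))
    flags = []
    for p in points:
        v = [b - a for a, b in zip(start, p)]
        # perpendicular distance from p to the start->dest line: |v x d| / |d|
        cx = v[1] * d[2] - v[2] * d[1]
        cy = v[2] * d[0] - v[0] * d[2]
        cz = v[0] * d[1] - v[1] * d[0]
        dist = math.sqrt(cx * cx + cy * cy + cz * cz) / d_len
        flags.append(dist > max_deviation)
    return flags
```

Flagged points could then be deleted from memory or merely marked extraneous, consistent with either elimination technique the description mentions.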
- the controller 320 can proceed to perform a spatial interpolation operation and/or a spatial smoothing operation on the remaining subset of the recorded spatial points. For example, additional spatial points may be generated via interpolation between certain recorded spatial points to fill in any gaps (e.g., between the recorded spatial points 502 and 504 in FIG. 5). More uniform time spacing between the recorded points can also be provided via temporal interpolation, for example. Then, the overall trajectory formed by the spatial points (as interpolated, if performed) can be smoothed, via a spatial smoothing operation, to eliminate any unwanted jerky or non-uniform motion from the trajectory.
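The gap-filling interpolation described above (e.g., between recorded points 502 and 504) could be sketched as linear interpolation wherever consecutive recorded points are farther apart than some maximum spacing. The function name and the 10 mm value are illustrative assumptions, not from the patent.

```python
import math

def fill_gaps(points, max_gap=0.01):
    """Linearly interpolate additional (x, y, z) points wherever two
    consecutive recorded points are farther apart than max_gap (here 10 mm)."""
    filled = [points[0]]
    for a, b in zip(points, points[1:]):
        gap = math.dist(a, b)
        if gap > max_gap:
            n = math.ceil(gap / max_gap)  # number of sub-segments needed
            for k in range(1, n):
                t = k / n
                filled.append(tuple(pa + t * (pb - pa) for pa, pb in zip(a, b)))
        filled.append(b)
    return filled
```

The densified trajectory can then be passed through a smoothing operation, matching the order of operations in the interpolation-then-smoothing embodiment.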
- programming of fine motion of the robot arm is automated during post-processing (e.g., for weaving along the weld joint, or when the welding torch/gun is rounding a corner of a weld).
- the robot controller 320 automatically generates a motion program for the robot.
- the motion program will command the robot to move such that the resultant trajectory formed by the interpolated and/or smoothed spatial points is followed by the robot.
- FIG. 6 illustrates a block diagram of an example embodiment of a controller 600 that can be used, for example, in the welding system 100 of FIG. 1.
- the controller 600 may be used as the robot controller 320 and/or as a controller in the welding power supply 310.
- the controller 600 includes at least one processor 614 (e.g., a microprocessor, a central processing unit, a graphics processing unit) which communicates with a number of peripheral devices via bus subsystem 612.
- peripheral devices may include a storage subsystem 624, including, for example, a memory subsystem 628 and a file storage subsystem 626, user interface input devices 622, user interface output devices 620, and a network interface subsystem 616.
- the input and output devices allow user interaction with the controller 600.
- Network interface subsystem 616 provides an interface to outside networks and is coupled to corresponding interface devices in other devices.
- User interface input devices 622 may include a keyboard, pointing devices such as a mouse, trackball, touchpad, or graphics tablet, a scanner, a touchscreen incorporated into the display, audio input devices such as voice recognition systems, microphones, and/or other types of input devices.
- pointing devices such as a mouse, trackball, touchpad, or graphics tablet
- audio input devices such as voice recognition systems, microphones, and/or other types of input devices.
- use of the term “input device” is intended to include all possible types of devices and ways to input information into the controller 600 or onto a communication network.
- User interface output devices 620 may include a display subsystem, a printer, or non-visual displays such as audio output devices.
- the display subsystem may include a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), a projection device, or some other mechanism for creating a visible image.
- the display subsystem may also provide non-visual display such as via audio output devices.
- use of the term “output device” is intended to include all possible types of devices and ways to output information from the controller 600 to the user or to another machine or computer system.
- Storage subsystem 624 stores programming and data constructs that provide some or all of the functionality described herein.
- Computer-executable instructions and data are generally executed by processor 614 alone or in combination with other processors.
- Memory 628 used in the storage subsystem 624 can include a number of memories including a main random access memory (RAM) 630 for storage of instructions and data during program execution and a read only memory (ROM) 632 in which fixed instructions are stored.
- a file storage subsystem 626 can provide persistent storage for program and data files, and may include a hard disk drive, a solid state drive, a floppy disk drive along with associated removable media, a CD-ROM drive, an optical drive, or removable media cartridges.
- the computer-executable instructions and data implementing the functionality of certain embodiments may be stored by file storage subsystem 626 in the storage subsystem 624, or in other machines accessible by the processor(s) 614.
- Bus subsystem 612 provides a mechanism for letting the various components and subsystems of the controller 600 communicate with each other as intended. Although bus subsystem 612 is shown schematically as a single bus, alternative embodiments of the bus subsystem may use multiple buses.
- the controller 600 can be of varying types. Due to the ever-changing nature of computing devices and networks, the description of the controller 600 depicted in FIG. 6 is intended only as a specific example for purposes of illustrating some embodiments. Many other configurations of a controller are possible, having more or fewer components than the controller 600 depicted in FIG. 6.
- the start point and the destination point of a weld are recorded.
- a database is accessed to indicate how to set the various angles, stick out, etc., based on visually (e.g., optically) observing the weld joint to determine what type of weld is to be created.
- a camera or a light detection and ranging (Lidar) capability may be employed.
- a weld wire sensing technique may be used to determine the type of weld (e.g., see U.S. Published Patent Application No. 2020/0139474 A1, which is incorporated herein by reference in its entirety).
- the automatic recording and filtering of “air move” trajectories between welds is performed.
- the operator user can focus mainly on the creation of welds (starts, in-between points, and ends), not so much on the creation of points in the air as the cobot TCP moves to (ingress) and away from (egress) a weld.
- air motion of the cobot is recorded as a means to map free space around the cobot. All “air move” trajectories are recorded to help build a map of the entire cobot workspace. For a given part to be welded, this information can be used to generate collision free trajectories from one weld to another using established path planning algorithms.
- the cobot is deliberately allowed to collide with objects in the workspace of the cobot to learn and find a collision free trajectory. It has been observed that cobot collisions are harmless, unlike collisions involving traditional industrial robots. Recording of air motion while an operator moves the arm of the cobot within the workspace is not performed. Instead, the welds are created and the cobot attempts to move its arm between the welds (from one weld to another) without regard for collisions. If the cobot collides on its way to the next weld, the cobot can modify its programmed trajectory in an attempt to find a collision free trajectory. Information from any collision itself (e.g., colliding forces) can be used.
- a “weld database” containing information about plate angles, torch angles, work angles, stick out, etc. is provided to achieve a particular robotic weld.
- a search strategy is automatically generated (tactile or with a more advanced sensor such as a small camera or a Lidar capability) to accurately locate the weld joint in space during production, where the part tolerances may vary from part to part.
- the weld joint is automatically located during the programming process for a new part to be programmed by locating the weld wire (or other sensor) “close enough” to the weld joint and using a predetermined search strategy to locate the wire more precisely within the weld joint.
- One embodiment provides advanced 3D sensing/scanning for weld feature recognition.
- the volumes and high mix production present in the cobot space are exploited as training data to train the cobot to recognize weld joints.
- Training data can be gathered from many cobots within a manufacturing space.
- machine learning (ML) is used to build a model from such training data.
- Such a model can be used to predict torch angles, weld types, etc., given information from a 3D sensor.
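As a minimal sketch of such a predictor, the snippet below fits a linear model mapping features extracted from a 3D scan of a joint to a torch work angle. The feature set (plate angle, gap width), the numeric values, and the use of ordinary least squares are all illustrative assumptions; a production system would train a far richer model on data gathered from many cobots.

```python
import numpy as np

# Hypothetical training data: rows are features extracted from a 3D scan of a
# joint (plate angle in degrees, gap width in mm); targets are the torch work
# angles that operators actually used.
X = np.array([[90.0, 0.0], [90.0, 1.0], [60.0, 0.0], [120.0, 0.5]])
y = np.array([45.0, 44.0, 30.0, 60.0])

# Fit a least-squares linear model as a stand-in for a learned predictor.
A = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_torch_angle(plate_angle_deg, gap_mm):
    """Predict a torch work angle from scanned joint features (illustrative only)."""
    return float(np.array([plate_angle_deg, gap_mm, 1.0]) @ coef)
```

The same interface could return weld type or other parameters; the point is that 3D-sensor features go in and suggested weld settings come out.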
- the system can offer suggestions benefiting the end-user while also gathering the information necessary to build such an ML model.
- Embodiments of the present invention are not limited to cobots. Other embodiments employing industrial robots are possible as well. For example, a user may use a teach pendant to bring a robot close to a joint, and then let the robot automatically perform a fine-tuning of position and orientation at the joint. For example, a touch-sensing technique may be performed as discussed in U.S. Patent No. 9,833,857 B2, which is incorporated by reference herein in its entirety.
- a database is queried upon weld creation to get basic parameters for the type of weld joint.
- a user brings the welding wire close to a position within the weld joint (by hand for a cobot, or via teach pendant for a robot), then lets the cobot/robot perform a search routine to achieve a more finely tuned positioning within the weld joint (the robot knows the joint type and other parameters from the database, and performs a corresponding search strategy). This is done for each position across the weld joint.
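The query-then-search flow described above can be sketched as follows. The database contents, the parameter names, and the downward touch-probing strategy are all hypothetical stand-ins: `probe(xy)` represents wire touch sensing (true on contact with a plate), and a real system would select among several search strategies based on the joint type returned by the database.

```python
WELD_DB = {  # hypothetical "weld database" keyed by joint type
    "fillet": {"work_angle_deg": 45.0, "travel_angle_deg": 10.0,
               "stick_out_mm": 15.0, "search": "touch_sense"},
}

def touch_search(rough_xy, probe, step=0.5, max_steps=40):
    """Refine a rough position by stepping the wire toward the joint until contact.

    `probe(xy)` stands in for wire touch sensing: True when the wire
    touches a plate at that position.
    """
    x, y = rough_xy
    for _ in range(max_steps):
        if probe((x, y)):
            return (x, y)
        y -= step  # move the wire down toward the joint root
    return None

def locate_weld(joint_type, rough_xy, probe):
    """Query the weld database, then run the corresponding search strategy."""
    params = WELD_DB[joint_type]             # basic parameters for this joint type
    refined = touch_search(rough_xy, probe)  # fine-tune the rough position
    return refined, params
```

Repeating `locate_weld` at each position across the joint yields the finely tuned weld path described above.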
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22839026.6A EP4433882A1 (en) | 2021-11-17 | 2022-11-15 | Robot with smart trajectory recording |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163280289P | 2021-11-17 | 2021-11-17 | |
US63/280,289 | 2021-11-17 | ||
US17/880,802 US20230150131A1 (en) | 2021-11-17 | 2022-08-04 | Robot with smart trajectory recording |
US17/880,802 | 2022-08-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023091406A1 true WO2023091406A1 (en) | 2023-05-25 |
Family
ID=84829641
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2022/049954 WO2023091406A1 (en) | 2021-11-17 | 2022-11-15 | Robot with smart trajectory recording |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023091406A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150094855A1 (en) * | 2012-05-04 | 2015-04-02 | Leoni Cia Cable Systems Sas | Imitation learning method for a multi-axis manipulator |
US9833857B2 (en) | 2011-01-10 | 2017-12-05 | Fronius International Gmbh | Method for teaching/testing a motion sequence of a welding robot, welding robot and control system for same |
US9919424B1 (en) * | 2015-07-27 | 2018-03-20 | X Development Llc | Analog control switch for end-effector |
US20190086907A1 (en) * | 2016-04-12 | 2019-03-21 | Universal Robots A/S | Programming a robot by demonstration |
US20200139474A1 (en) | 2017-06-26 | 2020-05-07 | Fronius International Gmbh | Method and Device for Scanning a Workpiece Surface of a Metal Workpiece |
- 2022-11-15: WO PCT/US2022/049954, patent WO2023091406A1/en, active, Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10737396B2 (en) | Method and apparatus for robot path teaching | |
JP5784670B2 (en) | Method, apparatus, and system for automated motion for medical robots | |
US8315738B2 (en) | Multi-arm robot system interference check via three dimensional automatic zones | |
US11648683B2 (en) | Autonomous welding robots | |
US9625899B2 (en) | Teaching system, robot system, and teaching method | |
US10394216B2 (en) | Method and system for correcting a processing path of a robot-guided tool | |
US20230150131A1 (en) | Robot with smart trajectory recording | |
CN107848117B (en) | Robot system and control method | |
US20230234230A1 (en) | Robot with smart trajectory recording | |
JP7259860B2 (en) | ROBOT ROUTE DETERMINATION DEVICE, ROBOT ROUTE DETERMINATION METHOD, AND PROGRAM | |
KR101947160B1 (en) | Coding education method using augmented reality | |
WO2023091406A1 (en) | Robot with smart trajectory recording | |
US20240308070A1 (en) | Robot with smart path planning for multiple parts | |
WO2024206060A1 (en) | Robot with smart trajectory recording | |
WO2022259600A1 (en) | Information processing device, information processing system, information processing method, and program | |
CN118664581A (en) | Robot for intelligent path planning for multiple parts | |
BR102024004918A2 (en) | PATH PLANNING METHOD USING A COLLABORATIVE ROBOTIC SYSTEM AND COLLABORATIVE ROBOTIC WELDING SYSTEM FOR PATH PLANNING | |
KR102007491B1 (en) | Method for providing coding training using virtual robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22839026 Country of ref document: EP Kind code of ref document: A1 |
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112024008977 Country of ref document: BR |
WWE | Wipo information: entry into national phase |
Ref document number: 2022839026 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2022839026 Country of ref document: EP Effective date: 20240617 |
ENP | Entry into the national phase |
Ref document number: 112024008977 Country of ref document: BR Kind code of ref document: A2 Effective date: 20240507 |