WO2024064281A1 - Workpiece modification systems and techniques - Google Patents

Workpiece modification systems and techniques

Info

Publication number
WO2024064281A1
Authority
WO
WIPO (PCT)
Prior art keywords
workpiece
image
bead
toolpath
equipment
Application number
PCT/US2023/033377
Other languages
English (en)
Inventor
Zhijie Zhu
Erich A. MIELKE
Original Assignee
3M Innovative Properties Company
Application filed by 3M Innovative Properties Company
Publication of WO2024064281A1


Classifications

    • B25J 9/1697: Programme-controlled manipulators; programme controls using sensors other than normal servo-feedback (perception control, multi-sensor controlled systems, sensor fusion); vision controlled systems
    • B25J 9/1664: Programme controls characterised by programming/planning systems for manipulators; motion, path, trajectory planning
    • B25J 9/1684: Programme controls characterised by the tasks executed; tracking a line or surface by means of sensors
    • B25J 11/00: Manipulators not otherwise provided for
    • B05B 12/084: Controlling delivery or spray area responsive to the condition of liquid or other fluent material already sprayed on the target, e.g. coating thickness, weight or pattern
    • B05B 12/122: Controlling delivery or spray area responsive to conditions of the ambient medium or target; responsive to presence or shape of the target
    • B05B 12/126: Controlling delivery or spray area responsive to target velocity, e.g. relative velocity between spray apparatus and target
    • B05B 13/0431: Spray heads moved during the spraying operation by robots or articulated arms, e.g. for applying liquid or other fluent material to 3D surfaces
    • B05B 15/16: Arrangements for preventing non-intended contact between spray heads or nozzles and foreign bodies, e.g. nozzle guards
    • B05C 11/1005: Controlling supply (flow or pressure) of liquid or other fluent material to the applying apparatus, responsive to the condition of material already applied to the surface, e.g. coating thickness, weight or pattern
    • B05C 11/1021: Controlling supply of liquid or other fluent material to the applying apparatus, responsive to conditions of the ambient medium or target; responsive to presence or shape of the target
    • B05C 5/0216: Apparatus in which liquid or other fluent material is discharged through an outlet orifice by pressure, for applying material only at particular parts of separate articles by relative movement of article and outlet according to a predetermined path
    • G06T 7/73: Image analysis; determining position or orientation of objects or cameras using feature-based methods
    • G06V 10/757: Image or video pattern matching; matching configurations of points or features
    • G06V 20/64: Scene-specific elements; three-dimensional objects
    • G06V 2201/06: Recognition of objects for industrial automation
    • G05B 2219/36405: Adjust path by detecting path, line with a photosensor
    • G05B 2219/37575: Pre-process, measure workpiece before machining
    • G05B 2219/40425: Sensing, vision based motion planning
    • G05B 2219/40449: Continuous, smooth robot motion
    • G05B 2219/40516: Replanning
    • G05B 2219/40564: Recognize shape, contour of object, extract position and orientation
    • G05B 2219/40609: Camera to monitor end effector as well as object to be handled
    • G05B 2219/40613: Camera, laser scanner on end effector, hand-eye manipulator, local
    • G05B 2219/45013: Spraying, coating, painting
    • G05B 2219/45065: Sealing, painting robot
    • G05B 2219/45104: Laser robot, welding robot
    • G05B 2219/45235: Dispensing adhesive, solder paste, for PCB

Definitions

  • the process comprises capturing an image of the workpiece using image capture equipment; processing the image, using an image processor, to determine at least one of a position or an orientation of the workpiece; obtaining a nominal toolpath; measuring, using an online sensor A, a surface of the workpiece to obtain a workpiece surface measurement; and generating, using a waypoint modifier, an updated toolpath based on at least the nominal toolpath and the position and/or the orientation of the workpiece.
  • the process further comprises generating, using a path planner, a planned path for the workpiece-modifying equipment, based on at least the updated toolpath and the workpiece surface measurement; scanning, using an online sensor B, a modification to the workpiece performed by the workpiece-modifying equipment; determining modifier parameters based on at least one modifier characteristic; and modifying the workpiece by the workpiece-modifying equipment according to the planned path and the modifier parameters.
  • in a dispensing embodiment, the process comprises capturing an image of the workpiece using image capture equipment; processing the image, using an image processor, to determine at least one of a position or an orientation of the workpiece; obtaining a nominal toolpath; measuring, using an online sensor A, a surface of the workpiece to obtain a workpiece surface measurement; and generating, using a waypoint modifier, an updated toolpath based on at least the nominal toolpath and the position and/or the orientation of the workpiece.
  • the process further comprises generating, using a path planner, a planned path for the dispenser, based on at least the updated toolpath and the workpiece surface measurement; scanning, using an online sensor B, a bead of material on the workpiece dispensed by the dispenser; determining dispensing parameters, based on a bead profile; and communicating a control signal to a dispenser controller that causes the dispenser to dispense the material on the workpiece according to the planned path and the dispensing parameters.
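Taken together, the steps above form a closed loop: offline registration, look-ahead surface sensing, path planning, and bead-based parameter correction. The Python skeleton below is a minimal sketch of that loop; every helper is a hypothetical stub standing in for the corresponding unit of this disclosure, not an interface the disclosure defines.

```python
import numpy as np

# Hypothetical stubs standing in for the units of this disclosure; a real
# system would replace each with the corresponding hardware or software unit.
def capture_image():            return np.zeros((480, 640))   # image capture equipment
def estimate_part_pose(image):  return np.eye(4)              # image processor: 4x4 part pose
def scan_surface_profile():     return np.zeros(100)          # online sensor A (look-ahead)
def scan_bead_profile():        return np.zeros(100)          # online sensor B (trailing)

def run_cycle(nominal_toolpath, execute_segment, update_dispense_params):
    """One dispense cycle: register the part offline, then stream waypoints
    with look-ahead surface sensing and trailing bead-based correction."""
    pose = estimate_part_pose(capture_image())          # offline registration
    toolpath = [pose @ wp for wp in nominal_toolpath]   # waypoint modifier

    for waypoint in toolpath:
        surface = scan_surface_profile()    # online localization input
        # ...waypoint would be refined here to keep standoff from `surface`...
        execute_segment(waypoint)           # path planner + robot motion controller
        bead = scan_bead_profile()          # scan the freshly dispensed bead
        update_dispense_params(bead)        # closed-loop parameter adjustment

# Example invocation with homogeneous waypoints and no-op callbacks.
run_cycle([np.array([0.1, 0.0, 0.05, 1.0])], lambda wp: None, lambda bead: None)
```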
  • a system comprises image capture equipment configured to capture an image of the workpiece; an online sensor A configured to measure a surface of the workpiece; and an online sensor B configured to scan a modification to the workpiece.
  • the system further comprises a controller comprising: a robot motion controller configured to control a movement of a robot arm; an offline registration unit configured to determine a list of waypoints based on at least the image of the workpiece captured by the image capture equipment, a digital model of the workpiece, and a nominal toolpath; an online localization unit configured to determine an updated list of waypoints based on a stream of measurements of the surface of the workpiece measured by the online sensor A and a stream of joint poses of the robot arm; a path planner unit configured to plan a path based on the list of waypoints and the updated list of waypoints; and a modification equipment controller configured to control a workpiece-modifying equipment according to the planned path and at least one modification parameter.
  • the system additionally comprises a modifier comprising the workpiece-modifying equipment.
  • FIG.1 illustrates a system for modifying a workpiece in which example embodiments can be implemented.
  • FIG.1A illustrates an offline registration.
  • FIG.1B illustrates a path planner.
  • FIG.2 illustrates another system for modifying a workpiece in which example embodiments can be implemented.
  • FIG.2A illustrates another offline registration.
  • FIG.2B illustrates another path planner.
  • FIG.3 illustrates a schematic of a portion of a modifying system in which example embodiments can be implemented.
  • FIG.4 illustrates a schematic of coordinate systems for a portion of a modifying system in which example embodiments can be implemented.
  • FIG.5 illustrates a method for modifying a workpiece in which example embodiments can be implemented.
  • FIG.6 illustrates an example system for modifying a workpiece in accordance with embodiments herein.
  • FIGS.7A–E illustrate schematics of feature correspondences between an image and a computer-aided design (CAD) file in accordance with embodiments herein.
  • FIG.8 is a graph of mean pixel error versus iteration in accordance with an embodiment described herein.
  • FIGS.9A–B illustrate schematics of a laser line with which example embodiments can be implemented.
  • FIG.10A illustrates a schematic of a format of a toolpath in accordance with an embodiment described herein.
  • FIG.10B illustrates a schematic of an adjusted waypoint location based on a surface profile scan in accordance with an embodiment described herein.
  • FIGS.11A–C illustrate XYZ components of a robot TCP trajectory in accordance with an embodiment described herein.
  • FIG.11D illustrates a 3D plot of the robot TCP trajectory in accordance with an embodiment described herein.
  • FIG.12A illustrates a schematic of online orientation specifying an axial direction of a nozzle in which example embodiments can be implemented.
  • FIG.12B illustrates a schematic of online orientation specifying rotation around the nozzle axis in which example embodiments can be implemented.
  • FIG.13 illustrates a schematic of an inline surface normal estimation in accordance with an embodiment described herein.
  • FIGS.14A–B illustrate schematics of orientation adjustment with non-perpendicularity with which example embodiments can be implemented.
  • FIG.15 illustrates computed surface normal vectors along a robot toolpath of an embodiment described herein.
  • FIGS.16A–B illustrate schematics showing an observability issue when TCP follows the tangent direction of the toolpath.
  • FIG.17A illustrates a schematic of a lack of in-plane rotation around a nozzle axis.
  • FIG.17B illustrates a schematic of in-plane rotation around a nozzle axis for optimal field of view along the toolpath.
  • FIG.18 illustrates a schematic showing an adjustment of waypoint location based on the midpoint of rib features within the surface profile scan in accordance with an embodiment described herein.
  • FIG.19 is a photograph of a test fixture having a curvilinear surface and parallel ribs in accordance with an embodiment described herein.
  • FIG.20 includes graphs of recorded robot trajectory for rib feature tracking when rib feature tracking was applied in accordance with an embodiment described herein.
  • FIG.21 illustrates a schematic showing adjustment of a waypoint location based on the edge point within a surface profile scan in accordance with an embodiment described herein.
  • FIG.22 is a photograph of a test fixture having a curved edge in accordance with an embodiment described herein.
  • FIGS.23A–C include graphs of recorded robot trajectory for edge feature tracking of XYZ components in accordance with an embodiment described herein.
  • FIG.23D includes a graph of recorded robot trajectory for edge feature tracking that is a 3D plot of TCP trajectory in accordance with an embodiment described herein.
  • FIG.24 illustrates a schematic of a method of modifying process parameters with which example embodiments can be implemented.
  • FIG.25 illustrates a workpiece modification system in an example network architecture.
  • FIGS.26–28 illustrate example computing devices that can be used in embodiments herein.
  • FIG.29 illustrates an example system of this disclosure.
  • FIG.30 illustrates a non-limiting example of a closed-loop robotic dispensing testbed of this disclosure.
  • FIG.31 is a block diagram illustrating an example closed-loop robotic dispensing according to aspects of this disclosure.
  • FIG.32 is a schematic showing an example format of a toolpath for the closed-loop robotic dispensing techniques of this disclosure.
  • FIGS.33 illustrate bead location detection according to aspects of this disclosure.
  • FIGS.34 illustrate bead geometry estimation on two scan profile samples.
  • FIG.35 is a graph illustrating a log of time consumption for each computation cycle during a closed-loop dispensing test run.
  • FIG.36 is a schematic showing an example of how variation of waypoint position and tool velocity can affect time parameterization along a toolpath.
  • FIGS.37 illustrate schematics showing the proposed time re-parametrization strategy.
  • FIGS.38 illustrate the effectiveness of the path streaming technique of this disclosure with time re-parametrization for closed-loop dispensing with variable velocity.
  • FIGS.39 illustrate results of bead shape compensation with a naïve control law.
  • FIGS.40 illustrate one or more potential issues with the naïve control law for bead shape compensation.
  • FIGS.41 illustrate a transient state of bead width and tool velocity along the dispense path.
  • FIGS.42 illustrate schematics of steady state checkers.
  • FIG.43 illustrates a tool velocity profile in the upper plot and a bead width profile in the lower plot along a straight-line dispense path in a test run with bead shape control.
  • FIGS.44 illustrate experimental results with desired bead widths of 10 mm (FIG.44A), 8 mm (FIG.44B), and 5 mm (FIG.44C).
  • FIGS.45 illustrate experimental results with control gains of 0.2 (FIG.45A), 0.5 (FIG.45B), and 0.8 (FIG.45C).
  • FIGS.46 illustrate experimental results with wait time at the beginning of dispense process being 2 seconds (FIG.46A) and 3 seconds (FIG.46B).
  • FIG.47 shows experimental results of dispensing on a relatively flat substrate (FIG.47A) and a more curved substrate (FIG.47B) with online part shape and bead shape compensation.
  • the present disclosure relates to methods and systems for modifying a workpiece, for instance using closed-loop feedback control of robot trajectory based on sensory feedback of surface geometry.
  • the methods and/or systems are used to achieve a higher level of process autonomy and higher dispensing quality for dispensing a material (e.g., an adhesive, a sealant, a paint, a thermally conductive material, etc.) on parts having complex geometry.
  • a “workpiece” is meant to include any object being worked on with a tool or machine, and the term may be used interchangeably with “part” herein. For certain tasks, a highly precise registration of the location of a workpiece is needed for successful modification of the workpiece.
  • FIG.1 illustrates a system for modifying a workpiece in which example embodiments can be implemented.
  • the system 100 includes main components of a controller 110, an inspection system 120, and a modifier 130.
  • System 100 is illustrated in FIG.1 as in communication with a data store 140. However, it is expressly contemplated that, in some embodiments, data store 140 may be local to, or integrated into system 100.
  • system 100 is illustrated as projecting to a display 10. However, it is expressly contemplated that the system may be integrated into a processor of a device that includes display 10.
  • the system 100 may be implemented by one or more suitable computing devices in communication with each of these main components.
  • the controller 110 comprises an offline registration unit 111 that receives as inputs at least an image of a workpiece taken by image capture equipment 121, a digital model of the workpiece (e.g., workpiece model 142 from data store 140), a predefined modification toolpath (e.g., nominal path 141) on the workpiece model, and a tool center point (TCP) pose from a robot motion controller 116 (e.g., controlling an articulated arm of the robot).
  • the controller 110 also processes an image of a workpiece that includes a part of the robot (e.g., an end effector, such as a workpiece modification tool), and a digital model of at least a portion of the robot 153.
  • the offline registration unit 111 outputs a spatially transformed toolpath (in the format of a list of waypoints) based on the estimated position and orientation of the workpiece in the robot coordinate system. These waypoints may be sent to a path planner unit 113. In some cases, the waypoints may be sent to an optional predictive model 148 and/or an optional information database 149 for future use/reference. For example, referring to FIG.1A, there are optionally several units within the offline registration unit 111.
  • a nominal toolpath retriever 111a retrieves a nominal toolpath 141 (e.g., from the data store 140).
  • An image retriever 111b retrieves the image from the image capture equipment 121.
  • An image processor 111c processes the image.
  • Processing may include at least part registration (i.e., determining position and/or orientation) based on the captured images, feature detection (e.g., of features such as an edge, a rib, a seam, a corner, a fiducial marker, etc.) both in the image and on the surface model, and 3D pose estimation based on corresponding features (a sketch of the pose-estimation step appears below).
  • a waypoint modifier 111d generates an updated toolpath based on the (e.g., processed) image of the workpiece and the nominal toolpath, plus one of a set of joint positions of a robot arm (e.g., from which at least one TCP pose can be determined) or an image of an end effector of the workpiece-modifying equipment 131.
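One conventional way to realize the 3D pose-estimation step mentioned above, assuming a calibrated pinhole camera and known 2D-3D feature correspondences (image features matched to model features), is a Perspective-n-Point solve. The sketch below uses OpenCV's solvePnP on illustrative synthetic correspondences; none of the numeric values come from this disclosure.

```python
import numpy as np
import cv2

# Known 3D feature locations on the workpiece model (e.g., CAD corners), in meters.
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.1, 0.0, 0.0],
                          [0.1, 0.1, 0.0],
                          [0.0, 0.1, 0.0]], dtype=np.float64)

# Matching 2D feature detections in the image, in pixels (illustrative values).
image_points = np.array([[320.0, 240.0],
                         [420.0, 242.0],
                         [418.0, 342.0],
                         [322.0, 340.0]], dtype=np.float64)

# Intrinsic matrix from standard camera calibration (fx, fy, cx, cy).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)   # rotation of the part in the camera frame
print("part position in camera frame (m):", tvec.ravel())
```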
  • the controller 110 further comprises an online localization unit 112 that receives as inputs at least workpiece surface profiles streamed in real-time from an online sensor A 122 and TCP poses streamed in real-time from the motion controller of the articulated robot arm.
  • the online localization unit 112 outputs the updated waypoint locations that trace the actual workpiece surface with desired clearance from the workpiece-modifying equipment 131.
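As a toy illustration of that clearance-keeping behavior, the sketch below shifts a waypoint's height so the tool stays a fixed standoff above the surface height measured by the look-ahead sensor. The profile format, the choice of the center sample, and the standoff value are all simplifying assumptions, not details of this disclosure.

```python
import numpy as np

def adjust_waypoint_height(waypoint, profile_z, standoff=0.005):
    """Shift a waypoint's z so the tool keeps `standoff` meters above the
    surface height measured at the profile center (a simplifying assumption).
    waypoint: (x, y, z) in the robot base frame; profile_z: heights from sensor A."""
    surface_z = profile_z[len(profile_z) // 2]   # sample under the toolpath
    x, y, _ = waypoint
    return np.array([x, y, surface_z + standoff])

# Example: the measured surface sits 2 mm higher than the nominal plan expected.
profile = np.full(100, 0.052)                    # meters, from online sensor A
print(adjust_waypoint_height(np.array([0.4, 0.1, 0.055]), profile))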
  • the controller 110 also comprises a path planner unit 113 that receives as inputs at least the list of waypoint locations from the online localization unit 112, and outputs a smooth robot trajectory (e.g., minimum-jerk) for a modification equipment controller 114 to execute.
  • “Jerk” is defined as the third derivative of the joint position trajectory, which is minimized throughout the path to achieve smoothness.
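For reference, the rest-to-rest minimum-jerk profile between two positions follows the quintic 10τ^3 - 15τ^4 + 6τ^5 in normalized time τ, which has zero velocity and acceleration at both endpoints. The sketch below interpolates along that profile; it illustrates the minimum-jerk idea, not the specific planner of this disclosure.

```python
import numpy as np

def min_jerk(q0, q1, duration, n=50):
    """Rest-to-rest minimum-jerk interpolation between positions q0 and q1:
    position follows 10*t^3 - 15*t^4 + 6*t^5 in normalized time, which
    minimizes the integrated squared jerk over the segment."""
    tau = np.linspace(0.0, 1.0, n)
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return q0 + np.outer(s, (q1 - q0)), tau * duration

# Example: 2-second Cartesian move; `positions` is an (n, 3) trajectory.
positions, times = min_jerk(np.zeros(3), np.array([0.2, 0.1, 0.05]), duration=2.0)
```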
  • the controller 110 additionally comprises a modification equipment controller 114 that applies modification parameters 144 exported from a modification model 145 and controls the on/off state of the workpiece-modifying equipment 131.
  • a waypoint retriever 113a retrieves a current set of waypoints (e.g., from the data store 140).
  • a path retriever 113b retrieves an actual path of the robot arm across the workpiece surface from the online sensor A 122.
  • a path analyzer 113c compares the actual path with the waypoints. Based on the comparison, a waypoint modifier 113d calculates a new set of waypoints.
  • a robot motion controller 116 may receive input from the path planner 113 and send instructions to the workpiece-modifying equipment 131 to move the workpiece-modifying equipment 131 along the updated toolpath, e.g., as provided by the waypoint modifier 113d.
  • the controller 110 optionally also comprises a graphical user interface (GUI) generator 115, which may be configured to send information to a display 10.
  • the GUI generator 115 may generate a graphical user interface for display on a display component 10 based on some or all of the information gathered or generated by the controller 110. Suitable displays include for instance and without limitation, a computer screen, a smart phone, or some other user device.
  • Other units 117 may further be included in the controller 110.
  • the controller 110 is described as having the functionality of receiving and sending communicable information to and from other devices. This may be done through an application program interface, for example, such that the controller 110 can receive and communicate with any units and/or models within each of the inspection system 120, the modifier 130, and the data store 140.
  • the inspection system 120 comprises image capture equipment 121 that captures an image of the workpiece and outputs the image to the offline registration unit 111. Any suitable equipment may be employed that captures an image, for instance and without limitation, a red, blue, and green (RGB) camera, a black and white (B&W) camera, or a three-dimensional (3D) image sensor.
  • the inspection system 120 also comprises an online sensor A 122 that may be attached to a surface of the workpiece-modifying equipment 131 and that performs online scanning of a surface of a workpiece ahead of the workpiece-modifying equipment 131.
  • the inspection system 120 further comprises an online sensor B 123 that may be attached to a different (e.g., opposite) surface of the workpiece-modifying equipment 131 and that performs online scanning of a modification to the workpiece performed by the workpiece-modifying equipment 131.
  • Each of the online sensor A 122 and the online sensor B 123 senses in situ and obtains data in (e.g., near) real-time conditions.
  • any suitable equipment for each of the online sensors A and B may be employed that scans, for instance and without limitation, equipment independently selected from a laser profilometer, an area snapshot sensor, a triangulation-based sensor, a time-of-flight sensor, a laser point sensor, an optical coherence tomography sensor, a confocal sensor, and a dynamic vision sensor.
  • a laser profilometer is preferred as an online sensor because it scans faster and more simply than point-cloud sensors, which provide data in 2.5D rather than the two dimensions of the laser profilometer.
  • Other units 124 may further be included in the inspection system 120.
  • the modifier 130 comprises a workpiece-modifying equipment 131.
  • any suitable equipment that is configured to modify a workpiece may be employed in the system, for instance and without limitation, a dispenser, a sander, a polisher, a cutter, a drill, a sprayer, a welder, etc.
  • a dispenser is employed and will be discussed in detail below.
  • the modifier 130 may be a robotic adhesive dispensing unit with a robot arm having a dispenser (e.g., 131).
  • the current status may be a TCP pose received from the robot motion controller 116 of the controller 110.
  • the data store 140 is configured to communicate with the controller 110, the inspection system 120, and the modifier 130.
  • the data store 140 may be local to the controller 110 or may be accessible through a cloud-based network.
  • Although the controller 110 is illustrated in FIG.1 as local to the system 100, it is expressly contemplated that the controller 110 may be remote from the system 100 and may receive signals, and send commands, using a wireless or cloud-based network.
  • the data store 140 comprises a nominal path unit 141 that contains a predefined toolpath for modifying the workpiece.
  • the toolpath is defined on a surface model 142, which may be a computer aided design (CAD) model, a depth image, a point cloud, or other model, which is also included in the data store 140 or otherwise retrievable.
  • the data store 140 additionally comprises a modification parameters unit 144 that contains information about measurable factors of the workpiece-modifying equipment (e.g., speed, distance from the workpiece surface, angle from the workpiece surface, temperature, rate of deposition/spraying, etc.).
  • the data store 140 further comprises a modifier characteristics unit 143 that contains information about properties that can be sensed by the online sensor B 123 of whatever material is used to modify the workpiece (e.g., size, shape, location, etc.).
  • the data store 140 comprises a material characteristics unit 146 that contains information about physical characteristics of whatever material is used to modify the workpiece (e.g., chemical composition, state of matter, temperature, viscosity, Mohs hardness, sharpness, adhesiveness, color, drill bit size, etc.).
  • the data store 140 additionally comprises a workpiece characteristics unit 147 that contains information about properties of the workpiece (e.g., size, shape, material composition, etc.).
  • the data store 140 optionally further comprises an information database 149 that contains any additional relevant information for access by any of the units in the data store 140 or in the controller 110.
  • the data store 140 optionally also comprises a predictive model 148 that receives an input of at least historic information from prior modification involving the same or similar workpieces and/or modifiers.
  • the model 148 outputs a modification model to the modification equipment controller 114 to assist in rapidly responding to discrepancies between planned paths/modification and actual paths/modification.
  • the data store 140 optionally also comprises a machine learning unit 151 that is configured to forecast process parameters for modification of a workpiece, such as inputting to a modification model 145.
  • the data store 140 also comprises a modification model 145 that receives as inputs at least modifier characteristics 143 streamed in situ from the online sensor B 123.
  • the modification model also receives as an input the template (i.e., desired) modifier characteristics defined by the user.
  • the modification model 145 may also receive information from one or more of the material characteristics unit 146, the workpiece characteristics unit 147, the predictive model 148, the information database 149, or other units 154.
  • the modification model 145 outputs modification parameters 144 to the modification equipment controller 114 to provide closed-loop feedback for prompt adjustment of the modification profile, in order to correct the errors between the actual modifier characteristics and the template modifier characteristics.
  • the data store 140 further comprises an image capture calibration parameters unit 152 that contains information that the offline registration unit 111 uses to calibrate the image capture equipment 121 prior to capturing any images.
  • the data store 140 additionally comprises a robot model unit 153 that contains a digital model of at least a portion of the robot, for instance a robot arm and/or an end effector of the robot that is configured to modify a workpiece (e.g., the workpiece-modifying equipment).
  • the robot model unit 153 contains a digital model of the entire robot.
  • FIG.2 illustrates another system for modifying a workpiece in which example embodiments can be implemented. More particularly, FIG.2 illustrates an exemplary system in which a bead of adhesive is deposited on a workpiece.
  • the system includes main components of a controller 210, an inspection system 220, a dispenser 230, and a data store 240.
  • the system may be implemented by one or more suitable computing devices in communication with each of these main components.
  • the controller 210 comprises an offline registration unit 211 that receives as inputs at least an image of a workpiece taken by image capture equipment 221, a digital model of the workpiece (e.g., CAD model 242), a predefined modification toolpath (e.g., nominal path) on the workpiece model, and a TCP pose from a robot motion controller 216 (e.g., controlling an articulated arm of the robot).
  • the controller 210 also processes an image of a workpiece that includes part of the robot (e.g., nozzle of the dispenser), and a digital model of at least a portion of the robot 253.
  • the offline registration unit 211 outputs a spatially transformed toolpath (in the format of a list of waypoints) based on the estimated position and orientation of the workpiece in the robot coordinate system. These waypoints may be sent to a path planner unit 213. In some cases, the waypoints may be sent to an optional predictive model 248 and/or an optional information database 249 for future use/reference. For example, referring to FIG.2A, there are optionally several units within the offline registration unit 211.
  • a nominal toolpath retriever 211a retrieves a nominal toolpath 241 (e.g., from the data store 240).
  • An image retriever 211b retrieves the image from the image capture equipment 221.
  • An image processor 211c processes the image.
  • Processing may include at least part registration (i.e., determining position and/or orientation) based on the captured images, feature detection (e.g., of features such as an edge, a rib, a seam, a corner, a fiducial marker, etc.) both in the image and on the surface model, and 3D pose estimation based on corresponding features.
  • a waypoint modifier 211d generates an updated toolpath based on the (e.g., processed) image of the workpiece and the nominal toolpath, plus one of a set of joint positions of a robot arm (e.g., from which at least one TCP pose can be determined) or an image of an end effector (e.g., nozzle) of the dispenser 230.
  • the controller 210 further comprises an online localization unit 212 that receives as inputs at least workpiece surface profiles streamed in real-time from an online sensor A 222, TCP poses streamed in real-time from the motion controller of the articulated robot arm, and the waypoint list from the offline registration unit 211.
  • the online localization unit 212 outputs the updated waypoint locations that trace the actual workpiece surface with desired clearance from the workpiece-modifying equipment 231.
  • the controller 210 also comprises a path planner unit 213 that receives as inputs at least the list of waypoint locations from the online localization unit 212, and outputs a smooth robot trajectory (e.g., jerk-free) to a buffer for a dispenser controller 214 to execute.
  • a waypoint retriever 213a retrieves a current set of waypoints (e.g., from the data store 240).
  • a path retriever 213b retrieves an actual path of the robot arm across the workpiece surface from the online sensor A 222.
  • a path analyzer 213c compares the actual path with the waypoints. Based on the comparison, a waypoint modifier 213d calculates a new set of waypoints.
  • the controller 210 additionally comprises a dispenser controller 214 that applies dispensing parameters 244 exported from a dispensing model 245 and controls the on/off state of the dispenser 230.
  • the controller 210 optionally also comprises a graphical user interface (GUI) generator 215, which may be configured to send information to a display 20.
  • the GUI generator 215 may generate a graphical user interface for display on the display 20 based on some or all of the information gathered or generated by the controller 210. Suitable displays include for instance and without limitation, a computer screen, a smart phone, or some other user device.
  • Other units 217 may further be included in the controller 210.
  • the controller 210 is described as having the functionality of receiving and sending communicable information to and from other devices. This may be done through an application program interface, for example, such that the controller 210 can receive and communicate with any units and/or models within each of the inspection system 220, the dispenser 230, and the data store 240.
  • the inspection system 220 comprises image capture equipment 221 that captures an image of the workpiece and outputs the image to the offline registration unit 211. Any suitable equipment may be employed that captures an image, for instance and without limitation, an RGB camera, a B&W camera, or a 3D image sensor.
  • the inspection system 220 also comprises an online sensor A 222 that may be attached to a surface of the workpiece-modifying equipment 231 and that performs online scanning of a surface of a workpiece ahead of the workpiece-modifying equipment 231.
  • the inspection system 220 further comprises an online sensor B 223 that may be attached to a different (e.g., opposite) surface of the dispenser 230 and that performs online scanning of a bead of adhesive dispensed on the workpiece by the dispenser 230.
  • Each of the online sensor A 222 and the online sensor B 223 senses in situ and obtains data in (e.g., near) real-time conditions.
  • Any suitable equipment for each of the online sensors A and B may be employed that scans, for instance and without limitation, equipment independently selected from a laser profilometer, an area snapshot sensor, a triangulation-based sensor, a time-of-flight sensor, a laser point sensor, an optical coherence tomography sensor, a confocal sensor, and a dynamic vision sensor.
  • a laser profilometer is preferred as an online sensor because it scans faster and more simply than point-cloud sensors, which provide data in 2.5D rather than the two dimensions of the laser profilometer or laser point sensor.
  • Other units 224 may further be included in the inspection system 220.
  • the dispenser 230 comprises any suitable dispenser of an adhesive material (e.g., including an extruder and a nozzle from which a bead of adhesive is deposited), for instance as described in more detail in at least FIGS.3, 12, and 14.
  • one suitable dispenser is as described in detail in PCT Publication No. WO 2020/174394 (Napierala et al.), incorporated herein by reference in its entirety.
  • the data store 240 is configured to communicate with the controller 210, the inspection system 220, and the dispenser 230.
  • the data store 240 may be local to the controller 210 or may be accessible through a cloud-based network.
  • the data store 240 comprises a nominal path unit 241 that contains a predefined toolpath for modifying the workpiece.
  • the toolpath is defined on a CAD model 242, which is also included in the data store 240 or otherwise retrievable.
  • the data store 240 additionally comprises a dispensing parameters unit 244 that contains information about measurable factors of the dispenser (e.g., speed, distance from the workpiece surface, angle from the workpiece surface, temperature, rate of deposition, etc.).
  • the data store 240 further comprises a bead profile unit 243 that contains information about adhesive bead properties that can be sensed by the online sensor B 223 (e.g., size, shape, location, etc.).
  • the data store 240 comprises a material characteristics unit 246 that contains information about physical characteristics of the adhesive material (e.g., chemical composition, state of matter, temperature, viscosity, adhesiveness, color, etc.).
  • the data store 240 additionally comprises a workpiece characteristics unit 247 that contains information about properties of the workpiece (e.g., size, shape, material composition, etc.).
  • the data store 240 optionally further comprises an information database 249 that contains any additional relevant information for access by any of the units in the data store 240 or in the controller 210.
  • the data store 240 optionally also comprises a predictive model 248 that receives as an input historic information from prior modification involving the same or similar workpieces and/or adhesives.
  • the predictive model 248 outputs a modification model to the dispenser controller 214 to assist in rapidly responding to discrepancies between planned paths/modification and actual paths/modification.
  • the data store 240 optionally also comprises a machine learning unit 251 that is configured to forecast process parameters for dispensing on a workpiece, such as inputting to a dispensing model 245.
  • the data store 240 also comprises a dispensing model 245 that receives as inputs at least the bead profile 243 streamed in situ from the online sensor B 223.
  • the dispensing model also receives as an input the template (i.e., desired) bead characteristics defined by the user.
  • the dispensing model 245 may also receive information from one or more of the material characteristics unit 246, the workpiece characteristics unit 247, the predictive model 248, the information database 249, or other units 255.
  • the dispensing model 245 outputs dispensing parameters 244 to the dispenser controller 214 to provide closed-loop feedback for prompt adjustment of the adhesive bead profile, in order to correct the errors between the actual adhesive bead characteristics and the template adhesive bead characteristics.
  • the data store 240 further comprises an image capture calibration parameters unit 252 that contains information that the offline registration unit 211 uses to calibrate the image capture equipment 221 prior to capturing any images.
  • the data store 240 additionally comprises a robot model unit 253 that contains a digital model of at least a portion of the robot, for instance a robot arm and/or an end effector of the robot that is configured to modify a workpiece (e.g., the dispenser).
  • the robot model unit 253 contains a digital model of the entire robot.
  • Other units 254 may further be included in the data store 240.

Modification of dispensing process parameters

  • As an example, and not by way of limitation, in one embodiment the material modifier is an adhesive dispenser dispensing adhesive on a surface.
  • Adhesives have different properties based on ambient conditions, making it difficult to know exactly what speed to move a dispenser (i.e., the speed of movement of the dispenser through space), what force to apply to the material to achieve a desired volumetric flow rate, what temperature to heat one or more components to, etc.
  • When the adhesive is a two-part (or more) mixture, each component presents these concerns. While some products may have a high tolerance for variation, others, such as airplane components, require precise adhesive application to ensure proper function. Therefore, it is important to have a feedback system that can characterize adhesive flow in situ and adjust parameters to achieve the desired adhesive dispensing profile.
  • FIG.24 illustrates a schematic of a method of modifying process (e.g., dispensing) parameters with which example embodiments can be implemented.
  • FIG.24 includes incorporating information of at least one of equipment (e.g., robot arm) velocity 2410, distance of the dispenser from a workpiece 2420, or volumetric flow rate 2430, for varying process parameters (2440).
  • the method further includes detecting the effect(s) on bead shape (2450) as a result of varying the process parameters (2440) and selecting new process parameters (2460).
  • the method implements at least one repetition of a loop between varying process parameters (2440) and detecting the effect(s) on bead shape (2450), e.g., by using online sensor B.
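A minimal way to close such a loop, under the rough assumption that bead width shrinks as tool speed grows at constant volumetric flow, is a proportional velocity update driven by the width error measured by online sensor B. The gain, limits, and flow model below are illustrative assumptions, not parameters from this disclosure.

```python
def update_tool_velocity(v, width_measured, width_desired, gain=0.5,
                         v_min=0.005, v_max=0.2):
    """Proportional correction: a too-wide bead speeds the tool up, a
    too-narrow bead slows it down (constant volumetric flow assumed).
    Velocities in m/s, widths in mm; limits keep the command feasible."""
    error = (width_measured - width_desired) / width_desired
    v_new = v * (1.0 + gain * error)
    return min(max(v_new, v_min), v_max)

# Example: measured bead is 12 mm but 10 mm is desired -> speed up by ~10%.
print(update_tool_velocity(v=0.05, width_measured=12.0, width_desired=10.0))
```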
  • the bead is a bead of adhesive (e.g., pressure sensitive adhesive, structural adhesive, etc.), although other materials are expressly contemplated to be dispensed in a form of a bead on a workpiece.
  • Machine learning may be employed to assist in varying the process parameters (2440), selecting new process parameters (2460), or both. Machine learning is described below in more detail.

Offline registration of part location

  • Referring to FIG.3, a schematic is provided of a portion of a modifying system in which example embodiments can be implemented.
  • FIG.3 includes an illustration of a portion of a system in use.
  • the system includes a dispenser 330, an online sensor A 322, an online sensor B 323, and image capture equipment 321.
  • the system is shown as a snapshot in time during a process of dispensing a bead of adhesive 334 onto a major surface 362 of a workpiece 360, through a nozzle 332 of the dispenser 330.
  • the arrow shows the direction D that the dispenser 330 is traveling with respect to the workpiece 360.
  • the online sensor A 322 directs a signal 325 at a major surface 362 of the workpiece 360, then receives a return signal 326 that provides information of the actual profile of the major surface 362 of the workpiece 360 to allow for adjustment of the dispensing characteristics if needed before the dispenser arrives at the location the online sensor A 322 has sensed.
  • the online sensor B 323 directs a signal 327 at the bead of adhesive 334 that has been dispensed onto the major surface 362 of the workpiece 360, then receives a return signal 328 that provides information of the actual profile of the bead of adhesive 334 to be used in determining if adjustments are needed to the dispensing parameters.
  • Referring to FIG.4, a schematic of a robotic dispensing system is shown, which consists of a 6-axis robotic arm, a dispensing tool mounted on the tool flange of the robot, and a camera mounted on the dispensing tool.
  • {A}, {B}, and {D} denote the coordinate systems of the robot base, the image capture equipment, and the dispensing tool, respectively.
  • ⁇ E ⁇ denote the coordinate system fixed to the workpiece (user coordinate system).
  • T ⁇ ⁇ R ⁇ For representation of coordinate transformation, 3D translation and rotation from coordinate system 1 to 2 is expressed in a compact form as a transformation matrix T ⁇ ⁇ R ⁇ : ⁇ T ⁇ ⁇ C ⁇ ⁇ ⁇ ⁇ 1 to 2, and ⁇ ⁇ ⁇ ⁇ R ⁇ denote the origin of coordinate system 1 expressed in coordinate system 2.
  • T ⁇ and T ⁇ can be calibrated using standard hand-eye procedures.
  • T ⁇ is the unknown to be estimated based on vision-based localization method, which will be described below.
  • the pose of the workpiece expressed in the base coordinate system T ⁇ (required in toolpath planning algorithm) can then be computed using the chain rules: T ⁇ ⁇ T ⁇ T ⁇ Image capture equipment parameters (intrinsic and extrinsic matrices, distortion coefficients) and the 3D model of the workpiece are assumed known prior to the workpiece localization process.
  • Image capture equipment parameters can be acquired via standard calibration procedures.
  • the 3D model of the workpiece can be acquired from the original CAD design file or via 3D scanning of a sample workpiece.
  • one or more joint positions (e.g., poses) of the workpiece-modifying equipment (e.g., robot) relative to the robot coordinate system can be streamed from the robot controller upon request.
• a body section 422 and several arm sections 424, 426, and 428 may be present, connected via joints 442, 444, and 446 (e.g., the body section 422 and the arm section 424 are connected through the joint 442).
• Manipulating the joint 442 forms an angle θ1 between the body section 422 and the arm section 424; thus, the angle θ1 varies with the particular relative positions of the body section 422 and the arm section 424.
• the arm section 424 and the arm section 426 are connected through the joint 444.
• Manipulating the joint 444 forms a variable angle θ2 between the arm section 424 and the arm section 426.
  • FIG.5 illustrates a process 50 for modifying a workpiece in accordance with one or more aspects of this disclosure. While it will be appreciated that a number of systems may be configured to perform process 50 in accordance with this disclosure, process 50 is described as being performed by the systems illustrated in FIGS.1 and 2 for ease of discussion.
  • FIG.5 provides an exemplary schematic of how various system components may suitably interact and/or be implemented to carry out such a method, although other interactions are expressly contemplated.
  • Process 50 may begin with image capture equipment 121 or 221 capturing images (500).
  • the captured images may include one or more of RGB images 501, B&W images 502, 3D images 503, or other images 504.
  • system 100 or 200 may process the captured images offline (510).
• system 100 or 200 may perform post-processing operations, comprising at least part registration (i.e., determining position and/or orientation) based on the captured images.
• the process also optionally includes feature detection (e.g., of features such as an edge, a rib, a seam, a corner, a fiducial marker, etc.), both in the image and on the surface model, and 3D pose estimation based on corresponding features. Additional post-processing operations may also be performed, such as smoothing, filtering, compression, fisheye correction, etc.
  • Image processing optionally incorporates data from the surface model 511 of the workpiece, e.g., a CAD model, a 3D rendering based on images of an identical part, and/or a previously taken laser scan of this same part.
  • the post-processing further includes searching for feature correspondences between the captured image (e.g., 501-504) and the surface model 511.
  • path planner 113 or 213 generates a toolpath (520).
• the toolpath may be a spatially transformed toolpath in the format of a list of waypoints, based on the estimated position and orientation of the workpiece determined from at least joint positions (e.g., which can be used to determine TCP poses) or a digital model of a robot 521, the nominal path 522, the captured images 501-504, and/or offline processing of the images.
  • an online sensor A 122 or 222 measures workpiece surfaces (530).
  • Process 50 further generates an updated toolpath (540), which incorporates at least measurements of the workpiece surfaces and streamed TCP poses or joint angles 541.
• an online sensor B 123 or 223 provides data in addition to the updated toolpath; based on at least this information, process 50 plans a path (550).
  • Process 50 typically incorporates desired modifier characteristics 561 when outputting modifier parameters (560).
  • modification equipment control and a robot controller 575 cooperatively modify the workpiece (570).
  • the modification equipment control may include, for instance and without limitation, at least one of a dispenser control unit 571, a welding equipment control unit 572, a paint repair equipment control unit 573, or some other control unit 574.
• the actions involved in the process may be carried out in various orders, and some may be repeated numerous times, to provide in situ adjustments of the toolpath, workpiece modification, etc.
  • FIG.6 illustrates an example system for modifying a workpiece, in which some possible interactions between certain components are depicted.
  • a workpiece 602 is depicted as interacting with each of image capture equipment 604, online sensor B 606, modification equipment control unit 608, robot control unit 610, and online sensor A 612.
  • the image capture equipment 604 can interact with offline registration unit 614.
  • Offline registration unit 614 can further interact with an online localization unit 616 and/or the robot control unit 610.
  • the online localization unit 616 can also interact with the robot control unit 610 and/or a path planner unit 618.
  • the path planner unit can also interact with the robot control unit 610.
  • the robot control unit 610 can interact with each of a modification model 620, the workpiece 602, the modification equipment control unit 608, the path planner 618, the online localization unit 616, and/or the offline registration unit 614.
  • the modification model 620 can interact with each of the online sensor B 606, a machine learning unit 622, the modification equipment control unit 608, and/or the robot control unit 610. Based on the sensed location of a surface of the workpiece 602 and on desired modification parameters, each of the path planner unit 618 and the modification equipment control unit 608 can provide input to the robot control unit 610 regarding specific toolpath and modification characteristics when the system is in use to modify the workpiece 602.
• a machine learning unit 622 may be configured to forecast process parameters for modification of a workpiece, such as by providing input to the modification model 620.
• a non-exhaustive list of machine learning techniques that may be used on data obtained from systems or methods of the present disclosure includes: support vector machines (SVM), logistic regression, Gaussian processes, decision trees, random forests, bagging, neural networks, deep neural networks (DNN), linear discriminants, Bayesian models, k-nearest neighbors (k-NN), and the gradient boosting algorithm (GBA).
• the 6D pose (3D translation and 3D rotation) of the target in the image capture system can be estimated based on a Perspective-n-Point (PnP) method applied to 2D-3D correspondences. The minimum number of correspondences for PnP without ambiguity in the solution is four, but more correspondences are preferred in real applications, so that optimization-based PnP methods can be performed for robust localization when noise is present in the system.
• the method to compute 2D-3D correspondences can be a major challenge in this process. Rather than 2D textures or corner features, which can be found on everyday objects such as labeled bottles and cardboard boxes, edge features (e.g., straight or curved) were selected for identification of 2D-3D correspondences, due to the abundance of edges on all categories of workpieces.
  • edge features are sharp changes in pixel intensities. They can be extracted using a Canny edge detector, for instance, which is widely adopted as the default edge detector in image processing, or another suitable model. To achieve automatic edge detection and avoid manual tuning of the thresholds, a parameter-free Canny detector was implemented that determines the lower and upper thresholds based on the median of the image.
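A minimal sketch of such a median-based, parameter-free Canny detector, assuming OpenCV; the sigma value of 0.33 is a common heuristic and an assumption here:

```python
import cv2
import numpy as np

# Median-based (parameter-free) Canny: the lower and upper thresholds are
# derived from the median intensity of the image, as described above.

def auto_canny(gray: np.ndarray, sigma: float = 0.33) -> np.ndarray:
    median = float(np.median(gray))
    lower = int(max(0.0, (1.0 - sigma) * median))
    upper = int(min(255.0, (1.0 + sigma) * median))
    return cv2.Canny(gray, lower, upper)

# Usage: edges = auto_canny(cv2.imread("part.png", cv2.IMREAD_GRAYSCALE))
```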
  • FIG.7A illustrates edge features 710 of a target (e.g., a part) obtained from image capture equipment.
3D model edges
• To find 3D edge features in a CAD model (e.g., “3D model edges”) that correspond to the 2D Canny edges in the image, the edges from the CAD model need to be projected to a 2D image (e.g., “2D model edges”) for subsequent search of correspondences in image space.
  • the model edges can be the result of two effects: (1) edge features based on curvature change in the 3D model (denoted as “edges” here), and (2) edge features based on depth change in the 3D model (denoted as “contours” here).
  • FIG. 7B illustrates edge features 720 of a target (e.g., a part) obtained from a CAD model and FIG.7C illustrates contours 730 of a target obtained from the CAD model.
Edges
• the edges from curvature can be extracted by first computing the angles between the normal directions of each pair of adjacent mesh surfaces in the CAD model (e.g., in the form of a polygon mesh), and then selecting the element edges with angles above a pre-determined threshold (e.g., 90°). Next, given an initial guess of the camera pose, back-face culling is implemented to remove the invisible edges, as they do not correspond to any 2D image edge features; a sketch of the edge-selection step follows.
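A hedged sketch of the curvature-based edge selection, assuming the trimesh library and a placeholder mesh file; the back-face culling step against the camera pose is omitted for brevity:

```python
import numpy as np
import trimesh

# Select mesh element edges whose adjacent-face normals differ by more than a
# threshold (here 90 degrees, per the example above).

mesh = trimesh.load("workpiece.stl")                 # polygon mesh from CAD (placeholder file)

angles = mesh.face_adjacency_angles                  # angle between each pair of adjacent faces (rad)
sharp = angles > np.deg2rad(90.0)                    # pre-determined threshold

sharp_edges = mesh.face_adjacency_edges[sharp]       # vertex-index pairs of the shared edges
edge_points = mesh.vertices[sharp_edges]             # (n, 2, 3) endpoints of 3D edge segments
```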
Contours
• the contours are formed due to rapid depth change in model geometry measured in the image capture equipment coordinate system. The resulting contours in the image plane appear as the boundaries between the workpiece and the background.
  • a digital twin of the image capture equipment is implemented to render the CAD model.
  • the CAD model is masked in black and the background is masked in white, so that the contour edges can be robustly extracted using a Canny edge detector.
  • the 3D coordinates of the extracted 2D contours can then be retrieved based on the depth data from the renderer.
• Various software packages may be utilized for 3D edge extraction and contour detection.
  • these software packages may include application programming interfaces (APIs) and/or other combinations of modules/subroutines.
  • these software packages may incorporate open-source and/or free-and-open-source software libraries.
  • Multiple formats of CAD model are supported, including STL and OBJ.
  • the overall method for workpiece localization based on 2D-3D correspondence includes the following actions: In a user interface, the user selects a few 2D keypoints in the image that correspond to a set of pre-determined 3D keypoints in the CAD model. A schematic image of the CAD model with keypoint labels is also presented to the user to guide manual selection of the 2D keypoints in the image.
  • a pose is estimated based on the 2D-3D correspondences specified by the user. This pose will be utilized as the initial guess of the image capture equipment pose in the following workpiece localization steps (i.e., pose refinement based on the initial guess).
  • New 2D-3D correspondences are computed based on the current image capture equipment pose.
  • the 2D model edges can be either edges from curvatures or edges from contours.
  • points are first sampled along the 2D model edges with constant spacing and then projected to the image capture equipment image. Second, for each sampled point, the corresponding point on the Canny edges can be calculated via 1D search along the direction orthogonal to the 2D model edges.
  • FIG.7D illustrates a schematic of a difference between edge features 720 and contours 730 of a target from a CAD model and edge features 710 of a target determined from the image capture equipment.
  • the pose estimation is updated using PnP-RANSAC method or another suitable method.
  • the computing and the updating are repeated until the pose estimation result converges (e.g., with error below a predetermined threshold).
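A hedged sketch of this refinement loop, assuming OpenCV's solvePnPRansac and a hypothetical helper find_correspondences() that performs the sampling and 1D orthogonal search described above:

```python
import cv2
import numpy as np

# Iterative pose refinement: recompute 2D-3D correspondences at the current
# pose, update the pose with PnP-RANSAC, and stop when reprojection error
# converges below a threshold. `find_correspondences` is a hypothetical helper.

def refine_pose(rvec, tvec, K, dist, find_correspondences, iters=10, tol_px=0.5):
    for _ in range(iters):
        pts3d, pts2d = find_correspondences(rvec, tvec)   # matched model/image points
        pts3d = np.asarray(pts3d, np.float32)
        pts2d = np.asarray(pts2d, np.float32)
        ok, rvec, tvec, inliers = cv2.solvePnPRansac(
            pts3d, pts2d, K, dist,
            rvec=rvec, tvec=tvec, useExtrinsicGuess=True)
        proj, _ = cv2.projectPoints(pts3d, rvec, tvec, K, dist)
        err = np.linalg.norm(proj.reshape(-1, 2) - pts2d, axis=1).mean()
        if err < tol_px:                                  # converged
            break
    return rvec, tvec
```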
  • FIG.7E illustrates a schematic of the 1D searching method to find 2D-3D correspondences in the image.
• the top, angular line 740 denotes the 2D model edges.
  • the dots 750 denote the sampled points along the 2D model edges.
  • the bottom, curved, line 760 denotes the Canny edge.
  • the algorithm was tested using an iPhone 8 (back) camera with 12MP resolution, f/1.8 aperture and autofocus, as the image capture equipment.
  • a printed chessboard was used for calibration of the intrinsic matrix and distortion parameters of the camera.
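A sketch of such a chessboard calibration with OpenCV; the 9×6 pattern size, 25 mm square size, and file names are assumptions for illustration:

```python
import glob
import cv2
import numpy as np

# Standard chessboard calibration: detect inner corners in several images and
# solve for the intrinsic matrix K and distortion coefficients.

pattern, square = (9, 6), 25.0                      # inner corners, square size in mm (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for fname in glob.glob("calib_*.png"):              # placeholder file pattern
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# K and dist feed the workpiece localization pipeline described above
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
```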
  • the workpiece localization algorithm was first tested on a 3D printed block with a wavy top surface. Eight keypoints at the corners of the block were labeled in the CAD model. For user initialization, a guiding image with the labeled keypoints was shown to the user, to aid manual selection of the keypoints in the camera image. Note that not all the keypoints are required to be identified by the user for a successful initialization.
  • the user can pick the keypoints with the best confidence and skip those that are not visible or hard to identify (e.g., the 8th keypoint was skipped in this example).
  • an artificial positioning error with uniform distribution in the range of (-2, 2) mm was added to the workpiece location calculated based on user initialization.
  • Edge features from the CAD model were detected based on edges from contours and edges from curvatures. Given the initial guess of the pose of the workpiece, correspondences between the model edges and the Canny edges were then calculated via bi-directional 1D search in the camera image. Ten iterations of pose refinement were conducted on the testing image of the wavy block.
  • the 3D edges from the CAD model were projected to the camera image based on the pose of the workpiece from the initial guess and the final iteration.
  • Localization error in the image space was characterized by computing the Euclidean distance (L2 norm) from the sampled points along the 2D model edges to their corresponding points along the Canny edges.
  • the mean pixel error of all the sampled points in each iteration is plotted in FIG.8.
  • the methods based on edges and contours both successfully corrected the localization error from user initialization with similar converging speed.
• the residual errors could be due to geometric differences between the CAD model and the 3D printed part, as well as calibration error of the camera focal length.
Online toolpath control based on part surface geometry
• methods according to at least certain embodiments of the present disclosure are advantageously capable of adjusting a robot toolpath in order to have workpiece-modifying equipment (e.g., a dispenser nozzle tip) precisely track a part surface geometry at a desired (e.g., dispensing) angle with respect to the surface normal and a desired gap distance from the part surface.
• the knowns are: (1) the nominal toolpath that is defined on the CAD model of the part and expressed in the robot coordinate system (based on the part registration process described above);
• (2) the two-dimensional surface profiles streamed from an online sensor A (e.g., a laser profilometer mounted on the robot);
• (3) the pose of the online sensor A relative to the TCP coordinate system, which is fixed to the robot.
  • FIG.9A is a top view schematic of a laser line 910 from a laser profilometer that could be used in embodiments of closed-loop part registration.
  • the width d of the laser line may vary, for instance 20 to 100 mm projected onto a part surface from a distance of 100 to 300 mm (although other widths and distances may be suitable).
  • a suitable laser sensor lookahead distance L can be defined by the distance from the TCP to the laser projection plane (e.g., 20 to 100 mm).
  • the laser sensor may be placed high above the TCP tip so that the dimension of the laser sensor does not affect the reachable radius of the TCP tip.
• Referring to FIG.9B, a perspective schematic is provided of an assembly including a laser profilometer 920 attached to a robotic arm 930, directing a laser line 910 ahead of a TCP of a workpiece-modifying equipment 940.
  • a toolpath buffer stores a list of unfinished waypoints within a lookahead distance.
  • a waypoint is extracted from the toolpath buffer and sent to a robot controller via a Real-Time Data Exchange (RTDE) interface.
  • each target waypoint in Cartesian space is transformed to joint positions via an inverse kinematic model, which is then fed to a proportional feedback controller for precise robot motion control.
  • the actual joint positions from the encoders are fed back to the RTDE interface after being transformed back to Cartesian space (i.e., TCP pose) via a forward kinematic model.
  • the laser profilometer streams the part surface profile to a scan data buffer.
  • the latest surface profile is extracted from the scan data buffer and transformed to the TCP coordinate system. This is done within the online localization unit which outputs the transformed surface profile to the path planner unit.
  • the path planner unit then computes the updated waypoints based on the nominal toolpath and the surface profile (to be described in more detail below).
  • the updated waypoints can either be sent to an online path planner for path smoothing or be directly added to the toolpath buffer for future execution of the robot controller.
  • the aforementioned robot controller framework was implemented in Python 3.6 and ran on a Linux virtual machine on a 64-bit Windows 10 laptop (HP ZBook).
• each waypoint w_i is defined as a tuple consisting of a TCP pose p_i and at least one of tool velocity, tool acceleration, timestamp, or some combination of these. If a timestamp is specified, it takes precedence over tool velocity and acceleration. In this example, timestamps were specified.
  • the motion command for each waypoint is sent to the robot via the RTDE library function, which can take as arguments joint positions, timestamps, velocities, accelerations, lookahead time, and gain.
  • the joint positions can be computed from the Cartesian expression p i using an inverse kinematic model.
• Lookahead is a variable that sets the lookahead time in order to smooth the trajectory. Lookahead may be set to 0.03 s, 0.04 s, 0.05 s, 0.06 s, 0.07 s, 0.08 s, 0.09 s, 0.10 s, 0.11 s, 0.12 s, 0.13 s, 0.14 s, 0.15 s, 0.16 s, 0.17 s, 0.18 s, 0.19 s, or 0.20 s, e.g., optionally being within a range from 0.03 to 0.2 s.
  • Gain is the proportional gain for following target position.
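A hedged sketch of streaming buffered waypoints with these arguments, assuming the ur_rtde Python bindings (the document does not name the specific RTDE library); the robot IP, lookahead, and gain values are placeholders:

```python
import time
import rtde_control

# Stream waypoints (pose, dt) from a toolpath buffer as joint-space servo
# commands, using the servoJ arguments described above.

rtde_c = rtde_control.RTDEControlInterface("192.168.1.10")  # robot IP: placeholder

LOOKAHEAD = 0.1   # s, within the 0.03-0.2 s range above (assumed value)
GAIN = 300        # proportional gain for following the target position (assumed)

def execute(toolpath_buffer, ik):
    """Pop waypoints and stream them; `ik` maps a Cartesian pose to joint positions."""
    for pose, dt in toolpath_buffer:
        q = ik(pose)                                   # inverse kinematics
        rtde_c.servoJ(q, 0.0, 0.0, dt, LOOKAHEAD, GAIN)
        time.sleep(dt)                                 # pace commands at the waypoint time step
    rtde_c.servoStop()
```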
• an exemplary method to adjust waypoint locations along the nominal path based on scanner data is visualized in FIG.10B.
• the nominal path 1010 defined in the robot base coordinate system and the scan profile 1020 defined in the scanner coordinate system are both transformed to the TCP coordinate system.
  • the intersection of the nominal path 1010 with the laser projection plane 1030 is computed and becomes the adjusted waypoint.
• a sequence of the adjusted waypoints will then form the actual toolpath 1040 that is conformal to the part surface (plus a dispenser nozzle gap offset value that could be added in the final motion command); a sketch of this adjustment follows.
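A minimal sketch of this waypoint adjustment, with all quantities expressed in the TCP coordinate system and a hypothetical helper surface_z_at() giving the scanned surface height:

```python
import numpy as np

# Find where the nominal-path polyline crosses the laser projection plane
# (point p0 on the plane, unit normal n), then conform the waypoint height to
# the scanned surface plus a nozzle gap offset. Values/names are illustrative.

def adjust_waypoint(path_pts, p0, n, surface_z_at, gap=2.0):
    d = (path_pts - p0) @ n                        # signed distance of each waypoint to the plane
    for i in range(len(d) - 1):
        if d[i] * d[i + 1] <= 0:                   # this segment crosses the plane
            s = d[i] / (d[i] - d[i + 1])           # interpolation factor along the segment
            wp = path_pts[i] + s * (path_pts[i + 1] - path_pts[i])
            wp[2] = surface_z_at(wp[0], wp[1]) + gap   # conform to surface + nozzle gap
            return wp
    return None                                     # no intersection within the buffer
```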
• Referring to FIGS.11A-C, the XYZ components, respectively, of the robot TCP trajectory are plotted, including both the nominal 1110 and actual 1120 paths. It is noted that the nominal and actual paths overlap so extensively in FIG.11A that they are not readily distinguishable from each other.
• FIG.11D shows a 3D plot of the robot TCP trajectory. The robot followed the nominal path 1110 in the XY plane, as plotted in FIGS.11A-B.
  • the TCP coordinate system is defined such that the Y-axis is aligned with the nozzle axis, and the X-axis is aligned with the marching direction of the nozzle (e.g., the same direction as the lookahead direction of the laser scanner).
  • Axis-angle expression is used to represent 3D orientation adjustment: rotation of an angle around the nozzle axis.
• online orientation adjustment can be divided into two steps. First, referring to FIG.12A, the direction of the Y axis (e.g., the axial direction of the nozzle) is specified based on the surface normal of the part. Second, referring to FIG.12B, the rotation angle around the Y axis (e.g., rotation in the XZ plane) is specified.
Orientation adjustment based on surface normal
• the surface normal of a part at a specific dispensing location may be estimated based on multiple slices of laser scans acquired ahead of and behind the dispensing location along the toolpath.
  • the surface profiles from multiple laser scans form a point cloud that approximates the part surface geometry around the nozzle location.
• a polynomial fit (e.g., a plane fit or a paraboloid fit) can then be applied to the point cloud to estimate the local surface geometry.
• a plane was fit to the point cloud as an example, which can provide a good estimation of the local surface shape when the sampled region is small.
• Referring to FIG.13, an exemplary schematic of an estimated surface normal of a curved part surface is shown, as determined according to the above process.
  • the locally fitted plane 1300 is depicted including three slices of the scan profile 1320, three trajectory waypoints 1330, and the axial direction of the nozzle 1340. Based on the computed surface normal at each waypoint, the nozzle axis (Y-axis) can be set to be always aligned with the estimated normal direction of the part surface.
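A minimal sketch of the plane fit used for surface normal estimation (least squares via SVD):

```python
import numpy as np

# Fit a plane to the point cloud formed by several laser scan slices around the
# nozzle location; the normal is the direction of least variance.

def surface_normal(points: np.ndarray) -> np.ndarray:
    """points: (n, 3) laser points around the nozzle location, in TCP coordinates."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]                      # singular vector of smallest singular value
    if normal[2] < 0:                    # orient consistently (e.g., toward +Z)
        normal = -normal
    return normal / np.linalg.norm(normal)
```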
• Non-perpendicularity can also be imposed on the orientation adjustment, for instance, by first setting the nozzle axis to the surface normal direction, and then rotating the nozzle to maintain an arbitrary angle between the nozzle Y-axis and the estimated normal direction of the workpiece surface (e.g., rotating the nozzle around the marching/lookahead direction (X-axis) by a specified angle, as depicted in the schematic illustration of FIG. 14A).
• non-perpendicularity may be used for controlling the width of a dispensed bead 1460 via variable jet printing angles of a nozzle 1440, for example as depicted in the schematic illustration of FIG.14C.
• similar orientation control may be applied with other workpiece-modifying equipment (e.g., a sander, a polisher, a cutter, a drill, a sprayer, a welder, etc.).
  • Surface normal estimation based on the point cloud could possibly be subject to errors caused by sensory noise from the online sensor (e.g., laser profilometer).
  • small features such as small step changes on the part surface could lead to non-smooth transition of surface normal along the robot toolpath.
  • the robot may exhibit jerky moves when following the surface normal, resulting in reduced modification (e.g., dispensing) quality.
  • Two approaches are proposed to address these issues.
• thresholding may be applied to the change of angle between two subsequent surface normal vectors along the toolpath (e.g., at t_{n-1} and t_n). If this angle is larger than a predetermined value, the surface normal estimated for time stamp t_n will not be updated; the surface normal at t_{n-1} will be assigned to t_n instead.
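A minimal sketch of this thresholding rule; the 10° limit is an illustrative assumption:

```python
import numpy as np

# If the angle between two subsequent surface normals exceeds a limit, keep
# the previous normal (assign the normal at t_{n-1} to t_n).

def threshold_normal(n_prev: np.ndarray, n_new: np.ndarray, max_deg: float = 10.0) -> np.ndarray:
    cos_a = np.clip(np.dot(n_prev, n_new), -1.0, 1.0)
    if np.degrees(np.arccos(cos_a)) > max_deg:
        return n_prev          # reject the jump
    return n_new
```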
  • outliers in the point cloud may be identified and rejected automatically before applying plane fitting for surface normal estimation.
• Random sample consensus (RANSAC) is one suitable example method to determine an outlier-free set within the point cloud in an iterative way (see, e.g., Fischler, Martin A., and Robert C. Bolles, “Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography,” Communications of the ACM 24.6 (1981): 381-395).
• points with distance errors smaller than a predetermined threshold are classified as part of the consensus set (inliers).
• because RANSAC is based on a stochastic process, it can achieve (e.g., near) real-time performance with a small number of iterations.
• the probability $p$ of finding at least one outlier-free set of points can be computed based on the following equation: $p = 1 - \left(1 - (1 - \epsilon)^m\right)^N$, where $\epsilon$ denotes the probability that a point is an outlier, $m$ denotes the minimum number of points needed to fit a model (forming a set of hypothetical inliers), and $N$ denotes the number of total iterations.
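Rearranging the equation gives the number of iterations N needed to reach a desired success probability; a small sketch:

```python
import math

# Solve p = 1 - (1 - (1 - eps)^m)^N for N, with outlier ratio eps and model
# size m (m = 3 points suffice for a plane).

def ransac_iterations(p: float = 0.99, eps: float = 0.2, m: int = 3) -> int:
    return math.ceil(math.log(1.0 - p) / math.log(1.0 - (1.0 - eps) ** m))

# Example: p=0.99, eps=0.2, m=3 -> 7 iterations
```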
  • the time parametrization of the nominal path was set based on a tool speed of 20 mm/s.
  • Robot orientation along the nominal path was set to be constant, with the nozzle axis aligned with the Z-axis of the robot coordinate system, and the lookahead direction at each waypoint was set to be tangent to the toolpath.
• Parameter settings of the surface normal estimation algorithm were as follows: subsampling of laser points in each scan slice was applied to reduce the size of the final point cloud for surface normal estimation. Specifically, one of every four points was sampled around the adjusted waypoint from the Z-tracking algorithm. Referring to FIG.15, a graph is provided of computed surface normal vectors plotted along the robot toolpath.
• Let $L$ denote the distance from the laser projection plane to the TCP, and let $w$ denote half of the width of the projected laser line.
• the critical turning radius $r_c$ is defined as: $r_c = \frac{L^2 + w^2}{2w}$
• when the turning radius of the toolpath is smaller than $r_c$, the toolpath does not fall into the scanning region of the laser line and thus is not observable to the laser sensor. Consequently, online adjustment of waypoints along the toolpath (e.g., for Z-tracking) based on laser sensor feedback is not possible. Only when the turning radius is larger than or equal to $r_c$ is the toolpath observable by the laser for potential closed-loop adjustment.
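A short derivation of the critical turning radius from circle geometry, under the assumption that the laser line sits at lookahead distance $L$ ahead of the TCP with half-width $w$:

```latex
% Turning circle of radius r_c, centered a lateral distance r_c from the TCP.
% The toolpath is just barely observable when the circle passes through the
% endpoint of the laser line, i.e., the point at forward distance L and
% lateral offset w:
\[
L^2 + (r_c - w)^2 = r_c^2
\;\Longrightarrow\;
L^2 + w^2 - 2 r_c w = 0
\;\Longrightarrow\;
r_c = \frac{L^2 + w^2}{2w}.
\]
```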
  • the rotation angle around the nozzle axis may be set such that the online sensor is always pointing at the newly updated waypoint from the previous time step, for instance as depicted in the schematic top view illustrations of in-plane rotation around the nozzle axis along the toolpath in FIGS. 17A-B.
  • the waypoint to be updated 1710 within the lookahead horizon 1720 is not visible to the laser sensor when the robot follows the tangent direction of the toolpath.
  • the next waypoint to follow 1730 and the front laser line 1740 are also indicated in FIGS.17A-B.
  • FIG.17B further indicates the most recently updated waypoint 1750.
• with in-plane rotation, the robot orientation is adjusted such that the waypoint to be updated always falls inside the field of view of the laser sensor. Specifically, when the newly added slice of surface profile at the front laser line is added, the adjusted waypoint for the TCP within that scan slice becomes the most recently updated waypoint w_{n-1}. After the adjusted nozzle axis is computed (e.g., based on surface normal estimation) for the next waypoint w_n, the new lookahead direction is computed via the so-called in-plane rotation (a rotation within a plane perpendicular to the nozzle axis so that the lookahead direction points from w_{n-1} to w_n).
  • rib features having a small elevation from the part surface are presented as markers that indicate locations for adhesive bead deposition or ultrasonic welding.
  • the ability to track these rib features can enable precise and repeatable deposition of adhesives on desired locations for optimal bonding performance.
  • One suitable method to adjust waypoint locations along the nominal path based on rib locations is depicted in FIG.18.
• the nominal path 1810 defined in the robot base coordinate system and the scan profile 1820 defined in the scanner coordinate system are both transformed to the TCP coordinate system.
• the intersection of the nominal path 1810 with the laser projection plane 1830 is computed by finding the first waypoint on the nominal toolpath that is behind the laser projection plane, denoted by w_n.
• rib peaks are detected, and the midpoint of the peaks becomes the position of the adjusted waypoint, denoted by w′_n. If no rib feature is present, the strategy for Z-tracking can be executed. The time stamp from w_n is then assigned to w′_n.
• a sequence of the adjusted waypoints w′_n forms the actual toolpath 1840 that is conformal to the part surface and also follows the midpoint of the parallel ribs. If two rib peaks are detected from the profile, the midpoint on the surface profile between the two peaks can be computed, which will be the location for the adjusted waypoint; a sketch of this peak-and-midpoint step follows. This detection method was tested on a curvilinear surface with rib features.
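A hedged sketch of the rib peak detection and midpoint computation on a single scan slice, assuming scipy's peak finder; the prominence threshold is illustrative:

```python
import numpy as np
from scipy.signal import find_peaks

# Detect the two rib peaks in one height profile and return the midpoint
# between them; if fewer than two peaks are found, fall back to Z-tracking.

def rib_midpoint(x: np.ndarray, z: np.ndarray, min_height_mm: float = 0.5):
    """x, z: lateral position and height of one laser scan slice (x assumed sorted)."""
    peaks, _ = find_peaks(z, prominence=min_height_mm)  # ribs: small elevations above surface
    if len(peaks) < 2:
        return None                      # no rib pair: execute Z-tracking strategy instead
    i, j = peaks[0], peaks[-1]           # outermost detected rib peaks
    x_mid = 0.5 * (x[i] + x[j])
    z_mid = np.interp(x_mid, x, z)       # surface height at the midpoint
    return x_mid, z_mid
```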
  • the rib feature tracking function combined with the orientation adjustment method described above was tested on a fixture with parallel ribs on a curvilinear surface.
  • the 3D nominal path was defined as a piecewise linear trajectory passing through the space between the parallel ribs but not precisely following the midline.
  • the time parametrization of the nominal path was set based on a tool speed of 20 mm/s.
  • Robot orientation along the nominal path was set to be constant, with the nozzle axis aligned with the Z-axis of the robot coordinate system.
  • FIG.19 shows a photograph of the test fixture 1910 with curvilinear surface 1920 and parallel ribs 1930, with the directions of XYZ axes of the robot base coordinate system labeled.
  • the TCP poses were recorded while the robot traversed through the nominal toolpath with rib feature tracking, as plotted in FIGS.20A-E, including both the nominal 2010 and actual 2020 paths in each of FIGS.20B-D. It is noted that the nominal and actual paths overlap so extensively in FIG.20A that they are not readily distinguishable from each other.
  • the toolpath was adjusted in each of the X, Y and Z directions, so that the resulting trajectory precisely followed the midline of the parallel ribs.
Tracking edge features
• another example of surface feature tracking is edge detection. The goal is to detect the edge of a part and to track the waypoints with desired offsets from the detected edge.
  • one potential application in the automotive industry is to apply a structural adhesive along the edge of a metallic sheet for subsequent bonding with another panel to form a door panel assembly.
  • the metallic sheet may have a variable edge location due to manufacturing tolerance.
• with a fixed nominal path, the dispensed adhesive bead cannot precisely follow the edge line on multiple panels with variable shape, thus resulting in inconsistent bonding quality.
• such a method to adjust waypoint locations along the nominal path based on edge location is depicted in FIG.21.
• the nominal path 2110 defined in the robot base coordinate system and the scan profile 2120 defined in the scanner coordinate system are both transformed to the TCP coordinate system.
• the intersection of the nominal path with the laser projection plane 2130 is computed by finding the first waypoint on the nominal toolpath 2110 that is behind the laser projection plane, denoted by w_n. This will be the waypoint to be updated based on the scan profile.
• the edge point is detected in the scan profile, and the adjusted waypoint w′_n is computed such that it maintains a predetermined offset from the edge. The time stamp from w_n is then assigned to w′_n.
• a sequence of the adjusted waypoints w′_n forms the actual toolpath 2140 that is conformal to the part surface and also follows the contour of a curved edge.
  • the edge feature tracking function combined with the orientation adjustment method described above was tested on a fixture with a 3D curved edge.
  • the 3D nominal path was defined as a piecewise linear trajectory on one side of the curved edge.
  • the goal was to adjust the nominal toolpath to maintain a constant offset of 10 mm from the actual edge.
  • the time parametrization of the nominal path was set based on a tool speed of 20 mm/s.
  • Robot orientation along the nominal path was set to be constant, with the nozzle axis aligned with the Z-axis of the robot coordinate system, as depicted in FIG.22, which is a photograph of the test fixture 2210 having a curved edge 2220 and with the directions of the XYZ axes of the robot base coordination system labeled.
  • the TCP poses were recorded while the robot traversed through the toolpath with waypoint adjustment based on the inline scan data.
• the robot followed the actual edge contour in the X, Y, and Z directions as shown in FIGS.23A-C, including both the nominal 2310 and actual 2320 paths. It is noted that the nominal and actual paths overlap so extensively in FIG.23A that they are not readily distinguishable from each other. Because the general trend of the edge line follows the X direction, the major adjustment induced by the inline feature tracking was most obvious in the Y and Z directions. As a result, the adjusted toolpath based on edge tracking deviated significantly from the nominal path in order to maintain the 10 mm offset from the curved edge, as plotted in FIG.23D.
  • FIG.25 illustrates a workpiece modification system architecture. Architecture 2500 illustrates one embodiment of an implementation of a workpiece modification system 2510.
  • architecture 2500 can provide computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers the services.
  • remote servers can deliver the services over a wide area network, such as the internet, using appropriate protocols.
  • remote servers can deliver applications over a wide area network and they can be accessed through a web browser or any other computing component.
  • Software or components shown or described in FIGS.1-24 as well as the corresponding data, can be stored on servers at a remote location.
  • the computing resources in a remote server environment can be consolidated at a remote data center location or they can be dispersed.
  • Remote server infrastructures can deliver services through shared data centers, even though they appear as a single point of access for the user.
  • FIG. 25 specifically shows that a controller 2510 can be located at a remote server location 2502. Therefore, a computing device 2520 accesses the controller 2510 through the remote server location 2502. A user 2550 can use the computing device 2520 to access user interfaces 2522 as well.
  • a user 2550 may be a user wanting to check on the progress of modification of a workpiece while sitting in a parking lot, and interacting with an application on the user interface 2522 of their smartphone 2520, or laptop 2520, or other computing device 2520, e.g., an augmented reality (AR) device such as AR glasses.
  • FIG.25 shows that it is also contemplated that some elements of systems described herein are disposed at a remote server location 2502 while others are not.
  • each of a data store 2530, an inspection system 2560, and the modifier 2570 can be disposed at a location separate from the location 2502 and accessed through the remote server at location 2502.
  • the data store 2530 can be accessed directly by a computing device 2520, through a network (either a wide area network or a local area network), hosted at a remote site by a service, provided as a service, or accessed by a connection service that resides in a remote location.
  • the data can be stored in substantially any location and intermittently accessed by, or forwarded to, interested parties.
  • physical carriers can be used instead of, or in addition to, electromagnetic wave carriers. This may allow a user 2550 to interact with the controller 2510 through their computing device 2520. It will also be noted that the elements of systems described herein, or portions of them, can be disposed on a wide variety of different devices.
  • FIGS.26-28 illustrate example devices that can be used in the embodiments shown in previous Figures.
  • FIG.26 illustrates an example mobile device that can be used in the embodiments shown in previous Figures.
  • FIG.26 is a simplified block diagram of one illustrative example of a handheld or mobile computing device that can be used as either a worker’s device or a supervisor / safety officer device, for example, in which the present system (or parts of it) can be deployed.
  • FIG.26 provides a general block diagram of the components of a mobile cellular device 2616 that can run some components shown and described herein.
• in some embodiments, the mobile cellular device 2616 runs some of those components and interacts with others.
• a communications link 2613 is provided that allows the handheld device to communicate with other computing devices and, under some embodiments, provides a channel for receiving information automatically, such as by scanning. Examples of communications link 2613 include one or more communication protocols, such as wireless services used to provide cellular access to a network, as well as protocols that provide local wireless connections to networks.
  • applications can be received on a removable Secure Digital (SD) card that is connected to an interface 2615.
• the interface 2615 and communication links 2613 communicate with a processor 2617 (which can also embody the processors from previous figures) along a bus 2619 that is also connected to a memory 2621 and input/output (I/O) components 2623, as well as a clock 2625 and a location system 2627.
• I/O components 2623 are provided to facilitate input and output operations, and the device 2616 can include input components such as buttons, touch sensors, optical sensors, microphones, touch screens, proximity sensors, accelerometers, and orientation sensors, and output components such as a display device, a speaker, and/or a printer port.
  • Other I/O components 2623 can be used as well.
• the clock 2625 illustratively comprises a real time clock component that outputs a time and date. It can also provide timing functions for the processor 2617.
  • the location system 2627 includes a component that outputs a current geographical location of the device 2616. This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.
  • a memory 2621 stores operating system 2629, network settings 2631, applications 2633, application configuration settings 2635, data store 2637, communication drivers 2639, and communication configuration settings 2641.
• the memory 2621 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below). Memory 2621 stores computer readable instructions that, when executed by the processor 2617, cause the processor to perform computer-implemented steps or functions according to the instructions. Processor 2617 can be activated by other components to facilitate their functionality as well. It is expressly contemplated that, while a physical memory store 2621 is illustrated as part of a device, cloud computing options, where some data and/or processing is done using a remote service, are available.
  • processor 2617 may include one or more processors, including one or more microprocessors, CPUs, GPUs, DSPs, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), processing circuitry (e.g., fixed function circuitry, programmable circuitry, or any combination of fixed function circuitry and programmable circuitry), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components.
  • FIG.27 shows that the device can also be a smart phone 2771.
  • the smart phone 2771 has a touch sensitive display 2773 that displays icons or tiles or other user input mechanisms 2775. Mechanisms 2775 can be used by a user to run applications, make calls, perform data transfer operations, etc.
  • the smart phone 2771 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone. Note that other forms of the devices are possible.
• While FIG.27 illustrates an embodiment where a device 2700 is a smart phone 2771, it is expressly contemplated that a display may be presented on another computing device.
  • FIG.28 is one example of a computing environment in which elements of systems and methods described herein, or parts of them (for example), can be deployed.
  • an example system for implementing some embodiments includes a general-purpose computing device in the form of a computer 2810.
  • Components of the computer 2810 may include, but are not limited to, a processing unit 2820 (which can comprise a processor), a system memory 2830, and a system bus 2821 that couples various system components including the system memory to the processing unit 2820.
  • the system bus 2821 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • Computer readable media can be any available media that can be accessed by the computer 2810 and includes both volatile/nonvolatile media and removable/non-removable media.
  • Computer readable media may comprise computer storage media and communication media.
  • Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile/nonvolatile and removable/non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 2810.
  • Communication media may embody computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media.
• the term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • the system memory 2830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 2831 and random-access memory (RAM) 2832.
• a basic input/output system 2833 (BIOS), containing the basic routines that help to transfer information between elements within the computer 2810, is typically stored in ROM 2831.
  • RAM 2832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 2820.
  • FIG.28 illustrates an operating system 2834, application programs 2835, other program modules 2836, and program data 2837.
  • the computer 2810 may also include other removable/non-removable and volatile/nonvolatile computer storage media.
• FIG.28 illustrates a hard disk drive 2841 that reads from or writes to non-removable, nonvolatile magnetic media, a nonvolatile magnetic disk 2852, an optical disk drive 2855, and a nonvolatile optical disk 2856.
• the hard disk drive 2841 is typically connected to the system bus 2821 through a non-removable memory interface such as interface 2840, and the optical disk drive 2855 is typically connected to the system bus 2821 by a removable memory interface, such as interface 2850.
  • the functionality described herein can be performed, at least in part, by one or more hardware logic components.
• illustrative types of hardware logic components include field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), system-on-a-chip systems (SOCs), complex programmable logic devices (CPLDs), etc.
  • FIG.28 for example, a hard disk drive 2841 is illustrated as storing operating system 2844, application programs 2845, other program modules 2846, and program data 2847.
  • a user may enter commands and information into the computer 2810 through input devices such as a keyboard 2862, a microphone 2863, and a pointing device 2861, such as a mouse, trackball or touch pad.
  • Other input devices may include a joystick, game pad, satellite receiver, scanner, or the like.
  • a visual display 2891 or other type of display device is also connected to the system bus 2821 via an interface, such as a video interface 2890.
  • computers may also include other peripheral output devices such as speakers 2897 and printer 2896, which may be connected through an output peripheral interface 2895.
• the computer 2810 is operated in a networked environment using logical connections, such as a local area network (LAN) or wide area network (WAN), to one or more remote computers, such as a remote computer 2880.
  • the computer 2810 is connected to the LAN 2871 through a network interface or adapter 2870.
  • the computer 2810 When used in a WAN networking environment, the computer 2810 typically includes a modem 2872 or other means for establishing communications over the WAN 2873, such as the Internet.
  • program modules may be stored in a remote memory storage device.
  • FIG.28 illustrates, for example, that remote application programs 2885 can reside on a remote computer 2880.
Velocity-based Closed-loop Control of Bead Shape
• Aspects of this disclosure are directed to velocity-based closed-loop control of bead shape. Automation capabilities widely exist for application of liquid adhesives and hot-bonded thermoplastics. Adhesives development has yielded dispensable pressure sensitive adhesives (PSA) and structural adhesives. The robotically dispensed adhesive (RDA) platforms described above better address user needs for complex bonding applications by providing automated dispensing solutions. Material flowrate inconsistencies have been observed during this process, due to factors such as batch-to-batch variation of adhesive material and back pressure at the beginning of the dispensing process.
  • systems of this disclosure address and mitigate several potential problems.
  • systems of this disclosure may address the need for online bead sensing for cycle-to-cycle dispensing process control.
• in cycle-to-cycle dispensing control, bead shape data collected from past process cycles are used to generate control commands for the current cycle. This helps to compensate for system and environmental uncertainties/disturbances that are consistent or change slowly over multiple process cycles, such as batch-to-batch variation of adhesive material and fluctuation of humidity/temperature on the production line; a minimal sketch of such an update follows.
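A minimal sketch of such a cycle-to-cycle update, assuming a scalar dispensing command (e.g., pump pressure) corrected by the previous cycle's bead width error; the learning gain is illustrative:

```python
# u_{k+1} = u_k + L * e_k: correct the next cycle's command from the previous
# cycle's measured bead error. Gain and variable names are assumptions.

L_GAIN = 0.1

def next_cycle_command(u_prev: float, width_meas: float, width_target: float) -> float:
    """Return the dispensing command for the next cycle."""
    return u_prev + L_GAIN * (width_target - width_meas)
```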
  • the bead shape data for cycle-to-cycle process control can be acquired with offline or online sensing systems.
  • Offline sensing either requires two separate runs in a cycle for dispensing and scanning respectively (dispense-then-scan process) using a single robot arm, or requires two robot arms working in parallel for dispensing and bead sensing.
  • online bead sensing with coordinated robot motion only requires a single run per cycle using a single robot arm (dispense-while-scan), with the potential benefit of reduced cycle time and system complexity compared to offline sensing.
  • Another example of potential problems that the systems of this disclosure address is the need for online bead sensing and a velocity-based controller operable to provide in-cycle process control to improve bead quality and yield rate.
  • the systems of this disclosure may collect bead shape data in real-time fashion to generate control commands for the current cycle.
  • in-cycle control provides a faster response to the detected bead defect.
• in-cycle control compensates not only for slow-varying disturbances from the system and environment, but also for incidental disturbances that result in cycle-to-cycle variation in material flowrate.
  • the in-cycle controller of this disclosure may further improve bead quality on each sample, and may reduce the number of defective samples, thereby resulting in higher yield rate for end users.
• the online bead sensing system for in-cycle control may capture bead parameters with low latency and high temporal resolution, coupled with an advanced detection algorithm to robustly extract bead parameters such as bead shape and location.
  • one control strategy is to adjust material flowrate via changing dispensing variables such as extruder motor speed (e.g., for screw-driven dispensers), pump pressure (e.g., for pressure-driven dispensers), or temperature (e.g., for hot-melt adhesives).
  • the velocity-based bead shape controller of this disclosure may provide more rapid compensation for bead shape error without the need of direct control of material flowrate.
  • robot tool velocity can be adjusted more rapidly in response to over- or under-extrusion detected by the bead sensing system.
  • controllers of this disclosure are also designed and configured to handle sensor noise and latency that are inevitable from the online sensing system, in order to deliver smooth and stable robot motion with variable tool velocity.
  • Systems and techniques of this disclosure are directed to material dispensing with online bead sensing and closed-loop feedback control of tool velocity to improve precision of the dispensing process.
  • FIG.29 illustrates an example system of this disclosure.
• An example of this disclosure incorporates one or more of:
1. a system with a dispensing robot mounted with a bead sensor (examples of which are described below);
2. estimation processes for bead/seam parameters based on laser profilometer data (examples of which are described below):
a. bead/seam location estimation techniques based on rising/falling edge detection (examples of which are described below);
b. techniques for bead geometric parameter estimation (e.g., width, thickness, section area) based on bead profile segmentation (examples of which are described below);
3. techniques for robot tool velocity control to compensate for detected bead shape error (examples of which are described below):
a. aspects of this disclosure use feedback data from a bead sensor to adjust the dispense velocity in order to compensate for bead shape error.
• various techniques of this disclosure employ tool velocity as the independent variable to correct detected bead shape error, using a robust velocity controller that takes into account online bead parameter estimation, noise handling, and steady-state checking; a sketch of such a controller follows.
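A hedged sketch of such a velocity controller, assuming bead width as the monitored parameter; the gain, filter window, steady-state tolerance, and speed limits are illustrative assumptions, not values from this document:

```python
from collections import deque
import numpy as np

# Tool velocity as the independent variable: speed up on over-extrusion (bead
# too wide), slow down on under-extrusion. A moving average handles sensor
# noise, and updates are applied only when the estimate is steady.

class VelocityController:
    def __init__(self, target_width, v_nominal, kp=2.0, window=15, v_lim=(5.0, 50.0)):
        self.target = target_width
        self.v = v_nominal                      # mm/s
        self.kp = kp                            # mm/s of speed per mm of width error
        self.buf = deque(maxlen=window)         # noise handling: moving-average filter
        self.v_min, self.v_max = v_lim

    def update(self, width_sample: float) -> float:
        self.buf.append(width_sample)
        if len(self.buf) < self.buf.maxlen:
            return self.v                       # not enough samples yet
        arr = np.asarray(self.buf)
        if arr.std() > 0.2:                     # steady-state check (mm): skip transients
            return self.v
        self.v += self.kp * (arr.mean() - self.target)  # wide -> faster, narrow -> slower
        self.v = float(np.clip(self.v, self.v_min, self.v_max))
        return self.v
```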
  • Robotic dispensing technology such as those related to advanced robotics and/or RDAA platforms are described in International Patent Applications with Publication Numbers WO2020/174394, WO2020/174397, WO2021/074744, and WO2021/124081, the entire disclosure of each of which is incorporated herein by reference.
  • the velocity-based bead control aspects of this disclosure represent improvements in the areas of closed-loop robotic dispensing apparatuses and techniques for compensation of part shape and bead shape variation.
  • FIG.30 illustrates a non-limiting example of a closed-loop robotic dispensing testbed of this disclosure.
  • the system of FIG.30 includes a six-degree-of-freedom (6-DoF) robot arm (such as a model UR 10 available from Universal Robots), a hot-melt adhesive dispenser mounted on the robot arm, a first laser profilometer (part scanner) mounted on one side of the dispenser for part surface scanning, and a second laser profilometer (bead scanner) mounted on the other side of the dispenser for bead shape sensing.
  • Both scanners are laser profilometers in the particular example of FIG.30, while it will be appreciated that other types of scanners are compatible with the systems of this disclosure.
• FIG.31 is a block diagram illustrating an example closed-loop robotic dispensing system according to aspects of this disclosure.
  • the “bead scanner” and “velocity adjustment” blocks of FIG.31 pertain to velocity-based bead shape control aspects of this disclosure.
• the controller program of FIG.31 was implemented in Python 3 and can run on a personal computer (PC) equipped with either a Linux®-based or Windows®-based operating system (OS).
  • the PC-based controller was connected to a UR CB3 controller via a TCP/IP connection using an Ethernet® cable.
• various actions of streaming robot motion commands and reading the current robot pose were communicated via a Real-Time Data Exchange (RTDE) interface at a frequency of 125 Hz (maximum).
  • the PC-based controller was also connected to the part scanner and bead scanner via UDP using Ethernet® cables.
  • each target waypoint in Cartesian space is transformed to joint positions via inverse kinematic model of the robot, which is then fed to a proportional feedback controller for motion control.
• the actual joint positions from the encoders are fed back to the RTDE interface after being transformed to a tool center point (TCP) pose via a forward kinematic model of the robot.
  • a nominal path defining projected adhesive bead locations on the workpiece/part is first generated using an offline path planner, which includes part registration methods and a CAD-to-path process (where “CAD” stands for computer-aided design).
  • the user-defined nominal path is then discretized into sequences of waypoints for robot execution.
  • the current robot TCP is first read from robot controller.
  • the most recent laser profiles from the scan data buffer of both scanners are transformed to the robot TCP coordinate system, which are used to determine adjustment for waypoint pose and velocity.
  • FIG.32 is a schematic showing an example format of a toolpath for the closed-loop robotic dispensing techniques of this disclosure.
• each waypoint w_i along the robot toolpath is defined as a tuple consisting of a time step duration Δt_i and a TCP pose p_i.
• tool velocity is not defined explicitly in this waypoint expression (direct velocity control is not provided by the UR CB3 controller), but can be controlled implicitly by varying the value of the time step duration Δt_i, as in the sketch below.
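A minimal sketch of this implicit velocity control: for a desired tool speed v, each Δt_i is the inter-waypoint distance divided by v, so changing tool speed only requires rescaling Δt_i:

```python
import numpy as np

# Compute per-segment time step durations dt_i that realize a desired tool
# speed along a sequence of waypoints.

def time_steps(waypoints: np.ndarray, v: float) -> np.ndarray:
    """waypoints: (n, 3) TCP positions in mm; v: desired tool speed in mm/s."""
    seg = np.linalg.norm(np.diff(waypoints, axis=0), axis=1)  # segment lengths (mm)
    return seg / v                                            # dt_i per segment (s)
```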
• the motion command for each waypoint is sent to the robot via the RTDE command servoJ(q, Δt, lookahead, gain).
  • q is a vector of the joint positions, which can be computed from the Cartesian expression p i using an inverse kinematic model.
• Δt represents the blocking time to move to the next pose, which is the same as Δt_i.
  • Lookahead is a variable used to set the lookahead time in order to smoothen the trajectory (ranging from 0.03s to 0.2s).
  • Gain represents the proportional gain to track target position (ranging from 100 to 2000).
  • A 2D surface profile captured by the bead scanner can be expressed as points in an XZ plane.
  • the Z-axis defines the direction of laser projection.
  • the bead scanner used for a particular experiment of this disclosure has a measuring range of [190, 290] (mm) along the Z-axis and [-72, 72] (mm) along the X-axis.
  • the bead sensing algorithm takes a 2D profile consisting of 640 points as the raw data inputs, and extracts useful bead parameters via bead location detection (described below in greater detail) and bead geometry estimation (described below in greater detail). These parameters are saved and used as sensory inputs for the velocity-based bead shape controller that is described below in greater detail.
  • FIGS.33 illustrate aspects of bead location detection according to aspects of this disclosure.
  • FIG.33A shows schematics for detection of the rising and falling edges (indicated by arrows) within a bead or seam profile F(x) via detection of zero-crossing locations in the second-order derivative F''(x) and thresholding in the first derivative F'(x).
  • FIG.33B shows an example graphical user interface (GUI) for configuration of search window for bead location detection (shown in a box).
  • the profile of a bead on the substrate can be characterized as a rising edge followed by a falling edge along a positive X-direction.
  • the shape of a seam is the opposite of the bead, with a falling edge followed by a rising edge along the positive X-direction (FIG.33A).
  • the systems of this disclosure may receive a user input defining a refined search window in the profile where the bead is most likely to appear.
  • the systems of this disclosure may implement a GUI that enables the user to provide an input defining the center of the search window with the reference index of the selected center point in the profile (e.g., close to a nozzle location).
  • the search range can then be defined by distance along the X-axis and Z-axis in millimeters (as shown in FIG.33B).
  • a refined search window may mitigate or potentially even eliminate ambiguity in bead detection when multiple rising and falling edges are present on the part surface in the neighborhood of the bead.
  • first order and second order Gaussian filters are applied to the surface profile within the user-defined search window, producing new profiles of the first derivative (gradient) and second derivative (curvature) of the original profile shown in FIG.33A.
  • the level of smoothing to remove noise is controlled by the standard deviation parameter of the Gaussian filter, which can be received via the GUI in the form of user input (e.g., as shown in FIG.33B).
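  • The edge-detection step above can be sketched in a few lines of Python; this is a minimal illustration (assuming a uniformly sampled profile already cropped to the search window, with hand-tuned sigma and gradient threshold), not the exact production algorithm.

      import numpy as np
      from scipy.ndimage import gaussian_filter1d

      def detect_bead_edges(profile, sigma=3.0, grad_thresh=0.05):
          """Find rising/falling edges in a 1D profile F(x) via zero
          crossings of F''(x), gated by a threshold on F'(x)."""
          d1 = gaussian_filter1d(profile, sigma, order=1)  # gradient
          d2 = gaussian_filter1d(profile, sigma, order=2)  # curvature
          zero_cross = np.where(np.diff(np.sign(d2)) != 0)[0]
          rising = [i for i in zero_cross if d1[i] > grad_thresh]
          falling = [i for i in zero_cross if d1[i] < -grad_thresh]
          return rising, falling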
  • FIGS. 34 illustrate bead geometry estimation on two scan profile samples.
  • FIG.34A is a scan profile with a rounded bead on a tilted and curved substrate.
  • FIG.34B illustrates a scan profile of a flat bead on a flat substrate with discontinuous and erroneous point data at bead boundaries (e.g., points below the substrate plane).
  • In FIGS.34, a user-defined search window is indicated by a box with a dashed-line boundary, the center of the search window by a circle with a solid-line boundary, the bead location by a cross, the rising and falling edges by adjacent circles, and the fitted substrate plane by a labeled line.
  • the goal of the bead geometry estimation process is to extract bead shape characteristics that define the quality of the deposited bead, such as width, maximum and/or mean thickness, and section area.
  • the rising and falling edges from the previous bead location detection step are only rough estimates of the bead boundaries, for the following reasons.
  • First, the detected bead boundaries are zero-curvature locations, which could be off from the true bead boundaries by millimeters (as shown in FIG.34B).
  • Second, portions of the scan profile that are close to bead boundaries sometimes suffer from sparse and noisy data points caused by height discontinuity and reflection between the bead and the substrate surface (as shown in FIG.34B). As such, these edge locations cannot be used to estimate bead shape characteristics such as width with reliably high precision and consistency.
  • the bead geometry estimation techniques of this disclosure provide a technical solution to the technical problems set forth above by refining the location of bead boundaries by segmenting the substrate profile from the bead profile.
  • the bead geometry estimation techniques of this disclosure can fit a straight line to the “flat” portion of the substrate profile using random sample consensus (RANSAC) to remove noise and outliers, followed by linear regression to find an optimal line fit. All of the points above this fitted substrate line profile can be treated as part of the bead, and can be used to compute bead width, thickness, and/or section area.
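  • A minimal sketch of this substrate segmentation and bead geometry estimation follows, assuming the profile points (x, z) are sorted by x and restricted to the search window; the noise-floor threshold is an illustrative assumption, not a value from this disclosure.

      import numpy as np
      from sklearn.linear_model import LinearRegression, RANSACRegressor

      def bead_geometry(x, z, min_height=0.2):
          """Fit the substrate line with RANSAC, refine with linear
          regression on the inliers, then treat points above the line
          as bead material."""
          X = x.reshape(-1, 1)
          ransac = RANSACRegressor().fit(X, z)    # reject bead/outliers
          inliers = ransac.inlier_mask_
          line = LinearRegression().fit(X[inliers], z[inliers])
          height = z - line.predict(X)            # height above substrate
          bead = height > min_height              # assumed noise floor (mm)
          if not bead.any():
              return None
          return {
              "width": x[bead].max() - x[bead].min(),
              "max_thickness": height[bead].max(),
              "mean_thickness": height[bead].mean(),
              "section_area": np.trapz(height[bead], x[bead]),
          }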
  • the current recommendation is to stream waypoint commands at a constant rate of 125 Hz, i.e., with Δt held at a constant value (e.g., 8ms).
  • in the path streaming method for online part shape compensation (e.g., in the example of FIG.31), the toolpath is discretized into N waypoints [[p_0, o_0, Δt_0], ..., [p_{N-1}, o_{N-1}, Δt_{N-1}]], where p_j denotes the waypoint position, o_j the orientation, and Δt_j the time interval.
  • the nominal tool velocity for the jth waypoint can be estimated by v_j = ||p_j − p_{j−1}|| / Δt_j.
  • the executed tool velocity can likewise be estimated by v̂_j = ||p̂_j − p̂_{j−1}|| / Δt_j, where p̂_j denotes the adjusted waypoint position.
  • This deviation from the nominal tool velocity is relatively small when the adjusted waypoint p̂_j is close to the nominal waypoint p_j.
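  • As a small worked sketch of these estimates (hedged: the indexing convention above is reconstructed from a garbled original), consecutive waypoint positions divided by the commanded time steps give a per-segment speed:

      import numpy as np

      def tool_speed(positions, dt):
          """positions: (N, 3) waypoint positions; dt: (N-1,) time steps.
          Returns the speed of each segment between consecutive waypoints."""
          return np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt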
  • FIG.35 is a graph illustrating a log of the time consumption of each computation cycle during a closed-loop dispensing test run. According to this log (as shown in FIG.35), the cycle time allocated for processing of sensory data and computation of the control command should be no lower than 14ms. A time interval that is too large, on the other hand, could lead to jagged movement of the robot due to coarse temporal discretization of pose commands.
  • FIG.36 is a schematic showing an example of how variation of waypoint position along the adjusted path introduces variation of the time interval between waypoints.
  • time parametrization along the nominal path is configured to be evenly spaced, with a constant updating time interval of 20ms.
  • variation of the time interval is introduced along the resulting adaptive path.
  • a new waypoint (the circular point that is not shaded in) is interpolated with a time interval of 20ms.
  • the current waypoint is skipped.
  • the strategy for time re-parametrization to achieve a relatively consistent time interval Δt* (with tolerable range [Δt_min, Δt_max]) is as follows: (1) if Δt_i is smaller than Δt_max and larger than Δt_min, execute the current waypoint command [p_i, o_i, Δt_i]; (2) if Δt_i is larger than Δt_max, interpolate new waypoints with constant time interval Δt* between the previously commanded waypoint [p_{i−1}, o_{i−1}, Δt_{i−1}] and the current waypoint [p_i, o_i, Δt_i] (FIG.37A); and (3) if Δt_i is smaller than Δt_min, skip the current waypoint [p_i, o_i, Δt_i] for execution (FIG.37B).
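  • A minimal sketch of this three-case strategy follows, assuming linear interpolation of both position and orientation (a simplification; orientations would ordinarily be interpolated with slerp):

      import numpy as np

      def reparametrize(prev_wp, cur_wp, dt_star, dt_min, dt_max):
          """prev_wp/cur_wp: (p, o, dt) tuples with numpy p and o.
          Returns the waypoint command(s) to execute for this step."""
          p0, o0, _ = prev_wp
          p1, o1, dt = cur_wp
          if dt_min <= dt <= dt_max:
              return [cur_wp]                  # case 1: execute as-is
          if dt > dt_max:                      # case 2: subdivide
              n = max(2, int(round(dt / dt_star)))
              return [(p0 + (p1 - p0) * k / n,
                       o0 + (o1 - o0) * k / n,
                       dt_star) for k in range(1, n + 1)]
          return []                            # case 3: skip this waypoint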
  • An additional benefit of time re-parametrization is that the part surface can be sampled with higher temporal resolution compared to the case without time re-parametrization.
  • the conformal toolpath planned based on this surface mesh with higher fidelity may yield improved bead quality with more precise gap height control.
  • the experimental results for the closed-loop dispensing using path streaming techniques without time re-parametrization are shown in FIG.38A.
  • the experimental results for the closed-loop dispensing using path streaming techniques with time re-parametrization are shown in FIG.38B.
  • the plots in the first row show the velocity profiles of the nominal path (“nominal”) and the adjusted path (“planned”) for bead shape compensation.
  • FIGS.39 illustrate results of bead shape compensation with a naïve control law.
  • FIG.39A shows bead width profiles for three runs with good control quality (top plot, bead width converging to the target width), bad control quality (middle plot, bead width diverging from target width), and intermediate control quality (bottom plot, bead width oscillating around target width).
  • FIG.39B shows tool velocity profiles of a nominal path (“nominal”), adjusted/commanded path (“planned”), and executed path (“executed,” smoothed using moving average filter with window size of 20).
  • one of the bead geometric parameters estimated by the bead sensing algorithm can be selected as the dependent variable to be controlled along the toolpath.
  • the bead width can be treated as the dependent variable in this example.
  • the control law can be applied to other bead parameters such as section area without loss of generality.
  • the control law of this disclosure is based on the following constant volume assumption: the volume of the deposited material within a constant time period Δt can be approximated as a constant through two adjacent time steps when Δt is close to zero. This assumption is valid when the material flowrate changes smoothly or remains constant over time.
  • w and w* denote the measured bead width at the current time step and the desired bead width for the next time step, respectively.
  • v and v* denote the measured tool velocity at the current time step and the desired tool velocity for the next time step, respectively. Under the constant volume assumption (treating the bead cross-section as roughly proportional to its width), w·v ≈ w*·v*, which implies a naïve velocity command of the form v* = (w / w*)·v.
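  • A minimal sketch of this naïve update — whose shortcomings are discussed next — is shown below; the gain-weighted blending is an assumption about how the control gain reported in the experiments below (0.2 to 0.8) enters the command, since the original does not spell out the exact form.

      def velocity_command(w, v, w_des, gain=0.8):
          """Constant-volume correction: w * v ~ w_des * v_des."""
          v_raw = v * (w / w_des)          # raw constant-volume command
          return v + gain * (v_raw - v)    # step partway toward the command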
  • the controller did not deliver repeatable quality for bead shape control, with the bead width sometimes diverging from or oscillating around the desired width.
  • the tool velocity profiles from a test run expose an issue with the controller (as in the example of FIG.39B), namely, that the adjusted tool velocity command computed by the control law is overly oscillatory and cannot effectively correct bead shape errors. Additionally, even if these oscillatory velocity commands could potentially compensate for bead shape errors to an extent, the actual velocity profile of the executed toolpath could vary from the commanded values due to the dynamic limits of the robot arm.
  • the causes of the oscillatory behavior of the computed velocity commands are threefold: (1) Delayed measurement of the bead width w.
  • a laser line from the bead scanner is projected to a location with an offset distance from the nozzle (as shown in FIG.40B).
  • Control commands computed based on this delayed measurement could lead to instability issues (oscillatory motion) and overshoots in bead width response (as shown in FIG.40A).
  • (2) High noise levels in the bead width measurement w (as shown in FIG.39A) and the tool velocity measurement v (as shown in FIG.40C). These noise values are transferred to the controller without any noise compensation, thereby resulting in oscillatory velocity commands v*.
  • (3) Long duration of the transient state for bead width w and tool velocity v (as shown in FIGS.41).
  • Bead width is in a transient state after a change of tool velocity or when a fluctuation of flowrate is present within the extrusion system (as shown in FIG.41A).
  • Tool velocity is in a transient state after a tool velocity set point is commanded and before the set point is reached (as shown in FIG.41B).
  • the duration of this transient state depends on the gain settings of the robot motion controller and the acceleration/deceleration limit(s) of the robot.
  • the constant volume assumption could be invalid when using these transient-state variables to compute control commands, resulting in errors in bead width tracking.
  • FIGS.40 illustrate one or more potential issues with the naïve control law for bead shape compensation.
  • FIG.40A is a schematic (top view) of a nozzle (circle with solid-line border) traversing through a straight-line path.
  • the tracking error of bead width is caused by the delayed sensing of the bead scanner (with a vertical line indicating the laser line).
  • the deposited bead is indicated by the filled-in (or shaded-in) region, and the desired bead width is indicated by two dashed lines.
  • the laser line is indicated by a vertical line intersecting the deposited bead and the desired bead.
  • FIG.40B shows delayed sensing distance for bead scanner.
  • FIG.40C shows tool velocity profile read from robot controller.
  • FIGS.41 illustrate a transient state of bead width and tool velocity along the dispense path.
  • FIG. 41A is a plot of bead width profile along a dispense path, with the transient state indicated by a box with dashed boundary lines (and labeled “transient state”), and steady state indicated by a box with dashed boundary lines (and labeled “steady state”).
  • FIG.41B is a plot of tool velocity profile along a dispense path, with the commanded velocity indicated by the “planned” curve and the measured/executed velocity indicated by the “executed_raw” curve.
  • the transient state of executed velocity is indicated by boxes with dashed boundary lines (labeled “transient state”).
  • FIGS.42 illustrate schematics of steady state checkers.
  • FIG.42A illustrates steady state checking for bead width measurement.
  • the “return false” and “return true” regions indicate time steps where False and True values are returned by the algorithm, respectively.
  • FIG.42B illustrates steady state checking for tool velocity measurement.
  • the “return false” regions indicate time steps with zero steady state distance d (return False), and also indicate time steps with increasing steady state distance d (return False).
  • the “return true” region indicates time steps where steady state distance d is larger than the threshold value d_thres (return True).
  • FIG.42C is a schematic showing the minimum steady state distance d_thres equal to the delayed sensing distance of the bead scanner.
  • Bead_steady_state_checker — inputs: sliding average of bead width w̄; standard deviation of bead width σ_w; desired bead width w*; maximum allowable standard deviation σ_max; dead zone for bead width tracking error ε_w. If σ_w ≤ σ_max, the bead width is in steady state; a new control command is warranted only if, in addition, the tracking error |w̄ − w*| falls outside the dead zone ε_w.
  • Velocity_steady_state_checker — if the tool velocity is in steady state at the current time step, increment the steady state distance d by the distance traveled in this time step; otherwise, reset d to 0. If d ≥ d_thres, return True (ready for a new control command); otherwise, return False (not ready for a new control command).
  • in the Velocity_steady_state_checker presented above, the minimum steady state distance d_thres can be set equal to or larger than the delayed sensing distance of the bead scanner (relating to FIG.42C), in order to synchronize tool velocity with the delayed bead width measurement.
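  • The two checkers can be sketched as follows (a minimal illustration mirroring the reconstructed pseudocode above; the threshold values are tuning assumptions, not values from this disclosure):

      class SteadyStateCheckers:
          def __init__(self, sigma_max, eps_w, d_thres):
              self.sigma_max = sigma_max  # max allowable width std
              self.eps_w = eps_w          # dead zone for tracking error
              self.d_thres = d_thres      # >= delayed sensing distance
              self.d = 0.0                # accumulated steady distance

          def bead_ready(self, w_avg, sigma_w, w_des):
              """Steady (low oscillation) and outside the dead zone."""
              steady = sigma_w <= self.sigma_max
              return steady and abs(w_avg - w_des) > self.eps_w

          def velocity_ready(self, velocity_is_steady, step_distance):
              """Accumulate travel distance while velocity is steady;
              reset on any transient; ready once d reaches d_thres."""
              self.d = self.d + step_distance if velocity_is_steady else 0.0
              return self.d >= self.d_thres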
  • the “nominal” curve indicates nominal velocity
  • the “planned” curve indicates commanded/adjusted velocity
  • the “executed_raw” curve indicates raw velocity measurement
  • the “executed_smooth” curve indicates filtered velocity measurement.
  • the “steady state” regions indicate time steps where steady state distance d is incremented.
  • in the bead width plot, the “desired width” line indicates the desired width
  • the “sliding average of bead width” curve above the “desired width” line indicates a smoothed bead width measurement
  • the “sliding std of bead width” curve below the “desired width” line indicates the standard deviation (“std”) of the bead width measurement.
  • the “Deadzone” region indicates dead zone of bead width tracking error
  • the “oscillation threshold” region indicates range of allowable standard deviation that indicates a steady state of bead width.
  • the thick vertical lines indicate time steps for bead width condition checking when steady state distance d reaches the target threshold.
  • FIG.43 shows tool velocity and bead width data recorded from a test run with bead shape control.
  • the filter window sizes for sliding average of bead width and tool velocity were set to be 50 and 20 samples, respectively, in this instance.
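  • For illustration, a sliding average of this kind is a one-line convolution (a sketch; the production filter may differ in edge handling):

      import numpy as np

      def sliding_average(samples, window):
          """Moving average, e.g., window=50 for bead width, 20 for velocity."""
          return np.convolve(samples, np.ones(window) / window, mode="valid")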
  • FIGS.44 illustrate experimental results demonstrating the effectiveness of the bead shape controller in achieving different desired bead widths: 10mm (FIG.44A), 8mm (FIG.44B), and 5mm (FIG.44C). Control gain is 0.8.
  • FIGS.45 illustrate experimental results with control gains of 0.2 (FIG.45A), 0.5 (FIG.45B), and 0.8 (FIG.45C). Desired bead width is 5mm. Wait time at the beginning of the dispense process is 2 seconds (compensating for delayed extrusion of hot melt adhesive). As such, FIGS.45 illustrate experimental results demonstrating how control gain affects tool velocity commands and the resulting bead shape.
  • FIGS.46 illustrate experimental results with wait time at the beginning of dispense process being 2 seconds (FIG.46A) and 3 seconds (FIG.46B). Desired bead width is 5mm. Control gain is 0.8.
  • the dispensing experiments that produced the results shown in FIGS.46 were conducted under different wait time settings at the beginning of the toolpath, to demonstrate the effectiveness of the bead shape controller under different initial conditions.
  • FIG.47 shows experimental results of dispensing on a relatively flatter substrate (FIG.47A) and a relatively more curved substrate (FIG.47B) with online part shape and bead shape compensation. Control gain is 0.8. Desired bead width is 5mm.
  • The techniques described in this disclosure may be implemented, at least in part, within one or more processors, including one or more microprocessors, CPUs, GPUs, DSPs, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), processing circuitry (e.g., fixed function circuitry, programmable circuitry, or any combination of fixed function circuitry and programmable circuitry), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components.
  • a control unit comprising hardware may also perform one or more of the techniques of this disclosure.
  • Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various operations and functions described in this disclosure.
  • any of the described units, modules or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware or software components or integrated within common or separate hardware or software components.
  • the techniques described in this disclosure may also be embodied or encoded in a computer-readable medium, such as a computer-readable storage medium, containing instructions.
  • Computer readable storage media may include random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, a hard disk, a CD-ROM, a floppy disk, a cassette, magnetic media, optical media, or other computer readable media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The present disclosure relates to processes and systems for modifying a workpiece. A process includes capturing at least one image of the workpiece, processing the at least one image, obtaining a nominal toolpath, measuring a surface of the workpiece to obtain a workpiece surface measurement, generating an updated toolpath, generating a planned path for the workpiece modification equipment, scanning a modification of the workpiece performed by workpiece modification equipment, determining modifier parameters, and modifying the workpiece by the workpiece modification equipment according to the planned path and the modifier parameters. The modification may be to dispense a material onto the workpiece. A system includes image capture equipment, inline sensors, a robot motion controller, an offline registration unit, a nominal toolpath, an online localization unit, a path planning unit, a modifier comprising workpiece modification equipment, and a modification equipment controller.
PCT/US2023/033377 2022-09-21 2023-09-21 Systèmes et techniques de modification de pièce WO2024064281A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202263408526P 2022-09-21 2022-09-21
US63/408,526 2022-09-21
US202363539307P 2023-09-19 2023-09-19
US63/539,307 2023-09-19

Publications (1)

Publication Number Publication Date
WO2024064281A1 true WO2024064281A1 (fr) 2024-03-28

Family

ID=88504896

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/033377 WO2024064281A1 (fr) 2022-09-21 2023-09-21 Systèmes et techniques de modification de pièce

Country Status (1)

Country Link
WO (1) WO2024064281A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4568816A (en) * 1983-04-19 1986-02-04 Unimation, Inc. Method and apparatus for manipulator welding apparatus with improved weld path definition
DE102005051533B4 (de) * 2005-02-11 2015-10-22 Vmt Vision Machine Technic Bildverarbeitungssysteme Gmbh Verfahren zur Verbesserung der Positioniergenauigkeit eines Manipulators bezüglich eines Serienwerkstücks
WO2020174397A1 (fr) 2019-02-25 2020-09-03 3M Innovative Properties Company Système de distributeur d'adhésif à filaments
WO2020174394A1 (fr) 2019-02-25 2020-09-03 3M Innovative Properties Company Distributeur d'adhésif à filaments
WO2021074744A1 (fr) 2019-10-14 2021-04-22 3M Innovative Properties Company Distribution d'adhésif liquide automatisée à l'aide d'une modélisation linéaire et d'une optimisation
WO2021124081A1 (fr) 2019-12-20 2021-06-24 3M Innovative Properties Company Procédé d'application d'adhésif à filaments
US20220048194A1 (en) * 2019-01-23 2022-02-17 Nuovo Pignone Tecnologie - S.R.L. Industrial robot apparatus with improved tooling path generation, and method for operating an industrial robot apparatus according to an improved tooling path
US20220193709A1 (en) * 2019-05-07 2022-06-23 Dürr Systems Ag Coating method and corresponding coating installation


Similar Documents

Publication Publication Date Title
US20180326591A1 (en) Automatic detection and robot-assisted machining of surface defects
US10427300B2 (en) Robot program generation for robotic processes
Chen et al. The autonomous detection and guiding of start welding position for arc welding robot
US20150235367A1 (en) Method of determining a position and orientation of a device associated with a capturing device for capturing at least one image
JP2020521641A (ja) 工具経路の自動生成
US10921816B2 (en) Method and apparatus for producing map based on hierarchical structure using 2D laser scanner
CN111028340B (zh) 精密装配中的三维重构方法、装置、设备及系统
Chen et al. Seam tracking of large pipe structures for an agile robotic welding system mounted on scaffold structures
KR20240036606A (ko) 작업 표면을 처리하기 위한 시스템 및 방법
JP2023539728A (ja) ロボット補修制御システム及び方法
Xiao et al. An automatic calibration algorithm for laser vision sensor in robotic autonomous welding system
US20230419531A1 (en) Apparatus and method for measuring, inspecting or machining objects
US6597967B2 (en) System and method for planning a tool path along a contoured surface
WO2024064281A1 (fr) Systèmes et techniques de modification de pièce
Jing et al. Rgb-d sensor-based auto path generation method for arc welding robot
Hanh et al. Simultaneously extract 3D seam curve and weld head angle for robot arm using passive vision
Yu et al. Multiseam tracking with a portable robotic welding system in unstructured environments
Hanh et al. Visual guidance of a sealant dispensing robot for online detection of complex 3D-curve seams
Zaki et al. On the use of low-cost 3D stereo depth camera to drive robot trajectories in contact-based applications
Kwon et al. Rescan strategy for time efficient view and path planning in automated inspection system
Hanh et al. 3D complex curve seam tracking using industrial robot based on CAD model and computer vision
Koch et al. Evaluating continuous-time slam using a predefined trajectory provided by a robotic arm
US11951635B1 (en) Automatically identifying locations to apply sealant and applying sealant to a target object
CN116100562B (zh) 多机器人协同上下料的视觉引导方法及系统
CN117693720A (zh) 用于处理工作表面的系统和方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23793128

Country of ref document: EP

Kind code of ref document: A1