EP3271104A1 - Armband-based systems and methods for controlling welding equipment using gestures and similar motions - Google Patents

Armband-based systems and methods for controlling welding equipment using gestures and similar motions

Info

Publication number
EP3271104A1
Authority
EP
European Patent Office
Prior art keywords
welding
operator
command
gesture
gestures
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP16705609.2A
Other languages
English (en)
French (fr)
Inventor
Todd Gerald Batzler
Robert Arthur Batzler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Illinois Tool Works Inc
Original Assignee
Illinois Tool Works Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/659,853 (US10987762B2)
Application filed by Illinois Tool Works Inc filed Critical Illinois Tool Works Inc
Publication of EP3271104A1
Status: Withdrawn

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
        • B23 - MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
            • B23K - SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
                • B23K 9/00 - Arc welding or cutting
                    • B23K 9/095 - Monitoring or automatic control of welding parameters
                        • B23K 9/0956 - Monitoring or automatic control of welding parameters using sensing means, e.g. optical
                    • B23K 9/10 - Other electric circuits therefor; Protective circuits; Remote controls
                        • B23K 9/1087 - Arc welding using remote control
    • G - PHYSICS
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06F - ELECTRIC DIGITAL DATA PROCESSING
                • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
                    • G06F 1/16 - Constructional details or arrangements
                        • G06F 1/1613 - Constructional details or arrangements for portable computers
                            • G06F 1/163 - Wearable computers, e.g. on a belt
                            • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
                                • G06F 1/1684 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
                                    • G06F 1/1694 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
                • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
                        • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • Welding can be performed in an automated manner or in a manual manner.
  • While welding may be automated in certain contexts, a large number of applications continue to exist where manual welding operations are used (e.g., where a welding operator uses a welding gun or torch to perform the welding).
  • The success of welding operations relies heavily on proper use of the welding equipment; e.g., the success of manual welding depends on proper use of a welding gun or torch by the welding operator. For instance, torch angle, contact-tip-to-work distance, travel speed, and aim are parameters that may dictate the quality of a weld. Even experienced welding operators, however, often have difficulty monitoring and maintaining these important parameters throughout the welding process.
  • FIG. 1 shows an example arc welding system in accordance with aspects of this disclosure.
  • FIG. 2 shows example welding equipment in accordance with aspects of this disclosure.
  • FIG. 3 is a block diagram illustrating an example use of a motion detection system operating within a welding system, in accordance with aspects of the present disclosure.
  • FIG. 4 is a block diagram illustrating an example motion detection system, in accordance with aspects of the present disclosure.
  • FIG. 5 is a block diagram illustrating an example gesture accessory device that may be used in conjunction with motion detection systems, communicating therewith wirelessly, in accordance with aspects of the present disclosure.
  • FIG. 6 is a flowchart illustrating an example method for communicating a welding command to a welding system from a motion detection system, in accordance with aspects of the present disclosure.
  • FIG. 7 is a flowchart illustrating an example method for associating a welding command with a particular gesture or motion, in accordance with aspects of the present disclosure.
  • FIG. 8 shows an example gesture-based armband device for use in remotely controlling welding operations, in accordance with aspects of this disclosure.
  • FIG. 9 shows example circuitry of a gesture-based armband device for use in remotely controlling welding operations, in accordance with aspects of this disclosure.
  • FIG. 10 is a flowchart illustrating an example method for providing feedback during gesture-based remote control of welding operations, in accordance with aspects of the present disclosure.
  • FIG. 1 shows an example arc welding system in accordance with aspects of this disclosure.
  • Shown in FIG. 1 is an example welding system 10 in which an operator 18 is wearing welding headwear 20 and welding a workpiece 24 using a torch 504 to which power is delivered by equipment 12 via a conduit 14, with weld monitoring equipment 28 being available to monitor welding operations.
  • the equipment 12 may comprise a power source, optionally a source of an inert shield gas and, where wire/filler material is to be provided automatically, a wire feeder.
  • the welding system 10 of FIG. 1 may be configured to form a weld joint 512 by any known technique, including electric welding techniques such as shielded metal arc welding (i.e., stick welding), metal inert gas welding (MIG), tungsten inert gas welding (TIG), and resistance welding.
  • the welding equipment 12 may be arc welding equipment that provides a direct current (DC) or alternating current (AC) to a consumable or non-consumable electrode 16 (better shown, for example, in FIG. 5C) of a torch 504.
  • the electrode 16 delivers the current to the point of welding on the workpiece 24.
  • the operator 18 controls the location and operation of the electrode 16 by manipulating the torch 504 and triggering the starting and stopping of the current flow.
  • an arc 26 is developed between the electrode and the workpiece 24.
  • the conduit 14 and the electrode 16 thus deliver current and voltage sufficient to create the electric arc 26 between the electrode 16 and the workpiece.
  • the arc 26 locally melts the workpiece 24 and welding wire or rod supplied to the weld joint 512 (the electrode 16 in the case of a consumable electrode or a separate wire or rod in the case of a non-consumable electrode) at the point of welding between electrode 16 and the workpiece 24, thereby forming a weld joint 512 when the metal cools.
  • the weld monitoring equipment 28 may be used to monitor welding operations.
  • the weld monitoring equipment 28 may be used to monitor various aspects of welding operations, particularly in real-time (that is as welding is taking place).
  • the weld monitoring equipment 28 may be operable to monitor arc characteristics such as length, current, voltage, frequency, variation, and instability. Data obtained from the weld monitoring may be used (e.g., by the operator 18 and/or by an automated quality control system) to ensure proper welding.
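  • Purely as an illustration (not from the patent), the sketch below shows one way the monitored arc "variation" and "instability" could be summarized from sampled arc voltage; the function name, sampling scheme, and threshold are assumptions.

```python
import statistics

def arc_stability_report(voltage_samples, instability_threshold=1.5):
    """Summarize monitored arc voltage samples (volts).

    A standard deviation above the threshold is flagged here as instability;
    both the statistic and the threshold value are illustrative assumptions.
    """
    return {
        "mean_voltage": round(statistics.mean(voltage_samples), 2),
        "voltage_variation": round(statistics.pstdev(voltage_samples), 2),
        "unstable": statistics.pstdev(voltage_samples) > instability_threshold,
    }

print(arc_stability_report([24.1, 24.3, 23.9, 24.0, 24.2]))  # steady arc
print(arc_stability_report([24.1, 27.8, 21.5, 26.9, 22.0]))  # fluctuating arc
```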
  • the equipment 12 and headwear 20 may communicate via a link 25 via which the headwear 20 may control settings of the equipment 12 and/or the equipment 12 may provide information about its settings to the headwear 20.
  • the link may be wireless, wired, or optical.
  • the operator may need to interact with equipment used in welding operations and/or in weld monitoring of welding operations.
  • the operator 18 may need to interact with the welding equipment 12 and/or with the weld monitoring equipment 28, such as to control the equipment (e.g., adjust settings of the equipment), to obtain real-time feedback information (e.g., real-time equipment status, weld monitoring related information, etc.), and the like.
  • the welding environment may impose certain limitations on possible solutions for interacting with the equipment used in conjunction with welding operations.
  • welding environments may be cluttered (e.g., with various pieces of equipment, cords, connectors, etc.), and/or may have spatial limitations (e.g., tight work space, awkward position or placement of workpieces, etc.).
  • adding more devices into such environments to enable interacting with the welding or weld monitoring equipment may be undesirable, particularly when adding devices that require wired connectors.
  • Use of such systems or devices, particularly ones that take too much space, may result in additional undesirable clutter and/or may take up valuable weld cell space.
  • use of wired connections or connectors (e.g., cords) may limit the use distance for such devices (e.g., from power sources, or from the equipment the operator is trying to interact with), and may create safety problems (e.g., trip hazards).
  • small control devices configured to utilize non-wired based solutions (e.g., wireless communication technologies; audio, video, and/or sensory input/output (I/O) solutions, etc.) may be used.
  • control devices implemented in accordance with the present disclosure may be small enough such that they can be worn by the operator, or integrated into equipment or clothing that the operator directly uses or wears during welding operations (e.g., welding helmets, welding gloves, etc.).
  • the devices may be small enough that they can be worn by the operator, e.g., on a belt or an arm, or in a welding helmet or welding gloves.
  • control devices may be configured to support and use wireless technologies (e.g., WiFi or Bluetooth) to perform the communications necessary for the interfacing operations.
  • control devices may use or support motion detection and recognition, to enable detecting and recognizing gestures or motions of operators.
  • control devices may be configured or programmed to recognize particular gestures by the operator, which the operator may perform when attempting to remotely interact with a particular piece of equipment in the welding environment (e.g., to remotely control that piece of equipment or to obtain data therefrom).
  • these gestures may mimic the actions that the operator would perform when directly interacting with equipment.
  • the gesture or motion may comprise the operator mimicking the turning of a volume control knob, which when detected and interpreted as such may be communicated to the corresponding equipment to trigger a response by the equipment, as if the knob actually existed.
  • control devices may be implemented such that they may be affixed to a particular part of the operator's body or clothing worn by the operator.
  • An example embodiment may comprise an elastic and/or form-fitting armband, such that it may be affixed to one of the operator's arms.
  • the motion detection and recognition may be performed using any solutions suitable for use in conjunction with welding arrangements in accordance with the present disclosure.
  • Such solutions may comprise, for example, devices or components worn by the operator or integrated into equipment or clothing that the operator directly uses or wears during welding operations.
  • a detection component, which may be implemented as a standalone device or as a built-in component (e.g., of a control device), may be configured to detect gestures or motions of the operator.
  • a motion recognition component, which may be implemented as a standalone device or as a built-in component (e.g., of a control device), may be configured to receive the detected gestures or motions, and to determine when/if the detected gestures or motions correspond to particular user input (e.g., a command).
  • the motion recognition component identifies the welding command from the plurality of welding commands based on successful matching of the detected gesture or motion with a gesture or motion associated with a welding command, and transmits the identified welding command to a component of the welding system.
  • the motion detection and recognition function may be configurable and/or programmable. For example, in addition to a normal mode of operation, where the motion recognition component is simply used to identify and trigger particular welding commands based on detected gestures or movement, the motion recognition component may also operate in a "configuration" or "programming" mode. When operating in such a configuration (or programming) mode, the motion recognition component may receive one or more detected gestures or motions as well as a welding-related command (e.g., provided by the operator using suitable means), may associate the welding-related command with at least one of the detected gestures or motions, and may store the association for future comparison.
  • FIG. 2 shows example welding equipment in accordance with aspects of this disclosure.
  • the equipment 12 of FIG. 2 comprises an antenna 202, a communication port 204, communication interface circuitry 206, user interface module 208, control circuitry 210, power supply circuitry 212, wire feeder module 214, and gas supply module 216.
  • the antenna 202 may be any type of antenna suited for the frequencies, power levels, etc. used by the communication link 25.
  • the communication port 204 may comprise, for example, an Ethernet over twisted pair port, a USB port, an HDMI port, a passive optical network (PON) port, and/or any other suitable port for interfacing with a wired or optical cable.
  • the communication interface circuitry 206 is operable to interface the control circuitry 210 to the antenna 202 and/or port 204 for transmit and receive operations. For transmit, the communication interface 206 may receive data from the control circuitry 210 and packetize the data and convert the data to physical layer signals in accordance with protocols in use on the communication link 25. For receive, the communication interface may receive physical layer signals via the antenna 202 or port 204, recover data from the received physical layer signals (demodulate, decode, etc.), and provide the data to control circuitry 210.
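  • The packetizing mentioned above is not specified further in the text; the sketch below merely illustrates the idea with an assumed length-prefixed JSON framing, which stands in for whatever protocol is actually in use on the communication link 25.

```python
import json
import struct

def frame_command(command):
    """Serialize a command dict and prepend a 4-byte big-endian length header."""
    payload = json.dumps(command).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def parse_frame(frame):
    """Recover the command dict from a length-prefixed frame (the receive path)."""
    (length,) = struct.unpack(">I", frame[:4])
    return json.loads(frame[4:4 + length].decode("utf-8"))

frame = frame_command({"target": "wire_feeder", "action": "set_speed", "value": 350})
assert parse_frame(frame)["action"] == "set_speed"
```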
  • the user interface module 208 may comprise electromechanical interface components (e.g., screen, speakers, microphone, buttons, touchscreen, etc.) and associated drive circuitry.
  • the user interface 208 may generate electrical signals in response to user input (e.g., screen touches, button presses, voice commands, etc.).
  • Driver circuitry of the user interface module 208 may condition (e.g., amplify, digitize, etc.) the signals and provide them to the control circuitry 210.
  • the user interface 208 may generate audible, visual, and/or tactile output (e.g., via speakers, a display, and/or motors/actuators/servos/etc.) in response to signals from the control circuitry 210.
  • the control circuitry 210 comprises circuitry (e.g., a microcontroller and memory) operable to process data from the communication interface 206, the user interface 208, the power supply 212, the wire feeder 214, and/or the gas supply 216; and to output data and/or control signals to the communication interface 206, the user interface 208, the power supply 212, the wire feeder 214, and/or the gas supply 216.
  • the power supply circuitry 212 comprises circuitry for generating power to be delivered to a welding electrode via conduit 14.
  • the power supply circuitry 212 may comprise, for example, one or more voltage regulators, current regulators, inverters, and/or the like.
  • the voltage and/or current output by the power supply circuitry 212 may be controlled by a control signal from the control circuitry 210.
  • the power supply circuitry 212 may also comprise circuitry for reporting the present current and/or voltage to the control circuitry 210.
  • the power supply circuitry 212 may comprise circuitry for measuring the voltage and/or current on the conduit 14 (at either or both ends of the conduit 14) such that reported voltage and/or current is actual and not simply an expected value based on calibration.
  • the wire feeder module 214 is configured to deliver a consumable wire electrode to the weld joint 512.
  • the wire feeder 214 may comprise, for example, a spool for holding the wire, an actuator for pulling wire off the spool to deliver to the weld joint 512, and circuitry for controlling the rate at which the actuator delivers the wire.
  • the actuator may be controlled based on a control signal from the control circuitry 210.
  • the wire feeder module 214 may also comprise circuitry for reporting the present wire speed and/or amount of wire remaining to the control circuitry 210.
  • the wire feeder module 214 may comprise circuitry and/or mechanical components for measuring the wire speed, such that reported speed is actual value and not simply an expected value based on calibration.
  • the gas supply module 216 is configured to provide shielding gas via conduit 14 for use during the welding process.
  • the gas supply module 216 may comprise an electrically controlled valve for controlling the rate of gas flow.
  • the valve may be controlled by a control signal from control circuitry 210 (which may be routed through the wire feeder 214 or come directly from the control 210 as indicated by the dashed line).
  • the gas supply module 216 may also comprise circuitry for reporting the present gas flow rate to the control circuitry 210.
  • the gas supply module 216 may comprise circuitry and/or mechanical components for measuring the gas flow rate such that reported flow rate is actual and not simply an expected value based on calibration.
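  • Several of the modules above report measured (actual) values rather than calibrated expectations. The toy model below illustrates that idea for the wire feeder; the class, numbers, and lag behavior are purely illustrative assumptions, not part of the disclosure.

```python
class WireFeeder:
    """Toy model: the measured wire speed lags the commanded speed."""

    def __init__(self):
        self.commanded_ipm = 0.0  # inches per minute
        self.measured_ipm = 0.0

    def set_speed(self, ipm):
        self.commanded_ipm = ipm

    def read_measured_speed(self):
        # Simulate the actuator gradually approaching the commanded value, so the
        # reported value is a measurement rather than the calibrated expectation.
        self.measured_ipm += 0.5 * (self.commanded_ipm - self.measured_ipm)
        return self.measured_ipm

feeder = WireFeeder()
feeder.set_speed(300.0)
for _ in range(5):
    print(f"reported wire speed: {feeder.read_measured_speed():.1f} ipm")
```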
  • FIG. 3 is a block diagram illustrating an example use of a motion detection system operating within a welding system, in accordance with aspects of the present disclosure. Shown in FIG. 3 is a gesture-based welding arrangement 310, implemented in accordance with an example embodiment, comprising a welding system 312 and a motion detection system 314.
  • the motion detection system 314 may comprise detection circuitry 316, a motion recognition system 318, and communications circuitry 320.
  • the detection circuitry 316 may comprise an accessory device 322 (e.g., sensors, accelerometers, computing devices, tags, etc. which may be incorporated into a worn device or clothing article) which may be remote from the motion detection system 314, such as disposed on or near a welding operator 324, but may communicate with the motion detection system 314 via wired or wireless systems.
  • the motion detected by the motion detection system 314 is translated into one or more command signals that the welding system 312 utilizes to change a welding operating parameter.
  • the detection circuitry 316 may comprise one or more cameras or a sensor system that may detect gestures and/or movements of the welding operator 324. It should be noted that in some situations, the detection circuitry 316 may comprise the accessory device 322. Further, the detection circuitry 316 may be configured to detect the motion of the accessory device 322. For example, the detection circuitry 316 may capture the movement of a sensor disposed within the accessory device 322. In other situations, the detection circuitry 316 directly detects the gestures and/or movements of the welding operator 324 without the intermediary accessory device 322. For example, the detection circuitry 316 may identify the welding operator and capture the movements of the welding operator (e.g., movement of the welding operator's joints, appendages, etc.).
  • the detection circuitry 316 receives motion information from the accessory device 322, which is used to detect the gestures and/or movements of the welding operator 324.
  • the accessory device 322 may detect the movements of the welding operator, such as a blinking of the eye or a pinching of the fingers, and may process and communicate the detected movements to the motion detection system 314.
  • the detection circuitry 316 may incorporate various types of audio/video detection technologies to enable it to detect the positions, movements, gestures, and/or motions of the welding operator 324.
  • the detection circuitry 316 may comprise digital cameras, video cameras, infrared sensors, optical sensors (e.g., video/camera), radio frequency energy detectors, sound sensors, vibration sensors, heat sensors, pressure sensors, magnetic sensors, and the like to detect the positions and/or movements of the welding operator 324 and/or to detect the motion of the accessory device 322.
  • any of these audio/video detection technologies may also be incorporated into the accessory device 322.
  • the cameras may be incorporated with motion-detection components that are triggered by motion, heat, or vibration, and that may be used to detect the motion of the welding operator 324 or the accessory device 322.
  • infrared sensors may be utilized to measure infrared light radiating from the welding operator 324 or the accessory device 322 to determine or detect gestures or motions.
  • similarly, other types of sensors (e.g., heat, vibration, pressure, sound, magnetic, etc.) may be used, and a plurality of sensors may be positioned in a variety of locations (on or disposed remote from the motion detection system 314) to determine these parameters, and thereby the motion of the welding operator 324 or the accessory device 322, with greater accuracy.
  • one or more different types of sensors may be incorporated into the detection circuitry 316.
  • a heat sensor may be configured to detect motion of the welding operator 324 or the accessory device 322.
  • radio frequency energy sensors may be utilized to detect the motion of the welding operator 324 or the accessory device 322 via radar, microwave, or tomographic motion detection.
  • the motion recognition system 318 may translate the detected motions into various welding commands that correspond to the detected motions. After determining the welding command that corresponds to the detected motions, the motion recognition system 318 may send the welding command to the welding system 312 via the communications circuitry 320.
  • the welding system 312, or more particularly, a component of the welding system 312, may implement the welding command.
  • the motion recognition system 318 may receive a detected motion from the detection circuitry 316 and may interpret the detected motion as a command to stop the function of a component of the welding system 312. Further, the communications circuitry 320 may send a signal to the welding system 312 to stop the component of the welding system 312, as desired by the welding operator 324.
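  • A minimal sketch of that recognition-and-dispatch step is shown below; the gesture names, command fields, and the `send` callable standing in for the communications circuitry 320 are assumptions, not part of the disclosure.

```python
# Assumed, illustrative gesture vocabulary; the disclosure leaves it open.
GESTURE_TO_COMMAND = {
    "palm_up_swipe": {"target": "wire_feeder", "action": "increase_speed"},
    "fist_hold": {"target": "power_source", "action": "stop"},
}

def recognize_and_dispatch(detected_gesture, send):
    """Look up the welding command for a detected gesture and transmit it.

    `send` stands in for the communications circuitry (e.g., a function that
    writes to the wireless link); unknown gestures are simply ignored.
    """
    command = GESTURE_TO_COMMAND.get(detected_gesture)
    if command is not None:
        send(command)
    return command

recognize_and_dispatch("fist_hold", send=print)  # interpreted as a "stop" command
```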
  • the welding system 312 may comprise various components that can receive the control command signals.
  • the systems and methods described herein may be utilized with a gas metal arc welding (GMAW) system, other arc welding processes (e.g., FCAW, FCAW-G, GTAW (TIG), SAW, SMAW), and/or other welding processes (e.g., friction stir, laser, hybrid).
  • the welding system 312 may comprise a welding power source 326, a welding wire feeder 328, a welding torch 330, and a gas supply system 332.
  • various other welding components 334 can receive the control command signals from the motion detection system 314.
  • the welding power supply unit 326 generally supplies power to the welding system 312 and other various accessories, and may be coupled to the welding wire feeder 328 via a weld cable.
  • the welding power supply 326 may also be coupled to a workpiece (not illustrated) using a lead cable having a clamp.
  • the welding wire feeder 328 is coupled to the welding torch 330 via a weld cable in order to supply welding wire and power to the welding torch 330 during operation of the welding system 312.
  • the welding power supply 326 may couple and directly supply power to the welding torch 330.
  • the welding power supply 326 may generally comprise power conversion circuitry that receives input power from an alternating current power source 454 (e.g., the AC power grid, an engine/generator set, or a combination thereof), conditions the input power, and provides DC or AC output power. As such, the welding power supply 326 may power the welding wire feeder 328 that, in turn, powers the welding torch 330, in accordance with demands of the welding system 312.
  • the illustrated welding system 312 may comprise a gas supply system 332 that supplies a shielding gas or shielding gas mixtures to the welding torch 330.
  • a variety of control devices are often provided to enable an operator to control one or more parameters of the welding operation.
  • a control panel is provided with various knobs and buttons that enable the welding operator to alter the amperage, voltage, or any other desirable parameter of the welding process.
  • the welding operator may control a wide variety of welding parameters on one or more components of the welding system 312 (e.g., voltage output, current output, a wire feed speed, pulse parameters, etc.).
  • a wide variety of welding parameters may be controlled via detected positions, gestures, and/or motions received by detection circuitry 316, and translated into various welding commands via the motion recognition system 318.
  • a welding operator may wish to adjust the speed of the wire feed from the weld location. Accordingly, the welding operator may gesture a preset motion that the motion detection system 314 will detect, recognize, and translate into a command for adjusting the wire feed speed. Further, the welding system 312 receives the command, and implements the command to adjust the wire feed speed as desired. In some situations, the operator may implement several successive gestures for a series of commands that operate the welding system 312 in a desired manner. For example, to adjust a voltage output of the welding system 312, the operator may first provide a gesture that is associated with the welding power source 326, and that is indicative of wanting to control a feature of the welding power source 326.
  • the operator may gesture to increase or decrease the voltage output of the welding system 312.
  • the motion detection system 314 may translate and store each welding command before communicating the final welding command to the welding system 312.
  • the motion detection system 314 may communicate each welding command directly to the welding system 312.
  • the motion detection system 314 may receive only one welding command, but may interpret the welding command into one or more control signals. Accordingly, one or more successive control signals may be implemented by the welding system 312, where each control signal is one step of the received welding command.
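  • The idea of interpreting one welding command into a series of stepwise control signals might look like the sketch below; the step size, field names, and voltage example are assumptions.

```python
def expand_command(command, current_voltage, step=0.5):
    """Expand a single 'set voltage' welding command into stepwise control signals.

    Each yielded signal represents one step of the received command, as described
    above; the field names and step size are illustrative assumptions.
    """
    target = command["value"]
    voltage = current_voltage
    while abs(target - voltage) > step:
        voltage += step if target > voltage else -step
        yield {"target": "power_source", "action": "set_voltage", "value": round(voltage, 2)}
    yield {"target": "power_source", "action": "set_voltage", "value": target}

for signal in expand_command({"action": "set_voltage", "value": 22.0}, current_voltage=20.0):
    print(signal)
```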
  • the motion detection system 314 is coupled to a cloud network 336 having a storage 338 that may comprise a library 440 of gestures, each associated with a particular welding command and/or a type of welding command.
  • the motion recognition system 318 may utilize the cloud 336 to determine one or more welding commands based on motion detected by the detection circuitry 316.
  • the cloud 336 may refer to various evolving arrangements, infrastructure, networks, and the like that are typically based upon the Internet. The term may refer to any type of cloud, including client clouds, application clouds, platform clouds, infrastructure clouds, server clouds, and so forth.
  • Such arrangements will generally allow various entities to receive and store data related to welding applications, transmit data to welders and entities in the welding community for welding applications, provide software as a service (SaaS), provide various aspects of computing platforms as a service (PaaS), provide various network infrastructures as a service (IaaS), and so forth.
  • included in this term should be various types and business arrangements for these products and services, including public clouds, community clouds, hybrid clouds, and private clouds.
  • the cloud 336 may be a shared resource accessible to various welding entities, and each welding entity (e.g., operator, group of operators, company, welding location, facility, etc.) may contribute welding gestures associated with welding commands, which may be utilized by the motion recognition system 318 at a later time.
  • multiple motion detection components or elements may be used to enhance motion detection and/or interfacing based thereon.
  • multiple accessory devices 322 may be used.
  • where the accessory device 322 is an armband-based device, for example, the operator may wear an accessory device 322 on each arm.
  • the use of multiple motion detection devices (or elements) may allow detecting (and thus recognizing based thereon) more complex gestures or movements (e.g., complex three-dimensional (3D) gestures or movements).
  • FIG. 4 is a block diagram illustrating an example motion detection system, in accordance with aspects of the present disclosure. Shown in FIG. 4 is the motion detection system 314, which may comprise the detection circuitry 316, the motion recognition system 318, and the communications circuitry 320.
  • the detection circuitry 316 may incorporate various types of audio/video detection technologies to enable it to detect the positions, movements, gestures, and/or motions of the welding operator 324 and/or the accessory device 322.
  • the communications circuitry 320 enables wired or wireless communications between the motion detection system 314 and the cloud 336, the welding system 312, and/or the accessory device 322.
  • the motion detection system 314 also may comprise a memory 441, a processor 442, a storage medium 444, input/output (I/O) ports 446, and the like.
  • the processor 442 may be any type of computer processor or microprocessor capable of executing computer-executable code.
  • the memory 441 and the storage 444 may be any suitable articles of manufacture that can serve as media to store processor-executable code, data, or the like. These articles of manufacture may represent computer-readable media (i.e., any suitable form of memory or storage) that may store the processor-executable code used by the processor 442 to perform the presently disclosed techniques.
  • the motion recognition system 318 may receive motion and/or gesture data related to the welding operator 324 and/or the accessory device 322 via wired and/or wireless communications. In particular, the motion recognition system 318 interprets the received data to determine the welding commands (e.g., welding control signals) for one or more components of the welding system 312.
  • the memory 441 and the storage 444 may also be used to store the data, the respective interpretation of the data, and the welding command that corresponds to the data within the library 440.
  • the illustrated embodiment depicts the storage 444 of the motion recognition system 318 storing information related to the data and the welding command corresponding to the data (as further described below), but it should be noted that in other embodiments, the memory 441 and/or the cloud 336 (as described with respect to FIG. 3) may be utilized to store the same information.
  • the library 440 may comprise a particular type of motion and/or a particular motion (e.g., gesture) and a welding command associated with that motion or type of motion.
  • a mode of operation engine 448 within the processor 442 of the motion recognition system 318 may be utilized to change the mode of operation of the motion recognition system 318.
  • the mode of operation engine 448 may be set to, for example, an operating mode or a configuration mode.
  • the motion recognition system 318 is programmed to associate a particular motion or gesture with a particular welding command.
  • the operator 324 may provide an input to the motion recognition system 318 via the I/O ports 446 indicating a welding command for a particular component of the welding system 312.
  • the welding operator 324 may then position himself in a manner that allows the detection circuitry 316 to detect the particular motion or gestures that the operator 324 intends to be associated with the inputted welding command.
  • the motion recognition system 318 may store the pattern of motion and/or the gesture collected by the detection circuitry 316 within the library 440, and may associate the motion with the respective welding command.
  • the operator 324 may provide an input to the motion recognition system 318 to enter into the configuration mode and associate a particular motion or gesture with a particular welding command for a particular component of the welding system 312, such as the welding power source 326.
  • the motion recognition system 318 may detect the gestures of the operator 324 such as, for example, holding one arm out straight with a palm out and fingers up, while the operator 324 is in the view window of the detection circuitry 316.
  • the operator 324 need not be within the view of the detection circuitry 316, but may instead be wearing the accessory device 322, which may comprise one or more sensors (e.g., accelerometers) that track the motion of the operator 324 and communicate the motion to the detection circuitry 316.
  • the detection circuitry 316 may be configured to track the movement of the accessory device 322 from the motion recognition system 318, and more specifically, may be tracking the movement of the accessory device 322 and/or one or more sensors disposed within the accessory device 322.
  • the motion recognition system 318 may store the motion and/or gestures as data within the gesture library 440.
  • the data is associated with the welding command or task, and may be tagged as such within the storage 444 and/or memory 441.
  • the operator 324 may configure an upwards motion of the palm of the hand as a gesture associated with increasing the wire speed of the welding system 312.
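  • A minimal sketch of such a configuration-mode association is shown below, assuming the library 440 is represented as a list of (pattern, command) pairs and that motion is captured as a short acceleration trace; none of these representations are specified by the text.

```python
class GestureLibrary:
    """Minimal stand-in for library 440: stored motion patterns and their commands."""

    def __init__(self):
        self._entries = []  # list of (pattern, welding_command) pairs

    def associate(self, pattern, command):
        """Configuration mode: store a captured pattern with its welding command."""
        self._entries.append((list(pattern), command))

    def entries(self):
        return list(self._entries)

library = GestureLibrary()
# Mirroring the example above: an upward palm motion mapped to increasing wire speed.
library.associate(pattern=[0.0, 0.4, 0.9, 1.3],  # assumed vertical-acceleration trace
                  command={"target": "wire_feeder", "action": "increase_speed"})
print(library.entries())
```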
  • the motion recognition system 318 may enter and exit the configuration mode by receiving some input from the operator 324 that does not comprise any detected motion or gesture.
  • the configuration mode may be secure and may not be compromised by any inadvertent motions or gestures.
  • the mode of operation engine 448 of the processor 442 of the motion recognition system 318 is set to an operating mode.
  • the welding operator 324 may be performing a welding task with the welding system 312, and may have the motion detection system 314 enabled.
  • the operator 324 may wish to adjust a welding parameter via one or more gestures or motion.
  • the detection circuitry 316 receives the gesture and/or motion in one of the methods described above, and the welding command is retrieved from the library 440 based on the detected gestures and/or motions of the operator 324 (or the accessory device 322).
  • the motion recognition system 318 may compare the detected motion to the motions or patterns of motion stored in the library 440 and determine that the motion corresponds to increasing the wire speed of the welding system 312.
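  • The text does not specify how a detected motion is compared against the stored motions; the sketch below uses a plain Euclidean distance with an assumed threshold purely to illustrate the lookup.

```python
import math

def match_gesture(detected, library_entries, max_distance=0.5):
    """Compare a detected motion trace against stored patterns (operating mode).

    Uses a simple Euclidean distance between equal-length traces; the actual
    matching method and threshold are not specified in the text.
    """
    best_command, best_distance = None, float("inf")
    for pattern, command in library_entries:
        if len(pattern) != len(detected):
            continue
        distance = math.dist(pattern, detected)
        if distance < best_distance:
            best_command, best_distance = command, distance
    return best_command if best_distance <= max_distance else None

entries = [([0.0, 0.4, 0.9, 1.3], {"target": "wire_feeder", "action": "increase_speed"})]
print(match_gesture([0.1, 0.5, 0.8, 1.2], entries))  # close enough -> wire speed command
```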
  • the motion detection system 314 may support gestures for disabling (and/or enabling) gesture-based functions.
  • the library 440 may comprise an association between particular gesture (or movement) and a command for disabling the motion detection system 314, and the mode of operation engine 448 may support a non-operating mode.
  • when such a gesture is detected, the disabling command may be issued and executed. Disabling the motion detection system 314 may be done by powering off the system.
  • notifications may be generated and communicated to other systems that are interacting with the motion detection system 314, such as the welding system 312, to ensure that the system(s) take necessary steps to account for the motion detection system 314 being powered off.
  • where disabling comprises powering off the motion detection system 314, (re)enabling the motion detection system 314 may require manually or directly (re)powering on the system.
  • disabling the motion detection system 314 may simply comprise shutting down various components and/or functions therein, and transitioning to a minimal-function state, where only functions and/or components required for re-enabling the motion detection system 314 remain running.
  • when an enabling gesture (which may be the same as the disabling gesture) is detected via the accessory device 322, the gesture may trigger re-enabling or re-activating the motion detection system 314 into full operation mode. Transitioning to such minimum-functionality states may improve power consumption (as many functions and/or components are shut down or powered off) without affecting the ability to resume motion detection related operations when needed (and to do so quickly).
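  • The disable/re-enable behavior described above can be pictured as a small state machine; the sketch below is an assumption about how such full and minimal-function states might be tracked, not part of the disclosure.

```python
class MotionDetectionPower:
    """Assumed sketch of the full / minimal-function states described above."""

    def __init__(self):
        self.state = "full"

    def handle_gesture(self, gesture):
        if self.state == "full" and gesture == "disable":
            # Shut down everything except what is needed to catch the re-enable gesture.
            self.state = "minimal"
        elif self.state == "minimal" and gesture == "enable":
            self.state = "full"
        return self.state

power = MotionDetectionPower()
print(power.handle_gesture("disable"))  # -> "minimal"
print(power.handle_gesture("enable"))   # -> "full"
```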
  • the motion detection system 314 (and/or other components or devices used to support the gesture-related operations) may be configured to account for disabling and/or deactivating the system by providing other suitable means for the (re)enabling of the system.
  • the re-enabling may be configured or implemented such that it may be triggered by the user by something other than visually recognized gestures.
  • such non-visual means may comprise a button/control, or sensors that may sense (rather than visually perceive) movement of the user.
  • such visually-based motion detection systems may be configured to supplement remote gesture detection with local (on-person) gesture detection elements for detecting re-enabling gestures.
  • the library 440 may comprise a plurality of motions 452 and a corresponding welding command 454 for each motion.
  • the welding commands may include any command to control the welding system 312, and/or components of the welding system 312, such as the welding power source 326, the gas supply system 332, the welding wire feeder 328, the welding torch 330, or other welding components 334 (e.g., grinders, lights, etc.) of the welding system 312.
  • the welding commands may include, but are not limited to, starting a device, stopping a device, increasing a speed or output of a device, decreasing a speed or output of a device, and the like.
  • welding commands related to the gas supply system 332 may include adjusting a gas flow rate.
  • welding commands related to the welding wire feeder 328 may include adjusting a welding wire speed, changing between push/pull feeding systems, and the like. Further, welding commands related to the welding power source 326 may include varying a voltage or power routed to the welding torch 330. Moreover, the library 440 may include other commands associated with various motions such as disabling the motion recognition system 318, limiting the control or ability of an operator to engage with the motion recognition system 318, or the like.
  • FIG. 5 is a block diagram illustrating an example gesture accessory device that may be used in conjunction with motion detection systems, communicating therewith wirelessly, in accordance with aspects of the present disclosure. Shown in FIG. 5 is the motion detection system 314 being operatively coupled to the accessory device 322 in accordance with an example embodiment. In this regard, in various embodiments the accessory device 322 may be in wired or wireless communication with the motion detection system 314.
  • the detection circuitry 316 may comprise the accessory device 322. Further, the detection circuitry 316 may be configured to directly track the movement of the accessory device 322 and/or one or more sensors disposed within the accessory device 322 from the motion detection system 314. Specifically, the accessory device 322 may comprise sensors 556 (e.g., infrared, optical, sound, magnetic, vibration, etc.), accelerometers, computing devices, smart phones, tablets, GPS devices, wireless sensor tags, one or more cameras, or the like that are configured to aid the detection circuitry 316 in detecting the motion and/or gestures of the operator 324.
  • the accessory device 322 may be incorporated into a clothing article that is worn, disposed, or carried by the operator 324 (e.g., bracelet, wristlet, anklet, necklace, etc.), or may be a device that is held by the operator 324.
  • the sensor systems 556 are configured to gather gesture and/or motion data from the operator 324, similar in manner to the detection circuitry 316.
  • the motion and/or gesture data gathered may be digitized via one or more processors within processing circuitry 558, which may also be associated with a memory 560.
  • the processing circuitry 558 may be any type of computer processor or microprocessor capable of executing computer-executable code.
  • the memory 560 may be any suitable articles of manufacture that can serve as media to store processor-executable code, data, or the like. These articles of manufacture may represent computer-readable media (i.e., any suitable form of memory or storage) that may store the processor-executable code used by the processing circuitry to perform the presently disclosed techniques.
  • the digitized data may be communicated via wired and/or wireless communications circuitry 562 to the motion detection system 314.
  • the motion recognition system 318 interprets the received data to determine the welding commands (e.g., welding control signals) for one or more components of the welding system 312, and transfers the welding commands to the welding system 312 via the communications circuitry 320 of the motion detection system 314. It should be noted that the communications between the components of the gesture-based welding arrangement 310 might be over secure channels.
  • the communications circuitry 562 of the gesture accessory device 322 also communicates with the welding system 312.
  • the gesture accessory device 322 may be paired with the welding system 312 before welding operations are commenced, to ensure that the gestures provided by the operator 324 are securely intended for the paired devices. In this manner, even though a plurality of gesture accessory devices 322 may be proximate to the welding system 312, only the paired device 322 is able to provide gesture commands to the welding system 312 via the motion detection system 314.
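  • A sketch of the pairing check is shown below; the device identifier and the idea of comparing it against a stored paired identifier are assumptions used only to illustrate that gestures from non-paired devices are ignored.

```python
PAIRED_DEVICE_ID = "armband-07"  # assumed identifier established during pairing

def accept_gesture_command(device_id, command):
    """Accept a gesture command only if it comes from the paired accessory device."""
    if device_id != PAIRED_DEVICE_ID:
        return None  # gestures from other nearby accessory devices are ignored
    return command

print(accept_gesture_command("armband-07", {"action": "increase_speed"}))  # accepted
print(accept_gesture_command("armband-99", {"action": "increase_speed"}))  # ignored -> None
```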
  • the accessory device 322 may comprise I/O ports 564.
  • the accessory device 322 may comprise a display 566 that enables an operator 324 to visualize the welding commands sent by the motion detection system 314 to the welding system 312.
  • the display 566 may be utilized to receive and display various welding related information from the welding system 312, such as the status of the current operating parameters, the status of welding commands (e.g., control signals) sent to the welding system 312, the status of a wireless connection, whether or not the welding commands were implemented, errors or alerts, or generally any information related to the gesture-based welding arrangement 310.
  • non-visual feedback may be used during gesture-based interactions (e.g., control) of welding operations to allow for feedback to the operator in a non-visual manner, e.g., without requiring the operator to use the display or similar visual interfaces, which would result in the operator taking his/her eyes off the workpiece(s) being welded.
  • the accessory device 322 may be configured to provide feedback to the operator with respect to gesture-based interaction (e.g., control) of welding operations using non-visual means such as audio (e.g., buzzer), haptic (e.g., vibration), or similar output.
  • the feedback may, for example, acknowledge receipt of a particular gesture.
  • for example, two long vibrations may indicate recognition of an "increase amperage" gesture, while one short vibration may indicate recognition of a "decrease amperage" gesture.
  • the system may then, for example, wait for a second confirmatory gesture before putting the recognized gesture into effect.
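  • The acknowledgment patterns described above might be driven as in the sketch below; the pulse durations and the `vibrate` callable standing in for the vibration motor are assumptions (the example patterns simply mirror the two-long/one-short description, and the confirmatory-gesture step is omitted).

```python
import time

# Assumed vibration patterns, in seconds, mirroring the description above:
# two long pulses acknowledge an "increase amperage" gesture, one short pulse
# acknowledges a "decrease amperage" gesture.
FEEDBACK_PATTERNS = {
    "increase_amperage": [0.6, 0.6],
    "decrease_amperage": [0.2],
}

def acknowledge(gesture, vibrate=lambda seconds: time.sleep(seconds)):
    """Play the vibration pattern acknowledging a recognized gesture.

    `vibrate` stands in for driving the vibration motor for a given duration.
    """
    for pulse in FEEDBACK_PATTERNS.get(gesture, []):
        vibrate(pulse)
        time.sleep(0.1)  # short gap between pulses

# Example with a stand-in that just prints instead of vibrating.
acknowledge("increase_amperage", vibrate=lambda s: print(f"buzz {s:.1f}s"))
```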
  • non-visual feedback may be particularly desirable where use of a display or visual output may be neither practical nor desirable (e.g., for safety reasons), or to obviate the need for any such display altogether.
  • supporting non-visual feedback may allow wearing the accessory device 322 in a manner which would not be possible where feedback is provided visually; e.g., it may be worn underneath a welding jacket and thus be protected from the welding environment.
  • the use of non-visual feedback may be more desirable in welding operations as operators may not be able to clearly see visual feedback since the operator would be wearing specialized helmets/hoods and/or be looking at the display and trying to read the visual feedback through special welding lenses or glasses.
  • characteristics of the utilized non-visual output may be adjusted to provide different feedback.
  • the vibration may be adjusted (e.g., in terms of one or more of the duration, frequency, and intensity of the vibration) to indicate different feedback.
  • characteristics of the non-visual output may be configured and/or adjusted by the operator to indicate particular feedback.
  • I/O ports 564 of the accessory device 322 may be used to specify a particular type of non-visual feedback (e.g., audio or vibration), and/or to specify particular characteristics (e.g., the duration, frequency, and intensity of vibration) for each particular desired feedback.
  • While the accessory device 322 and the motion detection system 314 are illustrated and described as two separate elements in FIGs. 3-5, the disclosure need not be so limited. Accordingly, in an example embodiment, the accessory device 322 and the motion detection system 314 (and their functions) may be combined into a single device, which may preferably be configured or designed for portable use in the same manner described with respect to the accessory device 322. As such, this single device would be operable to perform the functions of the accessory device 322 (e.g., gesture detection, sensory input/output) as well as the functions of the motion detection system 314 (e.g., motion recognition, command determination, command association, etc.).
  • FIG. 6 is a flowchart illustrating an example method for communicating a welding command to a welding system from a motion detection system in accordance with aspects of the present disclosure. Shown in FIG. 6 is a flow chart of a method 600, comprising a plurality of example steps (represented as blocks 602-608), for communicating a welding command to the welding system 312 from the motion detection system 314 of FIG. 3 in accordance with an example embodiment.
  • the method 600 may comprise enabling an operating mode of the motion detection system 314 on the mode of operation engine 448 via the I/O ports 446 (step 602).
  • the motion detection system 314 may be configured to detect a motion and/or gesture and utilize the gesture library 440 to translate the detected motion and/or gesture into a welding command.
  • the method 600 may comprise detecting the gesture and/or motion (step 604).
  • the detection circuitry 316 may incorporate various types of audio/video detection technologies to enable it to detect the positions, movements, gestures, and/or motions of the welding operator 324 and/or the accessory device 322. Further, the method 600 may comprise determining a welding command that is associated with the detected motion and/or gesture (step 606).
  • the motion recognition system 318 interprets the received data to determine the welding commands (e.g., welding control signals) for one or more components of the welding system 312. The welding command may be determined by comparing the received data with data within the gesture library 440.
  • the method 600 may comprise communicating the welding command to the welding system 312 (step 608).
  • the welding commands may comprise any command to control the welding system 312, and/or components of the welding system 312, such as the welding power source 326, the gas supply system 332, the welding wire feeder 328, the welding torch 330, or other welding components 334 of the welding system 312.
  • the gesture and/or motion provided by the operator 324 and/or the accessory device 322 may be utilized to control one or more welding parameters of the welding system 312.
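  • Putting the steps of method 600 together, a minimal sketch could look like the following, with `detect`, `lookup`, and `communicate` as assumed stand-ins for the detection circuitry 316, a gesture-library lookup, and the communications circuitry 320.

```python
def method_600(detect, lookup, communicate):
    """Sketch of the flow of method 600 (blocks 604-608).

    Block 602 (enabling the operating mode) is assumed to have already occurred;
    the callables are placeholders, not part of the disclosure.
    """
    gesture = detect()             # block 604: detect the gesture and/or motion
    command = lookup(gesture)      # block 606: determine the associated welding command
    if command is not None:
        communicate(command)       # block 608: communicate it to the welding system
    return command

# Example wiring with trivial stand-ins.
method_600(detect=lambda: "palm_up_swipe",
           lookup={"palm_up_swipe": {"target": "wire_feeder", "action": "increase_speed"}}.get,
           communicate=print)
```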
  • FIG. 7 is a flowchart illustrating an example method for associating a welding command with a particular gesture or motion in accordance with aspects of the present disclosure. Shown in FIG. 7 is a flow chart of a method 700, comprising a plurality of example steps (represented as blocks 702-708) for associating a particular welding command with a particular gesture and/or motion in accordance with an example embodiment.
  • the motion detection system 314 may be configured in an operating mode to detect a motion and/or gesture and utilize the gesture library 440 to translate the detected motion and/or gesture into a welding command.
  • the illustrated method 700 includes enabling a configuration mode (e.g., learning, pairing, association, etc.) of the motion detection system 314 on the mode of operation engine 448 via the I/O ports 446 (step 702).
  • the motion detection system 314 may be configured to associate and store within the memory 441 and/or the storage 444 a particular motion or gesture with a particular welding command.
  • the method 700 may comprise the motion detection system 314 receiving a welding command that the operator 324 wishes to set a gesture and/or motion for via the I/O ports 446 (step 704).
  • the welding command may be for any component of the welding system 312, as discussed above.
  • the welding operator 324 may then position himself in a manner that allows the detection circuitry 316 of the motion detection system 314 to detect the particular motion or gestures that the operator 324 intends to be associated with the inputted welding command (step 706).
  • the method 700 may comprise the motion recognition system 318 storing the motion and/or the gesture collected by the detection circuitry 316 within the library 440, and associating the motion with the respective welding command (step 708).
  • associations may be made and stored within the library 440 of the cloud network 336, and retrieved by the local systems as desired.
  • the pre-associated global welding commands may be overwritten with local welding commands that are more personal to the welding operator 324.
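  • The overriding of pre-associated global (cloud) commands by local ones can be illustrated as a simple dictionary merge; the gesture names and commands below are assumptions made only for illustration.

```python
# Assumed representation: each library maps a gesture name to a welding command.
cloud_library = {
    "palm_up_swipe": {"target": "wire_feeder", "action": "increase_speed"},
    "fist_hold": {"target": "power_source", "action": "stop"},
}
local_library = {
    # A local association that overrides the pre-associated global command.
    "fist_hold": {"target": "gas_supply", "action": "close_valve"},
}

# Local (operator-specific) associations take precedence over cloud defaults.
effective_library = {**cloud_library, **local_library}
print(effective_library["fist_hold"])  # -> the locally overridden command
```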
  • FIG. 8 shows an example gesture-based armband device for use in remotely controlling welding operations in accordance with aspects of this disclosure. Shown in FIG. 8 is a gesture-based control device 800 that is worn by an operator (e.g., operator 18) during welding operations.
  • the gesture-based control device 800 may comprise suitable circuitry operable to support interacting with and controlling equipment used in welding operations and/or monitoring of welding operations.
  • the gesture-based control device 800 may be configured such that it may be affixed to an operator (e.g., operator 18), and/or to an item worn by or directly handled by the operator (e.g., welding helmet, welding glove, welding torch, etc.).
  • the gesture-based control device 800 may be configured as an armband-based implementation.
  • the gesture-based control device 800 may be operable to receive operator input, which is specifically provided in the form of gestures and/or movement.
  • the gesture-based control device 800 may be operable to detect gestures and/or motions by the operator, and then process the detected gestures and/or motions.
  • the processing may comprise determining whether (or not) the detected gestures and/or motions correspond to a particular command (or action(s)).
  • particular gestures and/or motions may be associated with specific welding commands, or with actions that may be performed or handled directly by the device itself (e.g., generating new associations for detected gestures and/or motions, disabling/(re-)enabling gesture/motion related components or operations, etc.).
  • the gesture-based control device 800 may be configured to communicate with other devices or systems, such as when detected gestures or motions are determined to be associated with particular welding commands.
  • the gesture-based control device 800 may preferably be operable to perform such communications wirelessly.
  • the gesture-based control device 800 may be operable to connect to other devices or systems (e.g., welding and/or weld monitoring equipment) wirelessly, e.g., by setting up and using connections based on suitable wireless technologies, such as WiFi, Bluetooth, etc.
  • the armband-based gesture-based control device 800 of FIG. 8 comprises an elastic band 810 (e.g., wrist band), which may allow the operator 18 to wear the gesture-based control device 800 on his/her arm (as shown in the top part of FIG. 8).
  • a dedicated armband arrangement may be used instead.
  • Such an armband arrangement may comprise the band 810 and a holder 820 to which the gesture-based control device 800 may be attached.
  • the holder 820 may comprise suitable securing means (e.g., a clip), configured to secure the device 800 into the armband arrangement.
  • the gesture-based control device 800 may be configured to support non-visual feedback.
  • the gesture-based control device 800 may be operable to support tactile feedback, such as vibration, or audio feedback.
  • the gesture-based control device 800 may comprise vibration and/or audio components, which may generate vibrations and/or audio signals as means of providing feedback.
  • the vibration component may be operable to vary the frequency, amplitude, and duration of vibrations to provide different feedback, e.g., indicating recognition of certain gestures or validating that a corresponding action has taken place. This feedback would be detectable by the operator since the device is in intimate contact with the operator's arm.
  • the audio component may provide feedback, e.g., by providing particular audio outputs (e.g., a particular one of a plurality of predefined ring tones, similar to the ones typically used in phones) with each being uniquely indicative of certain feedback.
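  • The feedback behavior described above can be pictured with the sketch below. The event names, pattern values, and the haptics/audio driver interfaces are assumptions made for illustration; the disclosure only states that vibration frequency, amplitude, and duration, or distinct audio outputs, may be varied to convey different feedback.

```python
# Hypothetical mapping of feedback events to vibration patterns and audio cues.
# Event names, pattern values, and the haptics/audio driver interfaces are
# assumptions; the disclosure only says frequency, amplitude, duration, and
# distinct tones may be varied to distinguish different feedback.
from dataclasses import dataclass


@dataclass(frozen=True)
class VibrationPattern:
    frequency_hz: float   # pulse rate of the vibration motor
    amplitude: float      # relative drive strength, 0.0 .. 1.0
    duration_ms: int


FEEDBACK_MAP = {
    "gesture_recognized":  (VibrationPattern(40.0, 0.5, 150), "tone_short"),
    "command_sent":        (VibrationPattern(60.0, 0.8, 300), "tone_double"),
    "confirmation_needed": (VibrationPattern(20.0, 1.0, 600), "tone_rising"),
    "gesture_rejected":    (VibrationPattern(10.0, 0.3, 100), "tone_low"),
}


def provide_feedback(event: str, haptics, audio) -> None:
    """Drive assumed haptic/audio controllers (exposing vibrate()/play()) so the
    operator can distinguish feedback without looking away from the weld."""
    pattern, tone = FEEDBACK_MAP[event]
    haptics.vibrate(pattern.frequency_hz, pattern.amplitude, pattern.duration_ms)
    audio.play(tone)
```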
  • the gesture-based control device 800 may be a dedicated device that is designed and implemented specifically for use in interacting with and controlling welding components (e.g., welding and/or weld monitoring equipment). In some example implementations, however, devices which may not be specifically designed or made as a "control device" may be nonetheless configured for use as such. In this regard, devices having capabilities and/or characteristics that may be necessary for functioning as a control device, in the manner described in the present disclosure, may be used.
  • devices that may be used include those having (1) suitable communicative capabilities (e.g., wireless technologies such as WiFi, Bluetooth, or the like), (2) suitable resources for detecting gesture and/or motion interactions (e.g., sensors), (3) suitable processing capabilities for processing and analyzing detected motions or gestures, or (4) suitable resources for providing feedback (e.g., keypads, buttons, textual interface, or touchscreens), that are also sufficiently small and/or light to be conveniently worn by the operator and/or integrated into the items worn or directly used by the operator.
  • devices such as smartphones, smartwatches, and the like may be used as "control devices.”
  • the interfacing functions may be implemented in software (e.g., as an application running on the smartphone, smartwatch, or similar device).
  • FIG. 9 shows example circuitry of a gesture-based armband device for use in remotely controlling welding operations in accordance with aspects of this disclosure. Shown in FIG. 9 is circuitry of an example gesture-based control device 900. The gesture-based control device 900 may correspond to the device 800 of FIG. 8.
  • the gesture-based control device 900 may comprise a main controller (e.g., a central processing unit (CPU)) circuitry 910, a communication interface circuitry 920, an audio controller circuitry 930, a haptic controller circuitry 940, and a sensor controller 950.
  • the main controller circuitry 910 is operable to process data, execute particular tasks or functions, and/or control operations of other components in the device 900.
  • the main controller circuitry 910 may receive sensory data from the sensor controller circuitry 950, which may correspond to gestures or movement by the operator wearing the device 900.
  • the main controller circuitry 910 may process such sensory data by, for example, applying pre-programmed gesture recognition algorithms (and/or using pre-stored information, such as a gesture based library) to discern if the sensory data indicates a gesture or motion belonging to a set of trained gestures. If the gesture belongs to a set of trained gestures, the main controller circuitry 910 may identify the associated control actions.
  • the main controller circuitry 910 may then control other components of the device in response to any identified actions or commands. For example, where a particular welding command is identified, the main controller circuitry 910 may send data and/or signals to the communication interface circuitry 920 for transmission of the command thereby (e.g., to the appropriate welding or weld monitoring equipment via WiFi or Bluetooth signal 921). The main controller circuitry 910 may also generate or determine suitable feedback (e.g., acknowledgment of detection of gestures/motions, validation of detected gestures/motion, acknowledgment of taking corresponding action, etc.) and may transmit data and/or signals to feedback related components (e.g., audio controller circuitry 930 or haptic controller circuitry 940) to effectuate providing the feedback.
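  • One way to picture the main controller's processing is the sketch below. The nearest-template matcher, the threshold value, and the comms/feedback interfaces are assumptions for illustration only; the disclosure refers generally to pre-programmed gesture recognition algorithms and a gesture library, not to this particular matcher.

```python
# Illustrative sketch of the main-controller loop described for circuitry 910:
# match incoming sensory data against trained gesture templates and dispatch
# the associated welding command. The nearest-template matcher and interface
# names are assumptions, not the disclosed algorithm.
import math
from typing import Dict, List, Optional

Frame = List[float]  # one vector of sensor readings


def distance(a: Frame, b: Frame) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def classify(frame: Frame, templates: Dict[str, Frame],
             threshold: float = 0.5) -> Optional[str]:
    """Return the closest trained gesture, or None if nothing is close enough
    (i.e., the motion does not belong to the set of trained gestures)."""
    best_name, best_dist = None, float("inf")
    for name, template in templates.items():
        d = distance(frame, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None


def process_frame(frame: Frame, templates: Dict[str, Frame],
                  commands: Dict[str, str], comms, feedback) -> None:
    """One pass: recognize the gesture, look up its welding command, transmit it,
    and acknowledge via the feedback callable."""
    gesture = classify(frame, templates)
    if gesture is None:
        return                      # not a trained gesture; ignore the motion
    feedback("gesture_recognized")
    command = commands.get(gesture)
    if command is not None:
        comms.send(command)         # e.g., handed to the communication interface (920)
        feedback("command_sent")
```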
  • the communication interface circuitry 920 is operable to handle communications in the gesture-based control device 900.
  • the communication interface circuitry 920 may be configured to support various wired or wireless technologies.
  • the communication interface circuitry 920 may be operable to, for example, configure, setup, and/or use wired and/or wireless connections, such as over suitable wired/wireless interface(s) and in accordance with wireless and/or wired protocols or standards supported in the device, to facilitate transmission and/or reception of signals (e.g., carrying data). Further, the communication interface circuitry 920 may be operable to process transmitted and/or received signals, in accordance with applicable wired or wireless technologies.
  • Examples of wireless technologies that may be supported and/or used by the communication interface circuitry 920 may comprise wireless personal area network (WPAN), such as Bluetooth (IEEE 802.15); near field communication (NFC); wireless local area network (WLAN), such as WiFi (IEEE 802.11); cellular technologies, such as 2G/2G+ (e.g., GSM/GPRS/EDGE, and IS-95 or cdmaOne) and/or 3G/3G+ (e.g., CDMA2000, UMTS, and HSPA); 4G, such as WiMAX (IEEE 802.16) and LTE; Ultra-Wideband (UWB); etc.
  • Examples of wired technologies that may be supported and/or used by the communication interface circuitry 920 comprise Ethernet (IEEE 802.3), Universal Serial Bus (USB) based interfaces, etc.
  • Examples of signal processing operations that may be performed by the device 900 comprise, for example, filtering, amplification, analog-to-digital conversion and/or digital-to-analog conversion, up-conversion/down-conversion of baseband signals, encoding/decoding, encryption/decryption, modulation/demodulation, etc.
  • the communication interface circuitry 920 may be configured to use an antenna 922 for wireless communications and a port 924 for wired communications.
  • the antenna 922 may be any type of antenna suited for the frequencies, power levels, etc. required for wireless interfaces/protocols supported by the gesture-based control device 900.
  • the antenna 922 may particularly support WiFi and/or Bluetooth transmission/reception.
  • the port 924 may be any type of connector suited for the communications over wired interfaces/protocols supported by the gesture-based control device 900.
  • the port 924 may comprise an Ethernet over twisted pair port, a USB port, an HDMI port, a passive optical network (PON) port, and/or any other suitable port for interfacing with a wired or optical cable.
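  • As a simple illustration of command transmission over such an interface, the sketch below sends a welding command as a JSON payload over a plain TCP connection. The payload format, host address, and port are assumptions; the disclosure identifies candidate technologies (WiFi, Bluetooth, Ethernet, USB, etc.) but not a wire format.

```python
# Minimal sketch of transmitting a welding command over a network link.
# The JSON payload, host/port, and use of a plain TCP socket are assumptions
# for illustration; the disclosure does not specify a wire format.
import json
import socket


def send_welding_command(command: str, value: float,
                         host: str = "192.168.1.50", port: int = 9000) -> None:
    """Serialize a command and push it over a TCP connection to the equipment."""
    payload = json.dumps({"command": command, "value": value}).encode("utf-8")
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(payload + b"\n")


# Example (commented out; requires listening equipment at the assumed address):
# send_welding_command("set_wire_feed_speed", 300.0)
```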
  • the audio controller circuitry 930 is operable to handle audio input and/or output in the gesture-based control device 900.
  • the audio controller circuitry 930 may be operable to drive one or more audio I/O elements 932 (e.g., audio transducers, each being operable to convert audio signals into electromagnetic signals or vice versa).
  • the audio controller circuitry 930 may generate and/or condition (e.g., amplify or digitize) data corresponding to audio input or output (signals) in the device 900.
  • the audio controller circuitry 930 may be operable to generate data or signals that cause an audio transducer to output an audio signal 931, which represents feedback provided to the operator using (e.g., wearing) the device 900 during welding operations in response to detection of gestures or motions by the operators.
  • the haptic controller circuitry 940 is operable to handle haptic input and/or output in the gesture-based control device 900.
  • the haptic controller circuitry 940 may be operable to drive one or more haptic elements 942 (e.g., a buzzer or vibration transducer).
  • the haptic controller circuitry 940 may generate and/or condition (e.g., amplify or digitize) data corresponding to haptic output (signals) in the device 900.
  • the haptic controller circuitry 940 may be operable to generate data or signals that cause a haptic element to output a haptic signal 941 (e.g., vibration).
  • the haptic signal 941 may provide feedback to the operator using (e.g., wearing) the device 900 during welding operations.
  • the haptic signal 941 may be generated in response to detection of gestures or motions by the operator.
  • the sensor controller circuitry 950 is operable to handle sensory related operations of the device 900.
  • the sensor controller circuitry 950 may be operable to drive or control one or more sensors that may be used in obtaining sensory data germane to operations of the device 900.
  • Various types of sensors may be handled and/or supported by the sensor controller circuitry 950.
  • each muscle sensor 952 may comprise a series of segmented electrodes arranged such that when the muscle sensor 952 is in contact with the operator's body (e.g., forearm) the electrodes may detect differential electrical impulses of nearby muscles.
  • the sensory data obtained in this manner may be used to detect and decode gestures made by the operator using part(s) of the body near the sensors (e.g., the arm, wrist, hand, or fingers of the arm on which the sensors reside).
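  • A small sketch of how such muscle-sensor readings might be reduced to a feature vector is shown below. Using per-channel RMS over a short window is a common surface-EMG technique and is an assumption here, not a detail taken from the disclosure.

```python
# Hypothetical reduction of differential electrode readings from a muscle
# sensor (952) into a feature vector. Per-channel RMS over a short window is a
# common EMG technique and is an assumption, not a disclosed detail.
import math
from typing import List, Sequence


def rms(window: Sequence[float]) -> float:
    return math.sqrt(sum(v * v for v in window) / len(window))


def emg_features(channels: List[List[float]]) -> List[float]:
    """One RMS value per electrode channel; the resulting vector can be fed
    to the gesture classifier running on the main controller."""
    return [rms(ch) for ch in channels]


# Example with three electrode channels and a tiny window of samples.
features = emg_features([[0.01, -0.02, 0.03],
                         [0.10, -0.12, 0.09],
                         [0.00,  0.01, -0.01]])
```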
  • MEMS sensors 954 may also be used. They may be embedded directly into the device 900 and/or within other items worn or used by the operator (e.g., gloves). These sensors may generate sensory data relating to gestures or movements made by the operator (e.g., using hands, fingers, or wrists).
  • the MEMS sensors 954 may comprise gyroscopic-based, accelerometer-based, and/or magnetic-based sensors. The MEMS sensors 954 may generate sensory data, which may be used (separately or in conjunction with sensory data from other sensors, such as muscle sensors 952) in detecting and decoding gestures or motions made by the operator.
  • the movement of the operator's arms may change the orientation of the device 900.
  • the MEMS sensors 954 may generate sensory data that allows a determination of the orientation of the device, and changes thereof.
  • Gyroscopic-based sensors, which can easily be used for sensing in the xyz orthogonal Cartesian coordinate system, may be used individually or in combination with other sensors to sense the overall movement and direction of movement of the entire arm on which the armband resides.
  • the accelerometer-based sensors (particularly when configured for 3D sensing) may detect the pull of gravity, or the gravity vector, so their outputs can be used individually or in combination to determine the orientation of the device on the arm with respect to the generally accepted directions of "down" and "up".
  • the magnetic-based sensors, while perhaps being overwhelmed by the magnetic field in the proximity of the welding arc, could (when the arc is not in operation) determine the direction of "magnetic north," which can be used to direct an operator to move in a particular direction, perhaps to locate the piece of equipment being controlled when it is a long distance from the site at which the operator is working. A sketch of these orientation computations follows this item.
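  • The sketch below shows one way these MEMS outputs could be turned into orientation cues: a gravity ("down") vector from the accelerometer and an approximate heading from the magnetometer. The axis conventions and the tilt-free heading calculation are simplifying assumptions made for illustration.

```python
# Sketch of deriving orientation cues from MEMS sensor outputs (954): the
# accelerometer gives the gravity ("down") vector when the arm is roughly
# still, and the magnetometer gives a rough heading when no arc is active.
# Axis conventions and the level-device heading are simplifying assumptions.
import math


def down_vector(ax: float, ay: float, az: float):
    """Normalized gravity direction from accelerometer readings (in g)."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    return (ax / norm, ay / norm, az / norm)


def heading_deg(mx: float, my: float) -> float:
    """Approximate compass heading (degrees from magnetic north), assuming the
    device is held level; only meaningful when the welding arc is off."""
    return math.degrees(math.atan2(my, mx)) % 360.0


# Example: device lying flat, magnetometer pointing roughly north-east.
print(down_vector(0.0, 0.0, 1.0))   # -> (0.0, 0.0, 1.0)
print(heading_deg(0.7, 0.7))        # -> 45.0
```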
  • the disclosure is not so limited, and as such some of these components may represent separate dedicated 'devices' which are merely coupled to the other components of the device 900.
  • the device 900 or at least a main component thereof (e.g., a component comprising at least the main controller circuitry 910) may be implemented as a rechargeable battery-powered platform, comprising the minimally required resources (e.g., processing or memory) to provide or enable the required detection, analysis, communication, and/or feedback functions.
  • the device 900 may be operable to log and/or store gestures, and/or gesture attempts (e.g., those that fail to match existing predefined gestures), such as to improve gesture recognition through statistical training.
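  • Such logging might look like the sketch below, which appends each gesture attempt (matched or not) to a file for later review or retraining. The JSON-lines format and file handling are assumptions for illustration only.

```python
# Minimal sketch of logging gesture attempts to an append-only file so they can
# later be used for statistical retraining. Format and path are assumptions.
import json
import time
from typing import List, Optional


def log_gesture_attempt(log_path: str, frame: List[float],
                        matched: Optional[str]) -> None:
    entry = {"timestamp": time.time(), "frame": frame, "matched": matched}
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")


# Example: record an attempt that did not match any trained gesture.
# log_gesture_attempt("gesture_log.jsonl", [0.2, 0.1, 0.7], matched=None)
```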
  • FIG. 10 is a flowchart illustrating an example method for providing feedback during gesture-based remote control of welding operations in accordance with aspects of the present disclosure. Shown in FIG. 10 is a flowchart of a method 1000, comprising a plurality of example steps (represented as blocks 1002-1010).
  • In step 1002, a gesture and/or motion of an operator may be detected, such as based on sensory data obtained from sensors placed on or near the operator (or integrated into an item worn or used by the operator).
  • In step 1004, the detected gesture and/or motion may be processed, such as to determine if the gesture and/or motion matches any pre-defined gesture or motion, and whether there are any associated actions corresponding thereto (e.g., a welding command or the programming of a new association).
  • In step 1006, it may be determined what feedback (if any) needs to be provided to the operator.
  • the feedback may be an acknowledgement (e.g., that gesture/motion is detected, that action is determined, or that action is performed), a validation (e.g., requesting operator confirmation), and the like.
  • In step 1008, corresponding output may be generated based on the feedback.
  • the output may be configured based on the feedback itself and the type of the output (e.g., modifying characteristics of output based on particular feedback). For example, with vibration based outputs, characteristics like intensity, frequency, and/or duration may be adjusted to reflect different feedback.
  • In step 1010, the feedback may be output to the operator.
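  • The sketch below strings blocks 1002-1010 together as a single pass of a feedback pipeline. The callables passed in (read_sensors, classify, lookup_action), the haptics/audio interfaces, and the "arc_start" confirmation check are hypothetical and serve only to show how the steps connect.

```python
# Sketch of method 1000 (blocks 1002-1010) as one pass of a feedback pipeline.
# All callables and interfaces are hypothetical; "arc_start" is an arbitrary
# example of an action that might warrant a confirmation request.
def run_feedback_cycle(read_sensors, classify, lookup_action, haptics, audio) -> None:
    frame = read_sensors()                          # 1002: detect gesture/motion
    gesture = classify(frame)                       # 1004: match against pre-defined gestures
    if gesture is None:
        return                                      # nothing recognized, nothing to feed back
    action = lookup_action(gesture)

    # 1006: decide what feedback is needed (acknowledgement vs. validation request).
    needs_confirmation = action == "arc_start"

    # 1008: configure the output for that feedback (stronger/longer for validation).
    freq, amp, dur = (20.0, 1.0, 600) if needs_confirmation else (40.0, 0.5, 150)

    # 1010: output the feedback to the operator.
    haptics.vibrate(frequency_hz=freq, amplitude=amp, duration_ms=dur)
    audio.play("tone_rising" if needs_confirmation else "tone_short")
```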
  • the present methods and systems may be realized in hardware, software, or a combination of hardware and software.
  • the present methods and/or systems may be realized in a centralized fashion in at least one computing system, or in a distributed fashion where different elements are spread across several interconnected computing systems. Any kind of computing system or other apparatus adapted for carrying out the methods described herein is suited.
  • a typical combination of hardware and software may include a general-purpose computing system with a program or other code that, when being loaded and executed, controls the computing system such that it carries out the methods described herein.
  • Another typical implementation may comprise an application specific integrated circuit or chip.
  • Some implementations may comprise a non-transitory machine-readable (e.g., computer readable) medium (e.g., FLASH drive, optical disk, magnetic storage disk, or the like) having stored thereon one or more lines of code executable by a machine, thereby causing the machine to perform processes as described herein.
  • "circuits" and "circuitry" refer to physical electronic components (i.e., hardware) and any software and/or firmware ("code") that may configure the hardware, be executed by the hardware, and/or otherwise be associated with the hardware.
  • a particular processor and memory may comprise a first "circuit” when executing a first set of one or more lines of code and may comprise a second "circuit” when executing a second set of one or more lines of code.
  • "and/or" means any one or more of the items in the list joined by "and/or".
  • "x and/or y" means any element of the three-element set {(x), (y), (x, y)}.
  • "x and/or y" means "one or both of x and y".
  • "x, y, and/or z" means any element of the seven-element set {(x), (y), (z), (x, y), (x, z), (y, z), (x, y, z)}.
  • "x, y and/or z" means "one or more of x, y and z".
  • "example" means serving as a non-limiting example, instance, or illustration.
  • the terms "e.g." and "for example" set off lists of one or more non-limiting examples, instances, or illustrations.
  • circuitry is "operable" to perform a function whenever the circuitry comprises the necessary hardware and code (if any is necessary) to perform the function, regardless of whether performance of the function is disabled or not enabled (e.g., by a user-configurable setting, factory trim, etc.).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Arc Welding Control (AREA)
EP16705609.2A 2015-03-17 2016-01-19 Armbandbasierte systeme und verfahren zur steuerung von schweissausrüstung unter verwendung von gesten und ähnlichen bewegungen Withdrawn EP3271104A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/659,853 US10987762B2 (en) 2014-09-30 2015-03-17 Armband based systems and methods for controlling welding equipment using gestures and like motions
PCT/US2016/013867 WO2016148772A1 (en) 2015-03-17 2016-01-19 Armband based systems and methods for controlling welding equipment using gestures and like motions

Publications (1)

Publication Number Publication Date
EP3271104A1 true EP3271104A1 (de) 2018-01-24

Family

ID=55404786

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16705609.2A Withdrawn EP3271104A1 (de) 2015-03-17 2016-01-19 Armbandbasierte systeme und verfahren zur steuerung von schweissausrüstung unter verwendung von gesten und ähnlichen bewegungen

Country Status (3)

Country Link
EP (1) EP3271104A1 (de)
CN (1) CN107635710B (de)
WO (1) WO2016148772A1 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018073422A2 (de) * 2016-10-21 2018-04-26 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Innenraum-personenortung-basierte fertigungssteuerung in der metallverarbeitenden industrie
JP2019190733A (ja) * 2018-04-25 2019-10-31 日本電産サンキョー株式会社 製氷機および製氷機の制御方法
CN110413135A (zh) * 2018-04-27 2019-11-05 开利公司 姿势进入控制系统和操作方法
GB201812080D0 (en) * 2018-07-24 2018-09-05 Kano Computing Ltd Motion sensing controller
CN111975171A (zh) * 2019-05-22 2020-11-24 伊利诺斯工具制品有限公司 具有未知停机时间禁用的焊接监视系统
CN113211390B (zh) * 2020-06-03 2022-05-27 德丰电创科技股份有限公司 一种用于控制电动设备运行的系统

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR950003258B1 (ko) * 1990-03-31 1995-04-07 기아자동차 주식회사 스포트용접용 로보트의 용착검출장치
JP3852635B2 (ja) * 1997-08-08 2006-12-06 株式会社安川電機 アーク溶接モニタ装置
JP3810417B2 (ja) * 2004-06-16 2006-08-16 菅機械産業株式会社 制御装置
US9352411B2 (en) * 2008-05-28 2016-05-31 Illinois Tool Works Inc. Welding training system
AT508094B1 (de) * 2009-03-31 2015-05-15 Fronius Int Gmbh Verfahren und vorrichtung zur bedienung einer mit einem handbetätigten arbeitsgerät verbundenen stromquelle
US9993891B2 (en) * 2010-07-14 2018-06-12 Illinois Tool Works Inc. Welding parameter control via welder motion or position monitoring

Also Published As

Publication number Publication date
WO2016148772A1 (en) 2016-09-22
CN107635710B (zh) 2020-05-19
CN107635710A (zh) 2018-01-26

Similar Documents

Publication Publication Date Title
US10987762B2 (en) Armband based systems and methods for controlling welding equipment using gestures and like motions
CN107635710B (zh) 用于使用姿势和类似动作来控制焊接设备的基于臂带的系统及方法
US10201868B2 (en) Systems and methods for gesture control of a welding system
US20210346975A1 (en) Systems and methods for a personally allocated interface for use in a welding system
JP6404551B2 (ja) 触覚教示ペンダント
EP2917001B1 (de) Hybrides system mit haptischer steuerung oder gestensteuerung
US20220161349A1 (en) Remote Power Supply Parameter Adjustment
JPWO2017033367A1 (ja) 遠隔操作ロボットシステム
JP6073893B2 (ja) 工業用ロボットの運動および/またはプロシージャをプログラミングまたは設定する方法、制御システムおよび運動設定手段
US20160318114A1 (en) Wearable technology for interfacing with welding equipment and monitoring equipment using wireless technologies
EP4044155A1 (de) Schweissnahtverfolgungssysteme
CN114939706A (zh) 基于面罩的焊接跟踪系统
CA3187857A1 (en) Calibration procedures for helmet-based weld tracking systems
JP2017052031A (ja) ロボット操作装置、ロボット操作方法
US20160228971A1 (en) Wearable technology for interfacing with welding equipment and monitoring equipment using wireless technologies
US11817006B2 (en) Weld modules for weld training systems
EP4141592A1 (de) Steuerung von industriemaschinen mittels verfolgung von bewegungen ihrer bediener
CA3030385A1 (en) Wearable technology for interfacing with welding equipment and monitoring equipment using wireless technologies

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20170825

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200330

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20220223