US20230080145A1 - Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems - Google Patents
- Publication number
- US20230080145A1 (application US 17/991,939)
- Authority
- US
- United States
- Prior art keywords
- simulated
- visual effect
- welding
- rendering
- helmet
- Prior art date
- Legal status
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/24—Use of tools
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0118—Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
Definitions
- the present disclosure generally relates to augmented reality welding systems and, more particularly, to adjustable visual effects that simulate auto darkening lenses in augmented reality welding systems.
- Conventional arc welding systems generate electrical arcs that are bright enough to blind if viewed by the naked eye.
- Conventional welding helmets therefore provide shaded (and/or tinted) lenses to diminish the brightness.
- Some welding helmets have an auto-darkening lens that provides substantial shading (and/or darkening, tinting, etc.) only when exposed to a threshold level of light (e.g., from a bright electrical arc), while providing a largely unshaded viewing lens when exposed to lower light levels (e.g., from a light bulb).
- Some augmented reality welding systems present welding simulations (e.g., for training) via a display screen. However, as there is no actual welding arc, and thus no associated bright visible light, simulating the shading and/or auto-darkening behavior of welding helmets is more complicated.
- the present disclosure is directed to adjustable visual effects that simulate auto darkening lenses in augmented reality welding systems, for example, substantially as illustrated by and/or described in connection with at least one of the figures, and as set forth more completely in the claims.
- FIG. 1 is a diagram illustrating components of an example augmented welding system, in accordance with aspects of this disclosure.
- FIG. 2 is a block diagram further illustrating the components of the example augmented welding system of FIG. 1 , in accordance with aspects of this disclosure.
- FIG. 3 is a flow diagram illustrating an example primary control process that may be used with the example augmented welding system of FIGS. 1 and 2 , in accordance with aspects of this disclosure.
- FIG. 4 is a flow diagram illustrating an example visual effect determination process that may be used with the example primary control process of FIG. 3 , in accordance with aspects of this disclosure.
- FIG. 5 is a graph illustrating an example spectral transmissivity curve, in accordance with aspects of this disclosure.
- FIGS. 6 a and 6 b are diagrams illustrating example applications of a visual effect to a simulated rendering, in accordance with aspects of this disclosure.
- the examples described herein are not limited to only the recited values, ranges of values, positions, orientations, and/or actions but rather should include reasonably workable deviations.
- “and/or” means any one or more of the items in the list joined by “and/or”.
- “x and/or y” means any element of the three-element set {(x), (y), (x, y)}. In other words, “x and/or y” means “one or both of x and y”.
- “x, y, and/or z” means any element of the seven-element set {(x), (y), (z), (x, y), (x, z), (y, z), (x, y, z)}. In other words, “x, y and/or z” means “one or more of x, y and z”.
- the term “attach” means to affix, couple, connect, join, fasten, link, and/or otherwise secure.
- the term “connect” means to attach, affix, couple, join, fasten, link, and/or otherwise secure.
- “circuits” and “circuitry” refer to physical electronic components (i.e., hardware) and any software and/or firmware (“code”) which may configure the hardware, be executed by the hardware, and/or otherwise be associated with the hardware.
- a particular processor and memory may comprise a first “circuit” when executing a first one or more lines of code and may comprise a second “circuit” when executing a second one or more lines of code.
- circuitry is “operable” and/or “configured” to perform a function whenever the circuitry comprises the necessary hardware and/or code (if any is necessary) to perform the function, regardless of whether performance of the function is disabled or enabled (e.g., by a user-configurable setting, factory trim, etc.).
- a control circuit may include digital and/or analog circuitry, discrete and/or integrated circuitry, microprocessors, DSPs, etc., software, hardware and/or firmware, located on one or more boards, that form part or all of a controller, and/or are used to control a welding process, and/or a device such as a power source or wire feeder.
- processor means processing devices, apparatus, programs, circuits, components, systems, and subsystems, whether implemented in hardware, tangibly embodied software, or both, and whether or not it is programmable.
- processor includes, but is not limited to, one or more computing devices, hardwired circuits, signal-modifying devices and systems, devices and machines for controlling systems, central processing units, programmable devices and systems, field-programmable gate arrays, application-specific integrated circuits, systems on a chip, systems comprising discrete elements and/or circuits, state machines, virtual machines, data processors, processing facilities, and combinations of any of the foregoing.
- the processor may be, for example, any type of general purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, or an application-specific integrated circuit (ASIC).
- the processor may be coupled to, and/or integrated with a memory device.
- the term “memory” and/or “memory device” means computer hardware or circuitry to store information for use by a processor and/or other digital device.
- the memory and/or memory device can be any suitable type of computer memory or any other type of electronic storage medium, such as, for example, read-only memory (ROM), random access memory (RAM), cache memory, compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically-erasable programmable read-only memory (EEPROM), a computer-readable medium, or the like.
- controlling “power” may involve controlling voltage, current, energy, and/or enthalpy, and/or controlling based on “power” may involve controlling based on voltage, current, energy, and/or enthalpy.
- welding-type power refers to power suitable for welding, cladding, brazing, plasma cutting, induction heating, carbon arc cutting or gouging, hot wire welding/preheating (including laser welding and laser cladding), and/or resistive preheating.
- a welding-type power supply and/or power source refers to any device capable of, when power is applied thereto, supplying welding, cladding, brazing, plasma cutting, induction heating, laser (including laser welding, laser hybrid, and laser cladding), carbon arc cutting or gouging, and/or resistive preheating, including but not limited to transformer-rectifiers, inverters, converters, resonant power supplies, quasi-resonant power supplies, switch-mode power supplies, etc., as well as control circuitry and other ancillary circuitry associated therewith.
- a welding training system comprising a display screen configured to display a simulated rendering and control circuitry configured to determine a visual effect based on one or more settings and a simulated arc state, and apply the visual effect to at least a portion of the simulated rendering.
- the visual effect comprises a filter.
- the filter reduces a brightness of at least a portion of the simulated rendering.
- the filter is uniform or based on a spectral transmissivity curve.
- the portion comprises at least one of a background, a foreground, an entirety, a weld area, a welding arc, or a weld pool of the simulated rendering.
- the simulated arc state comprises a visible simulated arc or an absent simulated arc.
- the one or more settings simulate settings of an auto-darkening welding helmet.
- the one or more settings comprise one or more of a shade setting, a sensitivity setting, a helmet model setting, or a delay setting.
- the sensitivity setting sets an arc brightness threshold above which the visual effect applies a filter to the portion of the simulated rendering. In some examples, the shade setting sets a filter level of the visual effect after an arc brightness threshold is reached. In some examples, the delay setting comprises a time delay between a change of the simulated arc state and a change of the visual effect. In some examples, the helmet model setting comprises a type of welding helmet, wherein the visual effect comprises a filter, and wherein the filter is based on a spectral transmissivity curve of the type of welding helmet. In some examples, the control circuitry is further configured to determine the visual effect based on one or more weld settings.
- the one or more weld settings comprise one or more of a voltage, a current, a gas type, a wire feed speed, a workpiece material type, or a filler type.
- the control circuitry is further configured to determine the visual effect based on a simulation difficulty setting.
- the system further comprises a simulated welding helmet having the display screen, wherein the simulated welding helmet comprises a camera configured to capture images of a surrounding environment.
- the control circuitry is further configured to receive the images from the camera, detect one or more weld settings of the welding training system, and generate the simulated rendering based on the images and the one or more weld settings.
- Some examples of the present disclosure relate to a method, comprising determining, via control circuitry, a visual effect based on one or more settings and a simulated arc state, applying, via the control circuitry, the visual effect to a simulated rendering, and displaying the simulated rendering on a display screen.
- applying the visual effect comprises filtering a brightness of at least a portion of the simulated rendering, wherein the portion comprises a background, a foreground, an entirety, a weld area, a welding arc, or a weld pool of the simulated rendering.
- the one or more settings comprise one or more of a shade setting, a sensitivity setting, a helmet model setting, or a delay setting.
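The determine/apply/display method summarized above can be sketched in a few lines. This is an illustrative sketch only: the function names, the 0..1 brightness scale, and the shade-to-transmission mapping are assumptions, not taken from the disclosure.

```python
# Sketch of the claimed method: determine a visual effect from simulated
# auto-darkening settings and the arc state, then apply it to a rendering.
# All names and numeric mappings here are invented for illustration.

def determine_visual_effect(shade: int, sensitivity: float,
                            arc_visible: bool, arc_brightness: float) -> float:
    """Return a brightness multiplier (1.0 = unfiltered).

    The sensitivity setting acts as the arc-brightness threshold; the shade
    setting picks the filter level once that threshold is reached.
    """
    if arc_visible and arc_brightness >= sensitivity:
        # Map a helmet-style shade number (roughly 1..13) to a transmission factor.
        return max(0.0, 1.0 - shade / 13.0)
    return 1.0

def apply_visual_effect(pixels, factor: float):
    """Uniform filter: dim every RGB pixel equally. A helmet-model setting
    could instead scale each channel by a spectral transmissivity curve."""
    return [tuple(int(c * factor) for c in px) for px in pixels]

rendering = [(200, 180, 160), (255, 255, 255)]   # stand-in for a simulated frame
factor = determine_visual_effect(shade=10, sensitivity=0.5,
                                 arc_visible=True, arc_brightness=0.9)
dimmed = apply_visual_effect(rendering, factor)
```

A uniform multiplier corresponds to the "uniform" filter variant; a per-channel (or per-wavelength) table would correspond to the spectral-transmissivity-curve variant.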
- an augmented reality welding system has a display screen configured to display a simulated rendering, such as a display screen of a welding helmet worn by the user.
- the simulated rendering may be based on recorded images of a surrounding environment, such as recorded by one or more cameras configured to record images.
- the recorded images are processed and/or augmented by an augmented reality computing system to create the simulated rendering.
- the simulated rendering includes more or fewer images and/or details (e.g., relating to an arc, a workpiece, a welding torch, etc.) than the images recorded by the cameras.
- the simulated rendering is presented to a user via the display screen.
- the computing system is further configured to add a visual effect to the simulated rendering when the simulated rendering includes images related to an augmented and/or simulated arc.
- the visual effect is a filtering effect that filters some or all of the bright light associated with the augmented and/or simulated arc.
- the visual effect is designed to enhance the reality of the augmented reality welding system by emulating a filtering, shading, tinting, and/or darkening effect of actual welding helmets when an actual arc is present.
- the augmented reality system further includes one or more settings that impact the visual effect. Some of the settings may be similar to settings that sometimes appear on actual welding helmets, such as, for example, a shade setting, a sensitivity setting, and/or a delay setting. In some examples, additional settings specific to the augmented reality welding system are provided, such as, for example, a welding helmet model setting and/or a difficulty or realism setting. These settings may allow a user to adjust and/or customize the augmented reality welding experience. In some examples, the computing system may apply the visual effect to the simulated rendering based on the one or more settings.
- FIG. 1 shows an example of an augmented reality welding system 100 . While the present disclosure sometimes refers to just augmented reality for simplicity, it should be understood that features of the augmented reality welding system 100 may also be implemented in a mixed reality welding system and/or a virtual reality welding system. In some examples, the augmented reality welding system 100 may be used for weld training.
- the augmented reality welding system 100 includes a simulated welding helmet 102 , a computing system 200 , a display screen 104 in communication with the computing system 200 , and a user interface 106 in communication with the computing system 200 .
- the simulated welding helmet 102 includes an outer shell 108 , a headband 110 , a faceplate 112 , and a helmet interface 113 .
- the headband 110 is configured to secure the simulated welding helmet 102 to the head of a user, while the outer shell 108 is configured to retain the faceplate 112 and protect the head of the user.
- the simulated welding helmet 102 may appear substantially differently.
- the simulated welding helmet 102 may simply simulate a conventional welding helmet, without including features such as the outer shell 108 , headband 110 , faceplate 112 , and/or helmet interface 113 .
- an electronic display screen 104 is secured within the simulated welding helmet 102 (such as within the outer shell 108 , for example), such that the display screen 104 is viewable by a user (e.g., a trainee and/or operator) when the simulated welding helmet 102 is worn on the head of the user.
- the electronic display screen 104 may be removably secured within the simulated welding helmet 102 (such as within the outer shell 108 , for example).
- the electronic display screen 104 may be part of a separate component (e.g., specialized glasses) that is secured to the outer shell 108 .
- the electronic display screen 104 may be part of the faceplate 112 , and/or vice versa.
- the electronic display screen 104 may be entirely separate from the simulated welding helmet 102 .
- the simulated welding helmet 102 further includes one or more cameras 114 affixed to the outer shell 108 .
- the cameras 114 may be digital video cameras. As shown, there are three cameras 114 affixed to the outer shell 108 : one camera 114 a on top of the simulated welding helmet 102 , one camera 114 b on the left side, and one camera 114 c on the right side.
- the cameras 114 are arranged in a triangle configuration so as to increase the accuracy of depth and/or three dimensional spatial calculations, such as by the computing system 200 , for example. In some examples, there may be fewer or more cameras 114 . In some examples, only two cameras 114 may be needed to facilitate depth and/or spatial calculations.
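Why two cameras can already suffice for depth: with rectified views, a feature's pixel disparity between the two cameras relates to its depth by similar triangles (Z = f·B/d). The sketch below assumes invented focal length, baseline, and disparity values; a third camera mainly adds robustness and coverage.

```python
# Pinhole-stereo depth relation, for illustration only. The values used in
# the example call are assumptions, not parameters from the disclosure.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth Z = focal length (pixels) * baseline (meters) / disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A marker seen 20 px apart by cameras 0.1 m apart (800 px focal length):
z = depth_from_disparity(800.0, 0.1, 20.0)   # 4.0 m away
```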
- the cameras 114 are directed forward, in the same direction a user wearing the simulated welding helmet 102 would be looking.
- the cameras 114 may be movably mounted, and/or have movable lenses configured to redirect a focus (and/or adjust certain attributes) of the cameras 114 , in response to one or more command signals (e.g., received from computing system 200 and/or camera controller(s) 124 ).
- the cameras 114 may be otherwise positioned on the simulated welding helmet 102 and/or disconnected from the simulated welding helmet 102 entirely.
- the cameras 114 are directed towards a workpiece 116 and welding torch 118 within a welding area 120 (and/or welding cell).
- the welding torch 118 may be a real, functional, welding torch.
- the welding torch 118 may instead be a mock welding torch.
- the welding torch 118 may be a gun or torch configured for gas metal arc welding (GMAW) or an electrode holder (i.e., stinger) configured for shielded metal arc welding (SMAW).
- the welding torch 118 is in communication with the computing system 200 .
- the welding torch 118 may additionally, or alternatively, be in communication with some other system (e.g., a mock or actual welding power supply and/or mock or actual welding wire feeder) that may (or may not) be in communication with the computing system 200 .
- the workpiece 116 and welding torch 118 within the welding area 120 include markers 122 configured to be captured by the cameras 114 and/or interpreted by the computing system 200 .
- in some examples, other welding components and/or items (e.g., clamps, electrode holders, wire feeders, power supplies, electrodes, other workpieces, tips, nozzles, etc.) may also include markers 122 .
- the cameras 114 are configured to record images of the welding area 120 and encode data representative of the images into one or more image signals.
- a camera controller 124 is configured to collect image signals from the cameras 114 .
- the image signals may be representative of images recorded by the cameras 114 .
- the camera controller 124 is configured to send the image signals, or data representative of the image signals and/or the recorded images, to the computing system 200 .
- the computing system 200 is configured to process the images, augment the images to create a simulated rendering 126 , and send the simulated rendering 126 to the display screen 104 for display to the user.
- each camera 114 has its own camera controller 124 .
- the camera controller 124 is part of the simulated welding helmet 102 .
- the camera controller 124 is separate from the simulated welding helmet 102 .
- the computing system 200 is integrated into the simulated welding helmet 102 .
- the cameras 114 , camera controller 124 , display screen 104 , helmet interface 113 , user interface 106 , welding torch 118 , and/or computing system 200 may communicate via one or more wired mediums and/or protocols (e.g., Ethernet cable(s), universal serial bus cable(s), other signal and/or communication cable(s)) and/or wireless mediums and/or protocols (e.g., near field communication (NFC), ultra high frequency radio waves (commonly known as Bluetooth), IEEE 802.11x, Zigbee, HART, LTE, Z-Wave, WirelessHD, WiGig, etc.).
- the computing system 200 is configured to create the simulated rendering 126 based on the images captured by the cameras 114 in conjunction with one or more user adjustable settings and/or inputs.
- inputs may be received from certain welding components (e.g., trigger 119 of the welding torch 118 ) as well as from the helmet interface 113 and/or the user interface 106 .
- the helmet interface 113 may be considered part of the user interface 106 .
- the augmented reality welding system 100 is configured to apply an additional visual effect to the simulated rendering 126 .
- the visual effect is impacted by several user adjustable settings. Some of the settings emulate auto-darkening settings found on conventional auto-darkening welding helmets (e.g., shade, sensitivity, and/or delay). Other settings are unique to the augmented reality welding system 100 , such as, for example, a helmet model/type setting, a difficulty setting, a realism setting, and/or a visual effect area setting.
- the visual effect may further be impacted by one or more weld settings (e.g., a voltage, a current, a gas type, a wire feed speed, a workpiece material type, and/or a filler type).
- the settings are discussed further below with respect to FIG. 2 .
- the settings may be adjusted by a user via the user interface 106 and/or the helmet interface 113 .
- the helmet interface 113 includes one or more adjustable inputs (e.g., knobs, buttons, switches, keys, etc.) and/or outputs (e.g., lights, speakers, etc.).
- the helmet interface 113 is in communication with the computing system 200 .
- the helmet interface 113 may further comprise communication circuitry (not shown) configured for communication with computing system 200 .
- the helmet interface 113 may be part of the user interface 106 .
- the user interface 106 is in communication with the computing system 200 .
- the user interface 106 comprises a touch screen interface, such as a tablet, touch screen computer, smartphone or other touch screen device.
- the user interface 106 may instead comprise more traditional input devices (e.g., mouse, keyboard, buttons, knobs, etc.) and/or output devices (e.g., display screen, speakers, etc.).
- the user interface 106 may further include one or more receptacles configured for connection to (and/or reception of) one or more external memory devices (e.g., floppy disks, compact discs, digital video discs, flash drives, etc.).
- the computing system 200 also uses camera-captured images of markers 122 on the welding torch 118 and/or workpiece 116 (and/or other components) to create the simulated rendering 126 .
- the computing system 200 may be configured to recognize the markers 122 on the workpiece 116 and/or welding torch 118 , and create a simulated rendering based (at least in part) on the markers 122 .
- the markers 122 may assist the computing system 200 in tracking and/or recognition of the welding torch 118 , workpiece 116 , and/or other objects, as well as their respective shapes, sizes, spatial relationships, etc.
- the computing system 200 may combine recognition of markers 122 with user input to create the simulated rendering 126 .
- the computing system 200 may recognize markers 122 on the welding torch 118 near markers 122 on the workpiece 116 and, after recognizing that the user is pressing a trigger 119 of the welding torch 118 , create a simulated rendering showing an arc between the welding torch 118 and the workpiece 116 , and/or a weld pool proximate the arc endpoint on the workpiece 116 .
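The marker-plus-trigger decision just described can be sketched as a simple proximity-and-input check. The positions, distance threshold, and function names below are invented for illustration; real marker tracking would come from the camera pipeline.

```python
# Illustrative sketch: render a simulated arc only when the torch-tip marker
# is close enough to the workpiece marker AND the trigger is pressed.
import math

def should_render_arc(torch_tip_m, workpiece_pt_m,
                      trigger_pressed: bool, max_gap_m: float = 0.05) -> bool:
    """True when the torch is within the proximity threshold and the trigger is held."""
    gap = math.dist(torch_tip_m, workpiece_pt_m)   # Euclidean distance (3D points)
    return trigger_pressed and gap <= max_gap_m

# Torch tip 2 cm above the workpiece with the trigger held -> an arc (and a
# weld pool at the arc endpoint) would be added to the simulated rendering.
arc_visible = should_render_arc((0.0, 0.0, 0.02), (0.0, 0.0, 0.0), True)
```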
- the computing system 200 is configured to omit the markers 122 from the simulated rendering 126 .
- FIG. 2 is a block diagram of the augmented reality welding system 100 of FIG. 1 .
- the computing system 200 is in communication with the user interface 106 (and/or helmet interface 113 ), the display screen 104 , one or more welding components (e.g., welding torch 118 , welding wire feeder, welding power supply, etc.), and the cameras 114 (e.g., through the camera controller(s) 124 ).
- the cameras 114 may be in direct communication with the computing system 200 without going through the camera controller(s) 124 .
- the computing system 200 includes communication circuitry 202 configured to facilitate communication between the computing system and the user interface 106 (and/or helmet interface 113 ), the display screen 104 , one or more welding components (e.g., welding torch 118 ), and the cameras 114 (e.g., through the camera controller(s) 124 ).
- communication circuitry 202 configured to facilitate communication between the computing system and the user interface 106 (and/or helmet interface 113 ), the display screen 104 , one or more welding components (e.g., welding torch 118 ), and the cameras 114 (e.g., through the camera controller(s) 124 ).
- the computing system 200 also includes memory 206 and one or more processors 204 .
- the memory 206 , processor(s) 204 , and communication circuitry 202 are in electrical communication with each other, such as through a common data bus.
- the one or more processors 204 are configured to execute instructions stored in the memory 206 .
- the memory 206 stores executable instructions that, when executed by the processor(s) 204 , facilitate operation of the augmented reality welding system 100 .
- the memory 206 stores instructions relating to at least two processes of the augmented reality welding system 100 : a primary control process 300 and a visual effect determination process 400 .
- the memory 206 also stores data that may be used by the primary control process 300 and/or visual effect determination process 400 .
- the memory 206 stores primary settings 301 (e.g., such as may be relevant to the primary control process 300 ) and visual effect settings 401 (e.g., such as may be relevant to the visual effect determination process 400 ).
- the primary settings 301 may include such settings as a (training) difficulty setting (e.g., easy, normal, hard, etc.), a realism setting (e.g., low, medium, high, etc.), and/or various weld settings (e.g., voltage, current, gas type, wire feed speed, workpiece material type, filler type, etc.).
- the visual effect settings 401 may include such settings as a shade setting (e.g., low, medium, high or 1, 2, 3, 4, 5, etc.), a sensitivity setting (e.g., low, medium, high or 1, 2, 3, 4, 5, etc.), a delay setting (e.g., low, medium, high or 1, 2, 3, 4, 5, etc.), a helmet model/type setting, and/or a visual effect area setting (e.g., localized, expanded, entire, background, foreground, weld area, welding arc, weld pool, etc.).
- the primary settings 301 , visual effect settings 401 , and/or other data stored in memory may be used by the primary control process 300 and/or visual effect determination process 400 .
- the primary control process 300 , visual effect determination process 400 , primary settings 301 , visual effect settings 401 , and/or other data used by the augmented reality welding system 100 may be retrieved from an external memory device (e.g., flash drive, cloud storage, etc.) instead of, or in addition to, being stored in memory 206 .
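The two setting groups described above might be organized as shown in this minimal sketch. Field names mirror the text, but every default value is an invented placeholder.

```python
# Hypothetical layout of the primary settings and visual effect settings
# described in the text; all defaults are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class PrimarySettings:                 # used by the primary control process
    difficulty: str = "normal"         # easy / normal / hard
    realism: str = "medium"            # low / medium / high
    weld: dict = field(default_factory=lambda: {
        "voltage_v": 22.0, "current_a": 180.0, "gas_type": "argon",
        "wire_feed_ipm": 300, "material": "mild steel", "filler": "ER70S-6",
    })

@dataclass
class VisualEffectSettings:            # used by the visual effect process
    shade: int = 10                    # e.g., 1..13
    sensitivity: str = "medium"        # low / medium / high
    delay: str = "medium"              # low / medium / high
    helmet_model: str = "generic"
    effect_area: str = "entire"        # localized / expanded / entire / ...

primary, effect = PrimarySettings(), VisualEffectSettings()
```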
- an external memory device e.g., flash drive, cloud storage, etc.
- FIG. 3 is a flowchart illustrating an example primary control process 300 of the augmented reality welding system 100 .
- the primary control process 300 may be implemented in machine readable instructions stored in memory 206 and/or executed by the one or more processors 204 .
- some or all of the primary control process 300 may be implemented in analog and/or discrete circuitry.
- the primary control process 300 is configured to control the augmented reality welding system 100 , such as by processing the images captured by cameras 114 along with the various inputs and/or settings to generate the simulated rendering 126 displayed to the user via the display screen 104 .
- the primary control process 300 begins at block 302 , where the computing system 200 receives one or more images (and/or image signals) from the cameras 114 (and/or camera controller(s) 124 ).
- the primary control process 300 processes the images along with inputs of the augmented reality welding system 100 and generates the simulated rendering 126 .
- processing the images may comprise parsing the images to determine and/or recognize objects in the image, as well as properties of the objects. Markers 122 on certain objects (e.g., welding torch 118 and/or workpiece 116 ) may assist in this processing.
- the primary control process 300 may recognize the workpiece 116 from the markers 122 (and/or other distinguishing characteristics) and render the workpiece 116 as a metallic workpiece (or some other workpiece type, depending on weld settings, etc.). In some examples, the primary control process 300 may further render the workpiece 116 as having completed welds in some places and uncompleted welds in other places, according to inputs and/or settings of the augmented reality welding system 100 .
- the primary control process 300 may recognize the welding torch 118 and workpiece in proximity to one another, detect a signal from the welding torch 118 indicating that the trigger 119 is being pressed, and in response render an arc extending from a torch tip of the welding torch 118 to the workpiece 116 , along with an associated weld pool on the workpiece 116 at the end of the arc.
- the primary control process 300 determines whether the simulated rendering 126 includes a visible simulated arc at block 306 . In some examples, this determination is based, at least in part, on inputs and/or settings (e.g., weld settings) of the augmented reality welding system 100 . If the primary control process 300 determines that there is a visible simulated arc in the simulated rendering 126 , then the primary control process 300 proceeds to execute the visual effect determination process 400 at block 400 (discussed further below in reference to FIG. 4 ). After block 400 , the primary control process 300 proceeds to block 308 where the visual effect is applied to the simulated rendering 126 , and then to block 310 where the simulated rendering 126 is sent to the display screen 104 and displayed to the user.
- the primary control process 300 determines that there is no visible simulated arc (or an absent simulated arc) in the simulated rendering 126 , the primary control process 300 proceeds from block 306 to block 312 .
- the primary control process 300 reads the visual effect settings 401 and determines whether the set time delay is greater than the time since a simulated arc was last present in the simulated rendering 126 .
- the determination at block 312 emulates some real life auto-darkening welding helmets with delay settings that provide the option of continuing to apply the shading (and/or tinting, darkening, etc.) effect to the welding helmet lens for some time after the welding arc (or other sufficiently bright light) has subsided.
- if the set time delay is not greater than the elapsed time, the primary control process 300 proceeds to block 310 , where the simulated rendering 126 is sent to the display screen 104 and displayed to the user, without any visual effect applied. If the set time delay is greater than the time elapsed since a simulated arc was last present in the simulated rendering 126 , then the primary control process 300 proceeds to block 308 , where the most recent visual effect is again applied to the simulated rendering 126 before the simulated rendering 126 is sent to the display screen 104 and displayed to the user at block 310 .
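- The delay decision above (blocks 306, 312, 308, and 310) might be sketched as follows; the function and parameter names are illustrative assumptions.

```python
def shading_active(arc_visible, seconds_since_last_arc, delay_setting_s):
    """Emulate the hold-off of an auto-darkening lens: the shading effect
    stays applied while a simulated arc is visible, and for
    delay_setting_s seconds after the last simulated arc disappeared."""
    if arc_visible:
        return True
    # No visible arc: keep the most recent effect only within the delay window.
    return delay_setting_s > seconds_since_last_arc
```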
- FIG. 4 is a flowchart illustrating the visual effect determination process 400 of the augmented reality welding system 100 .
- some or all of the visual effect determination process 400 may be implemented in machine readable instructions stored in memory 206 and/or executed by the one or more processors 204 .
- some or all of the visual effect determination process 400 may instead be implemented in analog and/or discrete circuitry.
- the visual effect determination process 400 is configured to determine a visual effect to apply to the simulated rendering 126 to simulate the shading (and/or tinting, darkening, etc.) effect of some real life welding helmet lenses.
- the visual effect determination process 400 begins at block 402 , where properties of the simulated arc are determined.
- determining properties of the simulated arc may comprise determining (and/or estimating) an amount and/or brightness of visible light radiation that an actual welding arc would produce given the weld settings of the augmented reality welding system 100 .
- the determined brightness of an arc with low current and/or voltage weld settings may be less than the determined brightness of an arc with high current and/or voltage weld settings.
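- One way to estimate the arc brightness from the weld settings is sketched below; the linear power-based model and the scaling constant are assumptions for illustration, not the method of this disclosure.

```python
def estimated_arc_brightness(current_amps, voltage_volts, k=1.0):
    """Proxy for the visible-light output an actual welding arc would
    produce: assume brightness scales with arc power (current x voltage),
    scaled by an arbitrary constant k. The linear model is an assumption."""
    return k * current_amps * voltage_volts
```

Under this model, an arc with low current and/or voltage settings yields a lower determined brightness than one with high settings, as the passage above describes.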
- the primary control process 300 may determine properties of the simulated arc at block 304 of the primary control process 300 , and the visual effect determination process 400 may use the properties at block 402 .
- the primary control process 300 may save the arc properties to memory 206 at block 304 , and the visual effect determination process 400 may read and/or load the arc properties from memory 206 at block 402 .
- the visual effect determination process 400 determines whether the determined brightness of the simulated arc is greater than a sensitivity threshold.
- the sensitivity threshold is based on the sensitivity setting of the visual effect settings 401 .
- the brightness of other elements (e.g., the weld pool) of the simulated rendering (and/or the entire simulated rendering) is evaluated against the sensitivity threshold.
- the visual effect determination process 400 proceeds to block 406 if the brightness is less than the sensitivity threshold.
- the visual effect determination process 400 determines that no visual effect is to be applied. If, however, the brightness is greater than the set sensitivity threshold, the visual effect determination process 400 proceeds to block 408 .
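- The comparison at block 404 might be sketched as follows; the normalized 0..1 sensitivity scale and the maximum threshold value are illustrative assumptions.

```python
def effect_triggered(arc_brightness, sensitivity, max_threshold=10000.0):
    """Map a normalized 0..1 sensitivity setting to a brightness
    threshold: higher sensitivity -> lower threshold, so the simulated
    lens darkens more readily. Returns True when the determined arc
    brightness exceeds the sensitivity threshold."""
    threshold = (1.0 - sensitivity) * max_threshold
    return arc_brightness > threshold
```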
- the visual effect determination process 400 determines one or more spectral transmissivity properties corresponding to the helmet model/type setting at block 408 .
- Spectral transmissivity properties impact certain aspects of the visual effect, such as how much light of a given wavelength is allowed to pass.
- Actual welding helmets have different lenses with different filters, each with its own spectral transmissivity properties.
- some actual welding helmets have lenses that slightly color (e.g., with a slight green tint, a slight yellowish color, etc.) the appearance of everything viewed through the lens because of the spectral transmissivity properties of the lens’ filter.
- one or more of the spectral transmissivity properties may be dependent upon the shade setting, such that one or more of the spectral transmissivity properties may vary with respect to different shade settings.
- FIG. 5 is a graph 500 showing example spectral transmissivity properties of an example filter via a transmissivity curve 502 .
- the Y axis of the graph 500 represents transmissivity percentage, or a percentage of light that such an example filter would allow through, while the X axis represents wavelength of light (in nanometers).
- the spectral transmissivity curve 502 is a mathematical representation of the properties of the example filter, showing which wavelengths of light are filtered only slightly, and which wavelengths of light are filtered entirely (or substantially). As shown, the spectral transmissivity curve 502 indicates that almost all light having a wavelength less than 450 nanometers, or greater than 750 nanometers, is filtered.
- the spectral transmissivity curve 502 indicates that most of the light allowed through the filter will be in the 500 to 700 nanometer range, with up to approximately 38% of light having wavelengths around approximately 580 nanometers allowed through the filter.
- this example filter may lend a slightly yellowish color to the appearance of everything viewed through the filter, because of the spectral transmissivity properties.
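- A curve loosely shaped like the one in FIG. 5 might be represented as a piecewise-linear table and sampled by interpolation, as sketched below; the control points approximate the described shape (near-zero below 450 nm and above 750 nm, peaking around 38% near 580 nm) and are illustrative, not data from the disclosure.

```python
# Illustrative control points (wavelength_nm, transmissivity fraction).
CURVE_POINTS = [(450, 0.0), (500, 0.20), (580, 0.38), (700, 0.20), (750, 0.0)]

def transmissivity(wavelength_nm, points=CURVE_POINTS):
    """Return the fraction of light passed at wavelength_nm by linear
    interpolation between control points; zero outside the curve's support."""
    if wavelength_nm <= points[0][0] or wavelength_nm >= points[-1][0]:
        return 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= wavelength_nm <= x1:
            t = (wavelength_nm - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return 0.0
```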
- the spectral transmissivity properties are determined at block 408 based on the helmet model setting of the visual effect settings 401 , which identifies a particular type and/or model of real life welding helmet filter for the visual effect to emulate.
- the determination may comprise utilizing a data structure that associates different helmet models with different spectral transmissivity properties.
- the spectral transmissivity properties may be disabled or determined to be uniform, corresponding to a flat filter where the same amount of light is allowed to pass through regardless of wavelength. For example, spectral transmissivity may be disabled or determined to be uniform for lower difficulty and/or realism settings.
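- The lookup described above, including the uniform/disabled case, might be sketched with a simple mapping; the helmet model names and property values below are invented for illustration.

```python
# Hypothetical data structure associating helmet model settings with
# spectral transmissivity properties; names and values are invented.
HELMET_FILTERS = {
    "model_a": {"peak_nm": 580, "peak_transmissivity": 0.38},  # yellowish tint
    "model_b": {"peak_nm": 530, "peak_transmissivity": 0.30},  # greenish tint
}

def spectral_properties(helmet_model, realism_enabled=True):
    """Return filter properties for a helmet model setting, or a flat
    (uniform) filter, passing the same amount of light at every
    wavelength, when spectral effects are disabled or the model is unknown."""
    if not realism_enabled or helmet_model not in HELMET_FILTERS:
        return {"uniform": True}
    return HELMET_FILTERS[helmet_model]
```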
- the visual effect determination process 400 proceeds to block 410 after block 408 .
- the visual effect determination process 400 determines the visual effect to apply to the simulated rendering 126 based on the previously determined spectral transmissivity properties and the shade setting of the visual effect settings 401 .
- the shade setting impacts the degree to which brightness is filtered (and/or shaded, tinted, darkened, etc.) by the visual effect.
- the higher the shade setting, the more brightness is filtered (and/or shaded, tinted, darkened, etc.) by the visual effect.
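- The shade-to-attenuation mapping might be sketched as follows. Real auto-darkening lenses attenuate roughly logarithmically per shade number; the linear mapping, and the shade range of 5 to 13, are deliberate simplifications for illustration.

```python
def apply_shade(brightness, shade_setting, min_shade=5, max_shade=13):
    """Attenuate pixel brightness according to a shade number: higher
    shade -> stronger attenuation. Clamps the setting to the supported
    range; at max_shade the brightness is fully suppressed."""
    shade = max(min_shade, min(max_shade, shade_setting))
    keep = 1.0 - (shade - min_shade) / (max_shade - min_shade)
    return brightness * keep
```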
- FIG. 6 a illustrates an example of different visual effects 600 with different shade settings, with visual effect 600 a corresponding to the lowest shade setting, and visual effect 600 d corresponding to the highest shade setting (everything else being equal).
- FIG. 6 a illustrates the visual effect 600 as a shading of sorts that is added on top of the simulated rendering 126 for ease of understanding. However, in some examples the visual effect may actually be a subtraction and/or reduction of certain visual attributes (e.g., brightness) of all or some of the simulated rendering 126 . In the example of FIG. 6 a , only the visual effect 600 a is applied to the simulated rendering 126 , while the other visual effects 600 b - 600 d are not applied due to the visual effect settings 401 . While FIG. 6 a shows four different visual effects 600 corresponding to shade settings, in some examples there may be more or fewer shade settings and/or corresponding visual effects.
- the visual effect determination process 400 proceeds to block 412 after block 410 .
- the visual effect determination process 400 determines if any modification to the visual effect 600 is warranted given the difficulty, realism, and/or effect area settings of the visual effect settings 401 .
- certain difficulty and/or realism settings may correspond to certain visual effect area settings.
- an easy difficulty and/or low realism setting may correspond to a small and/or localized visual effect area (or no visual effect area).
- a small and/or localized visual effect area may apply the visual effect 600 only to a small and/or localized area around the simulated arc, such as shown, for example in FIG. 6 b .
- a hard difficulty and/or high realism setting may correspond to a full visual effect area that applies the visual effect to the entire simulated rendering 126 , such as shown, for example, in FIG. 6 a . While the visual effect 600 in FIG. 6 b is depicted as having a sharp, abrupt edge between the portions of the simulated rendering 126 with and without the visual effect 600 , in some examples, the transition may be smoothed.
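- The effect-area behavior, including a smoothed transition at the edge of a localized area, might be sketched as a per-pixel weight; the radius, feather width, and linear fade are illustrative assumptions.

```python
def effect_weight(pixel_xy, arc_xy, area_setting, radius=50.0, feather=10.0):
    """Per-pixel weight (0..1) of the visual effect.

    "full": the effect covers the entire simulated rendering.
    "localized": full strength within `radius` pixels of the simulated
    arc, fading linearly to zero over `feather` pixels (the smoothed
    transition), and absent beyond that."""
    if area_setting == "full":
        return 1.0
    dx = pixel_xy[0] - arc_xy[0]
    dy = pixel_xy[1] - arc_xy[1]
    d = (dx * dx + dy * dy) ** 0.5
    if d <= radius:
        return 1.0
    if d >= radius + feather:
        return 0.0
    return 1.0 - (d - radius) / feather
```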
Description
- This application is a continuation of, and claims priority to, co-pending U.S. Non-Provisional Pat. Application No. 16/279,625, entitled “Adjustable Visual Effects Simulating Auto Darkening Lenses in Augmented Reality Welding Systems,” filed Feb. 19, 2019, issuing Nov. 29, 2022 as U.S. Pat. No. 11,514,816, the entire contents of which are hereby incorporated by reference.
- The present disclosure generally relates to augmented reality welding systems and, more particularly, to adjustable visual effects that simulate auto darkening lenses in augmented reality welding systems.
- Conventional arc welding systems generate electrical arcs that are bright enough to blind if viewed by the naked eye. Conventional welding helmets therefore provide shaded (and/or tinted) lenses to diminish the brightness. Some welding helmets have an auto-darkening lens that provides substantial shading (and/or darkening, tinting, etc.) only when exposed to a threshold level of light (e.g., from a bright electrical arc), while providing a largely unshaded viewing lens when exposed to lower light levels (e.g., from a light bulb).
- Some augmented reality welding systems present welding simulations (e.g., for training) via a display screen. However, as there are no actual welding arcs, and/or associated bright visible light, simulating the shading and/or auto-darkening effects of welding helmets is more complicated.
- Limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present disclosure as set forth in the remainder of the present application with reference to the drawings.
- The present disclosure is directed to adjustable visual effects that simulate auto darkening lenses in augmented reality welding systems, for example, substantially as illustrated by and/or described in connection with at least one of the figures, and as set forth more completely in the claims.
- These and other advantages, aspects and novel features of the present disclosure, as well as details of an illustrated example thereof, will be more fully understood from the following description and drawings.
- FIG. 1 is a diagram illustrating components of an example augmented welding system, in accordance with aspects of this disclosure.
- FIG. 2 is a block diagram further illustrating the components of the example augmented welding system of FIG. 1 , in accordance with aspects of this disclosure.
- FIG. 3 is a flow diagram illustrating an example primary control process that may be used with the example augmented welding system of FIGS. 1 and 2 , in accordance with aspects of this disclosure.
- FIG. 4 is a flow diagram illustrating an example visual effect determination process that may be used with the example primary control process of FIG. 3 , in accordance with aspects of this disclosure.
- FIG. 5 is a graph illustrating an example spectral transmissivity curve, in accordance with aspects of this disclosure.
- FIGS. 6 a and 6 b are diagrams illustrating example applications of a visual effect to a simulated rendering, in accordance with aspects of this disclosure.
- The figures are not necessarily to scale. Where appropriate, the same or similar reference numerals are used in the figures to refer to similar or identical elements. For example, reference numerals utilizing lettering (e.g., camera 114 a , camera 114 b ) refer to instances of the same reference numeral that does not have the lettering (e.g., cameras 114 ).
- Preferred examples of the present disclosure may be described hereinbelow with reference to the accompanying drawings. In the following description, well-known functions or constructions are not described in detail because they may obscure the disclosure in unnecessary detail. For this disclosure, the following terms and definitions shall apply.
- As used herein, the terms “about” and/or “approximately,” when used to modify or describe a value (or range of values), position, orientation, and/or action, mean reasonably close to that value, range of values, position, orientation, and/or action. Thus, the examples described herein are not limited to only the recited values, ranges of values, positions, orientations, and/or actions but rather should include reasonably workable deviations.
- As used herein, “and/or” means any one or more of the items in the list joined by “and/or”. As an example, “x and/or y” means any element of the three-element set {(x), (y), (x, y)}. In other words, “x and/or y” means “one or both of x and y”. As another example, “x, y, and/or z” means any element of the seven-element set {(x), (y), (z), (x, y), (x, z), (y, z), (x, y, z)}. In other words, “x, y and/or z” means “one or more of x, y and z”.
- As utilized herein, the terms “e.g.,” and “for example” set off lists of one or more non-limiting examples, instances, or illustrations.
- As used herein, the terms “coupled,” “coupled to,” and “coupled with,” each mean a structural and/or electrical connection, whether attached, affixed, connected, joined, fastened, linked, and/or otherwise secured. As used herein, the term “attach” means to affix, couple, connect, join, fasten, link, and/or otherwise secure. As used herein, the term “connect” means to attach, affix, couple, join, fasten, link, and/or otherwise secure.
- As used herein the terms “circuits” and “circuitry” refer to physical electronic components (i.e., hardware) and any software and/or firmware (“code”) which may configure the hardware, be executed by the hardware, and/or otherwise be associated with the hardware. As used herein, for example, a particular processor and memory may comprise a first “circuit” when executing a first one or more lines of code and may comprise a second “circuit” when executing a second one or more lines of code. As utilized herein, circuitry is “operable” and/or “configured” to perform a function whenever the circuitry comprises the necessary hardware and/or code (if any is necessary) to perform the function, regardless of whether performance of the function is disabled or enabled (e.g., by a user-configurable setting, factory trim, etc.).
- As used herein, a control circuit may include digital and/or analog circuitry, discrete and/or integrated circuitry, microprocessors, DSPs, etc., software, hardware and/or firmware, located on one or more boards, that form part or all of a controller, and/or are used to control a welding process, and/or a device such as a power source or wire feeder.
- As used herein, the term “processor” means processing devices, apparatus, programs, circuits, components, systems, and subsystems, whether implemented in hardware, tangibly embodied software, or both, and whether or not it is programmable. The term “processor” as used herein includes, but is not limited to, one or more computing devices, hardwired circuits, signal-modifying devices and systems, devices and machines for controlling systems, central processing units, programmable devices and systems, field-programmable gate arrays, application-specific integrated circuits, systems on a chip, systems comprising discrete elements and/or circuits, state machines, virtual machines, data processors, processing facilities, and combinations of any of the foregoing. The processor may be, for example, any type of general purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, an application-specific integrated circuit (ASIC). The processor may be coupled to, and/or integrated with a memory device.
- As used herein, the term “memory” and/or “memory device” means computer hardware or circuitry to store information for use by a processor and/or other digital device. The memory and/or memory device can be any suitable type of computer memory or any other type of electronic storage medium, such as, for example, read-only memory (ROM), random access memory (RAM), cache memory, compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically-erasable programmable read-only memory (EEPROM), a computer-readable medium, or the like.
- The term “power” is used throughout this specification for convenience, but also includes related measures such as energy, current, voltage, and enthalpy. For example, controlling “power” may involve controlling voltage, current, energy, and/or enthalpy, and/or controlling based on “power” may involve controlling based on voltage, current, energy, and/or enthalpy.
- As used herein, welding-type power refers to power suitable for welding, cladding, brazing, plasma cutting, induction heating, carbon arc cutting, and/or hot wire welding/preheating (including laser welding and laser cladding), carbon arc cutting or gouging, and/or resistive preheating.
- As used herein, a welding-type power supply and/or power source refers to any device capable of, when power is applied thereto, supplying welding, cladding, brazing, plasma cutting, induction heating, laser (including laser welding, laser hybrid, and laser cladding), carbon arc cutting or gouging, and/or resistive preheating, including but not limited to transformer-rectifiers, inverters, converters, resonant power supplies, quasi-resonant power supplies, switch-mode power supplies, etc., as well as control circuitry and other ancillary circuitry associated therewith.
- Some examples of the present disclosure relate to a welding training system, comprising a display screen configured to display a simulated rendering and control circuitry configured to determine a visual effect based on one or more settings and a simulated arc state, and apply the visual effect to at least a portion of the simulated rendering. In some examples, the visual effect comprises a filter. In some examples, the filter reduces a brightness of at least a portion of the simulated rendering. In some examples, the filter is uniform or based on a spectral transmissivity curve. In some examples, the portion comprises at least one of a background, a foreground, an entirety, a weld area, a welding arc, or a weld pool of the simulated rendering. In some examples, the simulated arc state comprises a visible simulated arc or an absent simulated arc. In some examples, the one or more settings simulate settings of an auto-darkening welding helmet. In some examples, the one or more settings comprise one or more of a shade setting, a sensitivity setting, a helmet model setting, or a delay setting.
- In some examples, the sensitivity setting sets an arc brightness threshold above which the visual effect applies a filter to the portion of the simulated rendering. In some examples, the shade setting sets a filter level of the visual effect after an arc brightness threshold is reached. In some examples, the delay setting comprises a time delay between a change of the simulated arc state and a change of the visual effect. In some examples, the helmet model setting comprises a type of welding helmet, wherein the visual effect comprises a filter, and wherein the filter is based on a spectral transmissivity curve of the type of welding helmet. In some examples, the control circuitry is further configured to determine the visual effect based on one or more weld settings. In some examples, the one or more weld settings comprise one or more of a voltage, a current, a gas type, a wire feed speed, a workpiece material type, or a filler type. In some examples, the control circuitry is further configured to determine the visual effect based on a simulation difficulty setting.
- In some examples, the system further comprises a simulated welding helmet having the display screen, wherein the simulated welding helmet comprises a camera configured to capture images of a surrounding environment. In some examples, the control circuitry is further configured to receive the images from the camera, detect one or more weld settings of the welding training system, and generate the simulated rendering based on the images and the one or more weld settings.
- Some examples of the present disclosure relate to a method, comprising determining, via control circuitry, a visual effect based on one or more settings and a simulated arc state, applying, via the control circuitry, the visual effect to a simulated rendering, and displaying the simulated rendering on a display screen. In some examples, applying the visual effect comprises filtering a brightness of at least a portion of the simulated rendering, wherein the portion comprises a background, a foreground, an entirety, a weld area, a welding arc, or a weld pool of the simulated rendering. In some examples, the one or more settings comprise one or more of a shade setting, a sensitivity setting, a helmet model setting, or a delay setting.
- Some examples of the present disclosure relate to augmented reality welding systems. In some examples, an augmented reality welding system has a display screen configured to display a simulated rendering, such as a display screen of a welding helmet worn by the user. The simulated rendering may be based on recorded images of a surrounding environment, such as recorded by one or more cameras configured to record images. In some examples, the recorded images are processed and/or augmented by an augmented reality computing system to create the simulated rendering. In some examples, the simulated rendering includes more or fewer images and/or details (e.g., relating to an arc, a workpiece, a welding torch, etc.) than the images recorded by the cameras. In some examples, the simulated rendering is presented to a user via the display screen.
- In some examples, the computing system is further configured to add a visual effect to the simulated rendering when the simulated rendering includes images related to an augmented and/or simulated arc. In some examples, the visual effect is a filtering effect that filters some or all of the bright light associated with the augmented and/or simulated arc. In some examples, the visual effect is designed to enhance the reality of the augmented reality welding system by emulating a filtering, shading, tinting, and/or darkening effect of actual welding helmets when an actual arc is present.
- In some examples, the augmented reality system further includes one or more settings that impact the visual effect. Some of the settings may be similar to settings that sometimes appear on actual welding helmets, such as, for example, a shade setting, a sensitivity setting, and/or a delay setting. In some examples, additional settings specific to the augmented reality welding system are provided, such as, for example, a welding helmet model setting and/or a difficulty or realism setting. These settings may allow a user to adjust and/or customize the augmented reality welding experience. In some examples, the computing system may apply the visual effect to the simulated rendering based on the one or more settings.
-
FIG. 1 shows an example of an augmentedreality welding system 100. While the present disclosure sometimes refers to just augmented reality for simplicity, it should be understood that features of the augmentedreality welding system 100 may also be implemented in a mixed reality welding system and/or a virtual reality welding system. In some examples, the augmentedreality welding system 100 may be used for weld training. - In the example of
FIG. 1 , the augmentedreality welding system 100 includes asimulated welding helmet 102, acomputing system 200, adisplay screen 104 in communication with thecomputing system 200, and auser interface 106 in communication with thecomputing system 200. As shown, thesimulated welding helmet 102 includes anouter shell 108, aheadband 110, afaceplate 112, and ahelmet interface 113. In some examples, theheadband 110 is configured to secure thesimulated welding helmet 102 to the head of a user, while theouter shell 108 is configured to retain thefaceplate 112 and protect the head of the user. However, in some examples, thesimulated welding helmet 102 may appear substantially differently. For example, thesimulated welding helmet 102 may simply simulate a conventional welding helmet, without including features such as theouter shell 108,headband 110,faceplate 112, and/orhelmet interface 113. - In the example of
FIG. 1 , anelectronic display screen 104 is secured within simulated welding helmet 102 (such as within theouter shell 108, for example), such that thedisplay screen 104 is viewable by a user (e.g., a trainee and/or operator) when thesimulated welding helmet 102 is worn on the head of the user. In some examples, theelectronic display screen 104 may be removably secured within the simulated welding helmet 102 (such as within theouter shell 108, for example). In some examples, theelectronic display screen 104 may be part of a separate component (e.g., specialized glasses) that is secured to theouter shell 108. In some examples, theelectronic display screen 104 may be part of thefaceplate 112, and/or vice versa. In some examples, theelectronic display screen 104 may be entirely separate from thesimulated welding helmet 102. - In the example of
FIG. 1 , thesimulated welding helmet 102 further includes one ormore cameras 114 affixed to theouter shell 108. In some examples, thecameras 114 may be digital video cameras. As shown, there are threecameras 114 affixed to the outer shell 108: onecamera 114 a on top of thesimulated welding helmet 102, onecamera 114 b on the left side, and onecamera 114 c on the right side. Thecameras 114 are arranged in a triangle configuration so as to increase the accuracy of depth and/or three dimensional spatial calculations, such as by thecomputing system 200, for example. In some examples, there may be less ormore cameras 114. In some examples, only twocameras 114 may be needed to facilitate depth and/or spatial calculations. In the example ofFIG. 1 , thecameras 114 are directed forward, in the same direction a user wearing thesimulated welding helmet 102 would be looking. In some examples, thecameras 114 may be movably mounted, and/or have movable lenses configured to redirect a focus (and/or adjust certain attributes) of thecameras 114, in response to one or more command signals (e.g., received fromcomputing system 200 and/or camera controller(s) 124). In some examples, thecameras 114 may be otherwise positioned on thesimulated welding helmet 102 and/or disconnected from thesimulated welding helmet 102 entirely. - In the example of
FIG. 1 , thecameras 114 are directed towards aworkpiece 116 andwelding torch 118 within a welding area 120 (and/or welding cell). In some examples, thewelding torch 118 may be a real, functional, welding torch. In some examples, thewelding torch 118 may instead be a mock welding torch. In some examples, thewelding torch 118 may be a gun or torch configured for gas metal arc welding (GMAW) or an electrode holder (i.e., stinger) configured for shielded metal arc welding (SMAW). In the example ofFIG. 1 , thewelding torch 118 is in communication with thecomputing system 200. In some examples, thewelding torch 118 may additionally, or alternatively, be in communication with some other system (e.g., a mock or actual welding power supply and/or mock or actual welding wire feeder) that may (or may not) be in communication with thecomputing system 200. - As shown, the
workpiece 116 andwelding torch 118 within thewelding area 120 includemarkers 122 configured to be captured by thecameras 114 and/or interpreted by thecomputing system 200. In some examples, other welding components and/or items (e.g., clamps, electrode holders, wire feeders, power supplies, electrodes, other workpieces, tips, nozzles, etc.), with or withoutmarkers 122, may also be positioned within thewelding area 120. In operation, thecameras 114 are configured to record images of thewelding area 120 and encode data representative of the images into one or more image signals. - In the example of
FIG. 1 , acamera controller 124 is configured to collect image signals from thecameras 114. In some examples, the image signals may be representative of images recorded by thecameras 114. As shown, thecamera controller 124 is configured to send the image signals, or data representative of the image signals and/or the recorded images, to thecomputing system 200. Thecomputing system 200 is configured to process the images, augment the images to create asimulated rendering 126, and send thesimulated rendering 126 to thedisplay screen 104 for display to the user. - In some examples, each
camera 114 has itsown camera controller 124. In some examples, thecamera controller 124 is part of thesimulated welding helmet 102. In some examples, thecamera controller 124 is separate from thesimulated welding helmet 102. In some examples, thecomputing system 200 is integrated into thesimulated welding helmet 102. In some examples, thecameras 114,camera controller 124,display screen 104,helmet interface 113,user interface 106,welding torch 118, and/orcomputing system 200 may communicate via one or more wired mediums and/or protocols (e.g., Ethernet cable(s), universal serial bus cable(s), other signal and/or communication cable(s)) and/or wireless mediums and/or protocols (e.g., near field communication (NFC), ultra high frequency radio waves (commonly known as Bluetooth), IEEE 802.11x, Zigbee, HART, LTE, Z-Wave, WirelessHD, WiGig, etc.). - In the example of
FIG. 1 , thecomputing system 200 is configured to create thesimulated rendering 126 based on the images captured by thecameras 114 in conjunction with one or more user adjustable settings and/or inputs. In some examples, inputs may be received from certain welding components (e.g., trigger 119 of the welding torch 118) as well as from thehelmet interface 113 and/or theuser interface 106. In some examples, thehelmet interface 113 may be considered part of theuser interface 106. - In some examples, the augmented
reality welding system 100 is configured to apply an additional visual effect to thesimulated rendering 126. In some examples, the visual effect is impacted by several user adjustable settings. Some of the settings emulate auto-darkening settings found on conventional auto-darkening welding helmets (e.g., shade, sensitivity, and/or delay). Other settings are unique to the augmentedreality welding system 100, such as, for example, a helmet model/type setting, a difficulty setting, a realism setting, and/or a visual effect area setting. In some examples, the visual impact may further be impacted by one or more weld settings (e.g., a voltage, a current, a gas type, a wire feed speed, a workpiece material type, and/or a filler type). The settings are discussed further below with respect toFIG. 2 . The settings may be adjusted by a user via theuser interface 106 and/or thehelmet interface 113. - In the example of
FIG. 1, the helmet interface 113 includes one or more adjustable inputs (e.g., knobs, buttons, switches, keys, etc.) and/or outputs (e.g., lights, speakers, etc.). In some examples, the helmet interface 113 is in communication with the computing system 200. In some examples, the helmet interface 113 may further comprise communication circuitry (not shown) configured for communication with the computing system 200. In some examples, the helmet interface 113 may be part of the user interface 106. - In the example of
FIG. 1, the user interface 106 is in communication with the computing system 200. As shown, the user interface 106 comprises a touch screen interface, such as a tablet, touch screen computer, smartphone, or other touch screen device. In some examples, the user interface 106 may instead comprise more traditional input devices (e.g., mouse, keyboard, buttons, knobs, etc.) and/or output devices (e.g., display screen, speakers, etc.). In some examples, the user interface 106 may further include one or more receptacles configured for connection to (and/or reception of) one or more external memory devices (e.g., floppy disks, compact discs, digital video discs, flash drives, etc.). - In the example of
FIG. 1, the computing system 200 also uses camera-captured images of markers 122 on the welding torch 118 and/or workpiece 116 (and/or other components) to create the simulated rendering 126. In some examples, the computing system 200 may be configured to recognize the markers 122 on the workpiece 116 and/or welding torch 118, and create a simulated rendering based (at least in part) on the markers 122. For example, the markers 122 may assist the computing system 200 in tracking and/or recognition of the welding torch 118, workpiece 116, and/or other objects, as well as their respective shapes, sizes, spatial relationships, etc. In some examples, the computing system 200 may combine recognition of markers 122 with user input to create the simulated rendering 126. For example, the computing system 200 may recognize markers 122 on the welding torch 118 near markers 122 on the workpiece 116 and, after recognizing that the user is pressing a trigger 119 of the welding torch 118, create a simulated rendering showing an arc between the welding torch 118 and the workpiece 116, and/or a weld pool proximate the arc endpoint on the workpiece 116. In some examples, the computing system 200 is configured to omit the markers 122 from the simulated rendering 126. -
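The marker-plus-trigger logic described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names, the coordinate representation, and the distance threshold are all invented for the sketch.

```python
# Hypothetical sketch of the marker-based arc decision: an arc (and weld
# pool) is rendered only when the tracked torch is near the workpiece AND
# the trigger is pressed. All names and thresholds are illustrative.

def torch_near_workpiece(torch_pos, workpiece_pos, threshold=0.05):
    """Return True if the tracked torch marker is within `threshold`
    (here, meters) of the nearest workpiece marker."""
    dx = torch_pos[0] - workpiece_pos[0]
    dy = torch_pos[1] - workpiece_pos[1]
    dz = torch_pos[2] - workpiece_pos[2]
    return (dx * dx + dy * dy + dz * dz) ** 0.5 <= threshold

def should_render_arc(torch_pos, workpiece_pos, trigger_pressed):
    """Decide whether the simulated rendering should include an arc."""
    return trigger_pressed and torch_near_workpiece(torch_pos, workpiece_pos)
```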
FIG. 2 is a block diagram of the augmented reality welding system 100 of FIG. 1. In the example of FIG. 2, the computing system 200 is in communication with the user interface 106 (and/or helmet interface 113), the display screen 104, one or more welding components (e.g., welding torch 118, welding wire feeder, welding power supply, etc.), and the cameras 114 (e.g., through the camera controller(s) 124). In some examples, the cameras 114 may be in direct communication with the computing system 200 without going through the camera controller(s) 124. As shown, the computing system 200 includes communication circuitry 202 configured to facilitate communication between the computing system and the user interface 106 (and/or helmet interface 113), the display screen 104, one or more welding components (e.g., welding torch 118), and the cameras 114 (e.g., through the camera controller(s) 124). - In the example of
FIG. 2, the computing system 200 also includes memory 206 and one or more processors 204. As shown, the memory 206, processor(s) 204, and communication circuitry 202 are in electrical communication with one another, such as through a common data bus. The one or more processors 204 are configured to execute instructions stored in the memory 206. In the example of FIG. 2, the memory 206 stores executable instructions that, when executed by the processor(s) 204, further operation of the augmented reality welding system 100. As shown, the memory 206 stores instructions relating to at least two processes of the augmented reality welding system 100: a primary control process 300 and a visual effect determination process 400. - In the example of
FIG. 2, the memory 206 also stores data that may be used by the primary control process 300 and/or visual effect determination process 400. In particular, as shown, the memory 206 stores primary settings 301 (e.g., such as may be relevant to the primary control process 300) and visual effect settings 401 (e.g., such as may be relevant to the visual effect determination process 400). In some examples, the primary settings 301 may include such settings as a (training) difficulty setting (e.g., easy, normal, hard, etc.), a realism setting (e.g., low, medium, high, etc.), and/or various weld settings (e.g., voltage, current, gas type, wire feed speed, workpiece material type, filler type, etc.). In some examples, the visual effect settings 401 may include such settings as a shade setting (e.g., low, medium, high or 1, 2, 3, 4, 5, etc.), a sensitivity setting (e.g., low, medium, high or 1, 2, 3, 4, 5, etc.), a delay setting (e.g., low, medium, high or 1, 2, 3, 4, 5, etc.), a helmet model/type setting, and/or a visual effect area setting (e.g., localized, expanded, entire, background, foreground, weld area, welding arc, weld pool, etc.). In some examples, the primary settings 301, visual effect settings 401, and/or other data stored in memory may be used by the primary control process 300 and/or visual effect determination process 400. In some examples, the primary control process 300, visual effect determination process 400, primary settings 301, visual effect settings 401, and/or other data used by the augmented reality welding system 100 may be retrieved from an external memory device (e.g., flash drive, cloud storage, etc.) instead of, or in addition to, being stored in memory 206. -
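As a rough illustration, the two groups of settings could be modeled as plain data structures. Every field name and default value below is an assumption made for the sketch, not something specified by the patent.

```python
# Illustrative grouping of primary settings 301 and visual effect
# settings 401; all field names and defaults are invented.
from dataclasses import dataclass, field

@dataclass
class PrimarySettings:            # primary settings 301
    difficulty: str = "normal"    # easy / normal / hard
    realism: str = "medium"       # low / medium / high
    weld_settings: dict = field(default_factory=lambda: {
        "voltage": 22.0, "current": 150.0, "gas_type": "argon",
        "wire_feed_speed": 300.0, "material": "mild_steel",
    })

@dataclass
class VisualEffectSettings:       # visual effect settings 401
    shade: int = 3                # e.g., 1-5
    sensitivity: int = 3          # e.g., 1-5
    delay: float = 0.5            # seconds to hold the effect after arc ends
    helmet_model: str = "generic"
    effect_area: str = "entire"   # localized / expanded / entire / ...
```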
FIG. 3 is a flowchart illustrating an example primary control process 300 of the augmented reality welding system 100. In some examples, some or all of the primary control process 300 may be implemented in machine readable instructions stored in memory 206 and/or executed by the one or more processors 204. In some examples, some or all of the primary control process 300 may be implemented in analog and/or discrete circuitry. In some examples, the primary control process 300 is configured to control the augmented reality welding system 100, such as by processing the images captured by cameras 114 along with the various inputs and/or settings to generate the simulated rendering 126 displayed to the user via the display screen 104. - In the example of
FIG. 3, the primary control process 300 begins at block 302, where the computing system 200 receives one or more images (and/or image signals) from the cameras 114 (and/or camera controller(s) 124). At block 304, the primary control process 300 processes the images along with inputs of the augmented reality welding system 100 and generates the simulated rendering 126. In some examples, processing the images may comprise parsing the images to determine and/or recognize objects in the image, as well as properties of the objects. Markers 122 on certain objects (e.g., welding torch 118 and/or workpiece 116) may assist in this processing. For example, while the actual workpiece 116 may be a piece of plastic, the primary control process 300 may recognize the workpiece 116 from the markers 122 (and/or other distinguishing characteristics) and render the workpiece 116 as a metallic workpiece (or some other workpiece type, depending on weld settings, etc.). In some examples, the primary control process 300 may further render the workpiece 116 as having completed welds in some places and uncompleted welds in other places, according to inputs and/or settings of the augmented reality welding system 100. As another example, the primary control process 300 may recognize the welding torch 118 and workpiece 116 in proximity to one another, detect a signal from the welding torch 118 indicating that the trigger 119 is being pressed, and in response render an arc extending from a torch tip of the welding torch 118 to the workpiece 116, along with an associated weld pool on the workpiece 116 at the end of the arc. - In the example of
FIG. 3, the primary control process 300 determines whether the simulated rendering 126 includes a visible simulated arc at block 306. In some examples, this determination is based, at least in part, on inputs and/or settings (e.g., weld settings) of the augmented reality welding system 100. If the primary control process 300 determines that there is a visible simulated arc in the simulated rendering 126, then the primary control process 300 proceeds to execute the visual effect determination process 400 at block 400 (discussed further below in reference to FIG. 4). After block 400, the primary control process 300 proceeds to block 308, where the visual effect is applied to the simulated rendering 126, and then to block 310, where the simulated rendering 126 is sent to the display screen 104 and displayed to the user. - In the example of
FIG. 3, if the primary control process 300 determines that there is no visible simulated arc in the simulated rendering 126, the primary control process 300 proceeds from block 306 to block 312. At block 312, the primary control process 300 reads the visual effect settings 401 and determines whether the set time delay is greater than the time elapsed since a simulated arc was last present in the simulated rendering 126. The determination at block 312 emulates some real life auto-darkening welding helmets with delay settings that provide the option of continuing to apply the shading (and/or tinting, darkening, etc.) effect to the welding helmet lens for some time after the welding arc (or other sufficiently bright light) has subsided. As shown, if the set time delay is less than the elapsed time, then the primary control process 300 proceeds to block 310, where the simulated rendering 126 is sent to the display screen 104 and displayed to the user without any visual effect applied. If the set time delay is greater than the elapsed time, then the primary control process 300 proceeds to block 308, where the most recent visual effect is again applied to the simulated rendering 126 before the simulated rendering 126 is sent to the display screen 104 and displayed to the user at block 310. -
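The block 312 delay check reduces to a simple elapsed-time comparison. The sketch below is an assumption-laden illustration (the function name and the use of a monotonic clock are choices made here, not taken from the patent):

```python
import time

def apply_delay_effect(arc_visible, last_arc_time, delay_setting, now=None):
    """Emulate the auto-darkening delay: keep the most recent visual
    effect active while the arc is visible, or if the arc subsided less
    than `delay_setting` seconds ago. Returns True when the effect
    should (still) be applied to the simulated rendering."""
    if arc_visible:
        return True
    now = time.monotonic() if now is None else now
    return (now - last_arc_time) < delay_setting
```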
FIG. 4 is a flowchart illustrating the visual effect determination process 400 of the augmented reality welding system 100. In some examples, some or all of the visual effect determination process 400 may be implemented in machine readable instructions stored in memory 206 and/or executed by the one or more processors 204. In some examples, some or all of the visual effect determination process 400 may instead be implemented in analog and/or discrete circuitry. In some examples, the visual effect determination process 400 is configured to determine a visual effect to apply to the simulated rendering 126 to simulate the shading (and/or tinting, darkening, etc.) effect of some real life welding helmet lenses. - In the example of
FIG. 4, the visual effect determination process 400 begins at block 402, where properties of the simulated arc are determined. In some examples, determining properties of the simulated arc may comprise determining (and/or estimating) an amount and/or brightness of visible light radiation that an actual welding arc would produce given the weld settings of the augmented reality welding system 100. For example, the determined brightness of an arc with low current and/or voltage weld settings may be less than the determined brightness of an arc with high current and/or voltage weld settings. In some examples, the primary control process 300 may determine properties of the simulated arc at block 304 of the primary control process 300, and the visual effect determination process 400 may use the properties at block 402. For example, the primary control process 300 may save the arc properties to memory 206 at block 304, and the visual effect determination process 400 may read and/or load the arc properties from memory 206 at block 402. - In the example of
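One hedged way to sketch block 402's brightness estimate is a monotonic proxy on arc power: higher voltage and/or current yields a brighter determined arc. Real radiometry would depend on the process, shielding gas, and electrode, so this relation is purely illustrative.

```python
def estimate_arc_brightness(voltage, current, scale=1.0):
    """Crude monotonic proxy for the brightness a real arc would produce
    at the given weld settings: brightness ~ arc power. The `scale`
    factor and the power relation are placeholder assumptions."""
    return scale * voltage * current
```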
FIG. 4, the visual effect determination process 400 determines whether the determined brightness of the simulated arc is greater than a sensitivity threshold. In some examples, the sensitivity threshold is based on the sensitivity setting of the visual effect settings 401. In some examples, the brightness of other elements (e.g., the weld pool) of the simulated rendering (and/or the entire simulated rendering) is evaluated against the sensitivity threshold. As shown, the visual effect determination process 400 proceeds to block 406 if the brightness is less than the sensitivity threshold. At block 406, the visual effect determination process 400 determines the visual effect to be nothing. If, however, the brightness is greater than the set sensitivity threshold, the visual effect determination process 400 proceeds to block 408. - In the example of
FIG. 4, the visual effect determination process 400 determines one or more spectral transmissivity properties corresponding to the helmet model/type setting at block 408. Spectral transmissivity properties impact certain aspects of the visual effect, such as how much light of a given wavelength is allowed to pass. Actual welding helmets have different lenses with different filters, each with different spectral transmissivity properties. For example, some actual welding helmets have lenses that slightly color (e.g., with a slight green tint, a slight yellowish color, etc.) the appearance of everything viewed through the lens because of the spectral transmissivity properties of the lens' filter. In some examples, one or more of the spectral transmissivity properties may be dependent upon the shade setting, such that one or more of the spectral transmissivity properties may vary with respect to different shade settings. -
FIG. 5 is a graph 500 showing example spectral transmissivity properties of an example filter via a transmissivity curve 502. In the example of FIG. 5, the Y axis of the graph 500 represents transmissivity percentage, or a percentage of light that such an example filter would allow through, while the X axis represents wavelength of light (in nanometers). The spectral transmissivity curve 502 is a mathematical representation of the properties of the example filter, showing which wavelengths of light are filtered only slightly, and which wavelengths of light are filtered entirely (or substantially). As shown, the spectral transmissivity curve 502 indicates that almost all light having a wavelength less than 450 nanometers, or greater than 750 nanometers, is filtered. In the example of FIG. 5, the spectral transmissivity curve 502 indicates that most of the light allowed through the filter will be in the 500 to 700 nanometer range, with up to approximately 38% of light having wavelengths around approximately 580 nanometers allowed through the filter. Thus, this example filter may lend a slightly yellowish color to the appearance of everything viewed through the filter, because of its spectral transmissivity properties. - In the example of
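A transmissivity curve like curve 502 can be represented as sampled points and linearly interpolated. The sample values below only loosely echo the yellowish filter described (a peak of roughly 38% near 580 nm, near-zero below 450 nm and above 750 nm); they are not taken from FIG. 5.

```python
def transmissivity_at(curve, wavelength_nm):
    """Linearly interpolate a spectral transmissivity curve given as a
    sorted list of (wavelength_nm, fraction_transmitted) points;
    clamp to the endpoint values outside the sampled range."""
    if wavelength_nm <= curve[0][0]:
        return curve[0][1]
    if wavelength_nm >= curve[-1][0]:
        return curve[-1][1]
    for (w0, t0), (w1, t1) in zip(curve, curve[1:]):
        if w0 <= wavelength_nm <= w1:
            frac = (wavelength_nm - w0) / (w1 - w0)
            return t0 + frac * (t1 - t0)

# Hand-made sample points loosely echoing the yellowish filter of FIG. 5:
YELLOWISH_FILTER = [(450, 0.0), (500, 0.20), (580, 0.38), (700, 0.15), (750, 0.0)]
```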
FIG. 4, the spectral transmissivity properties are determined at block 408 based on the helmet model setting of the visual effect settings 401, which identifies a particular type and/or model of real life welding helmet filter for the visual effect to emulate. In some examples, the determination may comprise utilizing a data structure that associates different helmet models with different spectral transmissivity properties. In some examples, the spectral transmissivity properties may be disabled or determined to be uniform, corresponding to a flat filter where the same amount of light is allowed to pass through regardless of wavelength. For example, spectral transmissivity may be disabled or determined to be uniform for lower difficulty and/or realism settings. - In the example of
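The helmet-model lookup with a uniform fallback for low realism might look like the sketch below. The model names, the curve samples, and the flat 35% value are all invented for illustration.

```python
# Hypothetical lookup table mapping helmet models to spectral curves,
# each a list of (wavelength_nm, fraction_transmitted) points.
HELMET_FILTERS = {
    "model_a": [(450, 0.0), (580, 0.38), (750, 0.0)],   # yellowish filter
    "model_b": [(430, 0.0), (530, 0.30), (730, 0.0)],   # greenish filter
}

def filter_for(helmet_model, realism="high"):
    """Return the spectral curve for a helmet model, or a flat (uniform)
    curve when realism is low or the model is unknown, so the same
    fraction of light passes at every wavelength."""
    if realism == "low" or helmet_model not in HELMET_FILTERS:
        return [(380, 0.35), (780, 0.35)]
    return HELMET_FILTERS[helmet_model]
```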
FIG. 4, the visual effect determination process 400 proceeds to block 410 after block 408. At block 410, the visual effect determination process 400 determines the visual effect to apply to the simulated rendering 126 based on the previously determined spectral transmissivity properties and the shade setting of the visual effect settings 401. In some examples, the shade setting impacts the degree to which brightness is filtered (and/or shaded, tinted, darkened, etc.) by the visual effect. In some examples, the higher the shade setting, the more brightness is filtered (and/or shaded, tinted, darkened, etc.) by the visual effect. -
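Block 410's shade-dependent darkening could be sketched as a per-pixel attenuation: the higher the shade setting, the less brightness is retained. The linear mapping from shade setting to retained brightness is an assumption made here, not a standard shade-number formula.

```python
def apply_shade(pixel_rgb, shade_setting, max_shade=5):
    """Darken a pixel in proportion to the shade setting. With this
    illustrative mapping, shade 5 keeps about 10% of the original
    brightness; the 0.18-per-step factor is invented."""
    keep = 1.0 - 0.18 * shade_setting
    keep = max(keep, 0.0)                       # clamp for very high shades
    return tuple(int(c * keep) for c in pixel_rgb)
```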
FIG. 6a illustrates an example of different visual effects 600 with different shade settings, with visual effect 600a corresponding to the lowest shade setting, and visual effect 600d corresponding to the highest shade setting (everything else being equal). FIG. 6a illustrates the visual effect 600 as a shading of sorts that is added on top of the simulated rendering 126 for ease of understanding. However, in some examples the visual effect may actually be a subtraction and/or reduction of certain visual attributes (e.g., brightness) of all or some of the simulated rendering 126. In the example of FIG. 6a, only the visual effect 600a is applied to the simulated rendering 126, while the other visual effects 600b-600d are not applied due to the visual effect settings 401. While FIG. 6a shows four different visual effects 600 corresponding to shade settings, in some examples there may be more or fewer shade settings and/or corresponding visual effects. - In the example of
FIG. 4, the visual effect determination process 400 proceeds to block 412 after block 410. At block 412, the visual effect determination process 400 determines if any modification to the visual effect 600 is warranted given the difficulty, realism, and/or effect area settings of the visual effect settings 401. In some examples, certain difficulty and/or realism settings may correspond to certain visual effect area settings. For example, an easy difficulty and/or low realism setting may correspond to a small and/or localized visual effect area (or no visual effect area). A small and/or localized visual effect area may apply the visual effect 600 only to a small and/or localized area around the simulated arc, such as shown, for example, in FIG. 6b. In some examples, this would allow the user to view the rest of the simulated rendering 126, away from the arc, without any visual effect. As another example, a hard difficulty and/or high realism setting may correspond to a full visual effect area that applies the visual effect to the entire simulated rendering 126, such as shown, for example, in FIG. 6a. While the visual effect 600 in FIG. 6b is depicted as having a sharp, abrupt edge between the portions of the simulated rendering 126 with and without the visual effect 600, in some examples the transition may be smoothed. - While the present apparatus, systems, and/or methods have been described with reference to certain implementations, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present apparatus, systems, and/or methods. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from its scope.
Therefore, it is intended that the present apparatus, systems, and/or methods not be limited to the particular implementations disclosed, but that the present apparatus, systems, and/or methods will include all implementations falling within the scope of the appended claims.
Claims (20)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/991,939 US20230080145A1 (en) | 2019-02-19 | 2022-11-22 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
| US18/612,445 US20240346951A1 (en) | 2019-02-19 | 2024-03-21 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/279,625 US11514816B2 (en) | 2019-02-19 | 2019-02-19 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
| US17/991,939 US20230080145A1 (en) | 2019-02-19 | 2022-11-22 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/279,625 Continuation US11514816B2 (en) | 2019-02-19 | 2019-02-19 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/612,445 Continuation US20240346951A1 (en) | 2019-02-19 | 2024-03-21 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20230080145A1 true US20230080145A1 (en) | 2023-03-16 |
Family
ID=71092600
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/279,625 Active 2040-06-09 US11514816B2 (en) | 2019-02-19 | 2019-02-19 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
| US17/991,939 Abandoned US20230080145A1 (en) | 2019-02-19 | 2022-11-22 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
| US18/612,445 Pending US20240346951A1 (en) | 2019-02-19 | 2024-03-21 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/279,625 Active 2040-06-09 US11514816B2 (en) | 2019-02-19 | 2019-02-19 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/612,445 Pending US20240346951A1 (en) | 2019-02-19 | 2024-03-21 | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems |
Country Status (4)
| Country | Link |
|---|---|
| US (3) | US11514816B2 (en) |
| EP (1) | EP3928295A1 (en) |
| CA (1) | CA3130746A1 (en) |
| WO (1) | WO2020172291A2 (en) |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12198568B2 (en) * | 2019-11-25 | 2025-01-14 | Illinois Tool Works Inc. | Weld training simulations using mobile devices, modular workpieces, and simulated welding equipment |
| JP7505708B2 (en) * | 2020-09-29 | 2024-06-25 | 株式会社コベルコE&M | WELDING TRAINING SYSTEM, WELDING TRAINING METHOD, AND PROGRAM |
| WO2023135671A1 (en) * | 2022-01-12 | 2023-07-20 | 株式会社コベルコE&M | Welding training system, welding training method, and program |
| CN116884289B (en) * | 2023-06-26 | 2025-09-09 | 浙江省二建建设集团有限公司 | Virtual welding method, virtual welding system, intelligent terminal and storage medium |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160022496A1 (en) * | 2014-07-25 | 2016-01-28 | Snap-On Incorporated | Auto-Darkening Welding Helmet |
| US20160260261A1 (en) * | 2015-03-06 | 2016-09-08 | Illinois Tool Works Inc. | Sensor assisted head mounted displays for welding |
| US20180130376A1 (en) * | 2016-11-07 | 2018-05-10 | Lincoln Global, Inc. | Welding trainer utilizing a head up display to display simulated and real-world objects |
Family Cites Families (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4931018A (en) | 1987-12-21 | 1990-06-05 | Lenco, Inc. | Device for training welders |
| US7024342B1 (en) | 2000-07-01 | 2006-04-04 | Mercury Marine | Thermal flow simulation for casting/molding processes |
| CA2482240A1 (en) | 2004-09-27 | 2006-03-27 | Claude Choquet | Body motion training and qualification system and method |
| AT502283B1 (en) | 2005-07-15 | 2007-05-15 | Fronius Int Gmbh | WELDING PROCESS AND WELDING SYSTEM DETERMINING THE POSITION OF THE WELDING BURNER |
| US7580821B2 (en) | 2005-08-10 | 2009-08-25 | Nvidia Corporation | Application programming interface for fluid simulations |
| US8747116B2 (en) | 2008-08-21 | 2014-06-10 | Lincoln Global, Inc. | System and method providing arc welding training in a real-time simulated virtual reality environment using real-time weld puddle feedback |
| US9230449B2 (en) | 2009-07-08 | 2016-01-05 | Lincoln Global, Inc. | Welding training system |
| US9101994B2 (en) | 2011-08-10 | 2015-08-11 | Illinois Tool Works Inc. | System and device for welding training |
| ES2438440B1 (en) | 2012-06-13 | 2014-07-30 | Seabery Soluciones, S.L. | ADVANCED DEVICE FOR SIMULATION-BASED WELDING TRAINING WITH INCREASED REALITY AND REMOTE UPDATE |
| US9368045B2 (en) | 2012-11-09 | 2016-06-14 | Illinois Tool Works Inc. | System and device for welding training |
| US9583023B2 (en) | 2013-03-15 | 2017-02-28 | Illinois Tool Works Inc. | Welding torch for a welding training system |
| US10032388B2 (en) | 2014-12-05 | 2018-07-24 | Illinois Tool Works Inc. | Augmented and mediated reality welding helmet systems |
| US10406638B2 (en) | 2015-02-27 | 2019-09-10 | Illinois Tool Works Inc. | Augmented vision system with active welder guidance |
| WO2016144744A1 (en) | 2015-03-09 | 2016-09-15 | Illinois Tool Works Inc. | Methods and apparatus to provide visual information associated with welding operations |
| US10913125B2 (en) | 2016-11-07 | 2021-02-09 | Lincoln Global, Inc. | Welding system providing visual and audio cues to a welding helmet with a display |
Also Published As
| Publication number | Publication date |
|---|---|
| US11514816B2 (en) | 2022-11-29 |
| EP3928295A1 (en) | 2021-12-29 |
| US20200265748A1 (en) | 2020-08-20 |
| CA3130746A1 (en) | 2020-08-27 |
| WO2020172291A2 (en) | 2020-08-27 |
| US20240346951A1 (en) | 2024-10-17 |
| WO2020172291A3 (en) | 2020-10-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230080145A1 (en) | Adjustable visual effects simulating auto darkening lenses in augmented reality welding systems | |
| US12198568B2 (en) | Weld training simulations using mobile devices, modular workpieces, and simulated welding equipment | |
| US11670191B2 (en) | Systems and methods to provide weld training | |
| US20220198955A1 (en) | Weld training simulations using mobile devices, modular workpieces, and simulated welding equipment | |
| US11721231B2 (en) | Weld training simulations using mobile devices, modular workpieces, and simulated welding equipment | |
| EP3400587B1 (en) | Systems and methods to provide weld training | |
| US12205489B2 (en) | Weld training systems with resettable target tool images | |
| US11676510B2 (en) | Welding simulation systems with observation devices |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owners: SEABERY NORTH AMERICA INC. (Maryland); ILLINOIS TOOL WORKS INC. (Illinois). Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: MARQUINEZ TORRECILLA, PEDRO GERARDO; BECKER, WILLIAM JOSHUA; GUNIA, PAVEL; SIGNING DATES FROM 20220603 TO 20220727; REEL/FRAME: 062078/0212 |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STCB | Information on status: application discontinuation | ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |