CA2554498A1 - Body motion training and qualification system and method - Google Patents

Body motion training and qualification system and method

Info

Publication number
CA2554498A1
Authority
CA
Canada
Prior art keywords
user
training exercise
simulated
training
dynamic environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CA002554498A
Other languages
French (fr)
Other versions
CA2554498C (en)
Inventor
Claude Choquet
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CA002482240A (CA2482240A1)
Application filed by Individual
Priority to CA2554498A (CA2554498C)
Publication of CA2554498A1
Application granted
Publication of CA2554498C
Legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 Teaching not covered by other main groups of this subclass
    • G09B 19/003 Repetitive work cycles; Sequence of movements

Abstract

The system allows training and qualification of a user performing a skill-related training exercise involving body motion in a workspace. A training environment is selected through a computer apparatus, and variables, parameters and controls of the training environment and the training exercise are adjusted. Use of an input device by the user is monitored. The 3D angles and spatial coordinates of reference points related to the input device are monitored through a detection device. A simulated 3D dynamic environment reflecting effects caused by actions performed by the user on objects is computed in real time as a function of the training environment selected.
Images of the simulated 3D dynamic environment in real time are generated on a display device viewable by the user as a function of a computed organ of vision-object relation. Data indicative of the actions performed by the user and the effects of the actions are recorded and user qualification is set as a function of the recorded data.

Description

BODY MOTION TRAINING AND QUALIFICATION SYSTEM AND METHOD
FIELD OF THE INVENTION
The present invention relates to simulators for teaching, training, qualifying and certifying purposes, and more particularly to a body motion training and qualification system and method.
RELATED ART
Many skills involve a high level of dexterity from workers. For example, a surgeon must be highly manually skilled to perform operations on patients. Another skill involving a high level of dexterity is welding. Minute details in a welding operation may determine, for example, whether a structure collapses or not. The environment in which a welder acts is also particular, as it more often than not involves a degree of danger, dust and darkness, thereby adding to the difficulty of performing a good welding operation.
A high level of dexterity or other forms of body motion may be required from athletes practicing certain sports, or from humans or animals practicing certain activities.
For example, body motion is crucial for divers or skiers.
Known in the art are US patents Nos. 4,988,981 (Zimmerman et al.), 5,554,033 (Bizzi et al.), 5,805,287 (Pettersen et al.), 5,821,943 (Shashua), 5,846,086 (Bizzi et al.), 6,166,744 (Jaszlics et al.), 6,225,978 (McNeil), 6,304,050 (Skaar et al.), 6,396,232 (Haanpaa et al.), 6,425,764 (Lamson), 6,478,735 (Pope et al.), 6,574,355 (Green), 6,597,347 (Yasutake), 6,697,044 (Shahoian et al.), 6,766,036 (Pryor), French patent application No. 2,827,066 (Dasse et al.), Canadian patent No. 2,311,685 (Choquet) and Canadian patent application No. 2,412,109 (Choquet). These patents or applications provide examples of systems, devices or methods related to virtual reality simulators. In many cases, a haptic or force feedback is provided to the user during the simulation process to achieve a more realistic effect of the conditions that a user may encounter in the simulated environment. However, such a haptic or force feedback is often unnecessary, especially when evaluating qualification of the user or for training purposes. Indeed, such a haptic or force feedback is generally used when the user has transgressed some requirements of the training process, and often involves the use of complex or expensive devices. It is like using negative reinforcement instead of positive reinforcement to help the user correct his/her skill under training. In some cases, the position of the tool used to perform the training is simulated, but the body parts of the user involved during the training are ignored, for example the position of the head of the user during the training process. Also, the working space used for performing the training is often small and imposes training limits, preventing the user from being immersed in the simulation process.
Coordination of hand, head and gaze is addressed in the specific field and conditions of flight simulators. But the environment and context are far different from those of manual dexterity or body motion, and the corresponding systems and methods have so far been impractical and expensive for body motion training and qualification. Furthermore, current manual dexterity simulators are not able to duplicate the visible world in a virtual world without compromises. For those who want manual dexterity training of a skill, a profession, a sport or reeducation, the current simulators are unsatisfactory for rendering the visible world at a high level. Also, current simulator systems are impractical for preproduction training of an operator who has to perform a task requiring a high level of manual dexterity due, for example, to difficult work conditions.
SUMMARY
An object of the invention is to provide a system and a method which both satisfy the need for body motion training and qualification.
Another object of the invention is to provide such a system and a method which integrate production data in real time and do not need force feedback.
Another object of the invention is to provide such a system and a method which are able to process spatial variables like a typical piece-contact distance, an angle of attack and a tilt angle having a direct effect on the activity performed, and which realistically reproduce possible defects resulting from a poor performance and enable the analysis and qualification of the defects and of the user skill.
According to one aspect of the present invention, there is provided a system for training and qualification of a user performing a skill-related training exercise involving body motion in a workspace, comprising:
an input device operable by the user in the workspace when performing the training exercise, the input device being such as to provide the user with a physical feeling of an object normally used to perform the training exercise;
a detection device positioned non-invasively with respect to the workspace, for measuring 3D angles and spatial coordinates of reference points relative to the input device and an organ of vision of the user during performance of the training exercise; a display device viewable by the user during performance of the training exercise;
a computer apparatus connectable with the input device, the detection device and the display device, the computer apparatus having a memory with computer readable code embodied therein, for execution by the computer apparatus for:
selecting a training environment related to the training exercise to be performed from a training database;
adjusting variables, parameters and controls of the training environment and training exercise;
monitoring use of the input device by the user;
monitoring the 3D angles and spatial coordinates measured by the detection device and computing an organ of vision-object relation as a function thereof;
computing a simulated 3D dynamic environment in real time as a function of the training environment selected, the simulated 3D dynamic environment reflecting effects caused by actions performed by the user on objects in the simulated 3D dynamic environment as monitored from the input device and the detection device;
generating images of the simulated 3D dynamic environment in real time on the display device as a function of the organ of vision-object relation;
recording data indicative of the actions performed by the user and the effects of the actions; and setting user qualification as a function of the recorded data.
According to another aspect of the present invention, there is also provided a computer-implemented method for training and qualification of a user performing a skill-related training exercise involving body motion in a workspace, comprising:
selecting a training environment related to the training exercise to be performed from a training database;
adjusting variables, parameters and controls of the training environment and training exercise;
monitoring use of an input device providing the user with a physical feeling of an object normally used to perform the training exercise;
measuring 3D angles and spatial coordinates of reference points relative to the input device and an organ of vision of the user during performance of the training exercise;
computing an organ of vision-object relation as a function of the 3D angles and spatial coordinates;
computing a simulated 3D dynamic environment in real time as a function of the training environment selected, the simulated 3D dynamic environment reflecting effects caused by actions performed by the user on objects in the simulated 3D dynamic environment as tracked from the input device and the 3D angles and spatial coordinates;
generating images of the simulated 3D dynamic environment in real time on a display device viewable by the user during performance of the training exercise as a function of the organ of vision-object relation;
recording data indicative of the actions performed by the user and the effects of the actions; and setting user qualification as a function of the recorded data.
The system and the method have many possible uses. For example, they may be used as a helping tool for remote hiring or for remote annual wage evaluation.
They may also be used for performing accessibility tests prior to practicing the real work, or to bid certain works according to code requirements, or for preproduction training, or for remote monitoring of an operator during work. In the bid or remote hiring cases, plants will have the possibility of cloning a scaled-down engineering division. Consulting firms winning a turnkey contract for plant construction abroad will have the possibility to hire local workers while making sure that they are apt to skillfully perform the work. The system and the method are particularly usable in the field of welding, although they are not limited to this field.
BRIEF DESCRIPTION OF THE DRAWINGS
A detailed description of preferred embodiments will be given herein below with reference to the following drawings, in which like numbers refer to like elements:
Figure 1 is a perspective view of a training station and workspace of the disclosed system for training and qualification.
Figure 2 is a schematic diagram illustrating an image on a display device of the disclosed system.
Figure 3 is a schematic diagram of a computer station of the disclosed system.
Figure 4 is a schematic diagram of a detection device of a training station of the disclosed system.
Figure 5 is a flowchart illustrating a possible procedure implemented by the disclosed system.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The skills related to welding have been chosen in this disclosure to describe embodiments of the invention for illustrative purposes only. It should be understood that any other skill involving a degree of manual dexterity or other kind of body motion and a qualification thereof is contemplated. As used in connection with this disclosure, the expression "monitoring" refers to an action consisting of measuring, checking, watching, recording, keeping track of, regulating or controlling, on a regular or ongoing basis, with a view to collecting information.
Many occupations, professions, works, crafts, trades, sports and other activities, possibly with sensitive reaction, require minimal qualifications to operate a tool, a machine, etc., i.e. certain skills involving a manual or body motion activity. A common point of these skills resides in spatial location tracking of the body and/or object to obtain information which may be called spatial essential variables (S.E.V.). These S.E.V. are specific to each activity. Their number varies and their interrelationships may be more or less complex. A simulator system for inserting a nail with a hammer will not have the same future as a simulator system for cutting a diamond, for performing a surgical operation, or even for welding, brazing, or properly cutting various types of meat.
Welding involves more than forty (40) essential variables (E.V.). There are even standards attempting to define nomenclature related to the E.V. in this field to avoid confusion on the meaning of these E.V. A worker such as a carpenter must also have manual dexterity consistent with the laws of physics (kinematics, dynamics), for example for hammering in a nail to a good depth. A welder, for his/her part, has to face the following common saying in the world of welding: "a nice weld may be poor and an inelegant weld may be good". This expression is based on the fact that the welder has no access to the final microscopic results showing the regeneration of the grains or the root fusion ensuring welding integrity. Likewise, a surgeon has no access to the final results at the cellular level ensuring cellular regeneration. Each of these activities commonly involves manual dexterity or neuromuscular variables in addition to neurocerebral knowledge.
All the activities requiring psychomotor knowledge also require a capacity for recognition of space location.
The combination of 3D location angles and XYZ coordinate data gives the reference points required for positioning purposes. The kinematics of these reference points enables finding trajectories, speeds and accelerations of points in space and associating them with masses, which can then be used in the processing of manual dexterity.
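As an illustration of this kinematic processing, the following Python sketch derives speed and acceleration estimates from sampled 3D reference points by finite differences; the sampling layout, interval and function name are assumptions for illustration, not part of the disclosed system.

```python
import numpy as np

def kinematics(positions: np.ndarray, dt: float):
    """Estimate velocity and acceleration of a tracked reference point.

    positions: (N, 3) array of sampled XYZ coordinates, one row per frame.
    dt: sampling interval in seconds.
    Returns (velocity, acceleration) arrays of shape (N, 3).
    """
    velocity = np.gradient(positions, dt, axis=0)     # first derivative
    acceleration = np.gradient(velocity, dt, axis=0)  # second derivative
    return velocity, acceleration

# Example: a tracked point moving along a straight line at constant speed.
t = np.linspace(0.0, 1.0, 50)
path = np.stack([0.1 * t, np.zeros_like(t), np.zeros_like(t)], axis=1)
v, a = kinematics(path, dt=t[1] - t[0])
print(v[0], a[0])  # roughly [0.1, 0, 0] m/s and near-zero acceleration
```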
Referring to Figures 1 and 3, there is shown a system for training and qualification of a user performing a skill-related training exercise involving body motion in a workspace 2.
As used in connection with this disclosure, the expression "workspace" represents a space in which a training exercise is performed. The space may be arranged as a scene or a setup with physical objects related to the training exercise if desired.
In the illustrated case, the training exercise relates to the field of welding and consists in welding two pieces 4, 6 together, the pieces 4, 6 being positioned in the space as if a real welding operation were to be achieved. Thus, the workspace may comprise physical objects subjected to the actions performed by the user during the training exercise.
The system comprises an input device 8 operable by the user in the workspace 2 when performing the training exercise. The input device 8 is such as to provide the user with a physical feeling of an object normally used to perform the training exercise.
It may be made of a physical object imitating a tool normally used to perform the training exercise.
In the illustrated case, the input device is made of a dummy welding torch.
The input device may take the form of any accessory suitable to the training exercise to be carried out.
For example, it may be a pen to practice writing skills, a scalpel to practice surgery skills, a tool to practice workmanship, etc. It may be simply a computer mouse 10. It may also be a sensor glove or suit translating body motion by the user to commands for manipulating objects in a virtual environment.
A detection device 12 is positioned non-invasively with respect to the workspace 2, for measuring 3D angles and spatial coordinates of reference points relative to the input device 8 and an organ of vision of the user during performance of the training exercise.
As used in connection with this disclosure, the expression "non-invasively" refers to detection techniques whereby the detection device does not interfere with the body motion and the vision of the user during the training exercise. The detection device may be positioned within, outside or partially within the workspace 2, as long as it does not interfere with the body motion and the vision of the user. As used in connection with this disclosure, the expression "organ of vision" represents something that is used to see objects in the workspace 2, and which defines a viewpoint. It may be but is not limited to the eyes of a user, his/her head, a display device or other accessory such as a helmet 26 through which or by which a user may see the objects in the workspace 2. In the case where a sensor glove or suit is used, sensors in the glove or suit may form parts of the detection device 12. In the illustrated case, the detection device 12 comprises a pair of electronic sensors 14, 16 on sides of the workspace 2 and another electronic sensor 18 positioned on a side of the workspace 2 opposite to the user.
Referring to Figure 4, the electronic sensors 14, 16 are directed to at least cover portions of the workspace 2 where the user may be led to move the input device during the training exercise, in order to be capable of continuously tracking the input device 8 during the training exercise. The electronic sensor 18 is arranged to at least cover a portion of the workspace 2 where the organ of vision 26 of the user may be moved during the training exercise, in order to be capable of continuously tracking the organ of vision 26 during the training exercise. Cones 20, 22 depict the fields of view of the electronic sensors 14, 16 while cone 24 depicts the field of view of the electronic sensor 18. Reflectors positioned on the input device 8 may be used in combination with the electronic sensors 14, 16, thereby forming emitter-receptor arrangements, to facilitate the angle and location tracking of reference points relative to the input device 8. Likewise, reflectors positioned near the organ of vision of the user, for example on the helmet 26, may be used in combination with the electronic sensor 18, thereby also forming emitter-receptor arrangements, to facilitate tracking of reference points relative to the organ of vision and determination of the viewpoint of the user. The electronic sensors 14, 16 may for example be formed of video cameras while the electronic sensor 18 may be formed of an infrared detector. Other non-invasive detection technologies may be used and combined together if desired, for example video, ultrasound, MEMS (Micro Electro-Mechanical Systems), etc. The detection device 12 preferably remains operational at all times during performance of the training exercise, for tracking purposes.
The detection device 12 may measure or otherwise track the input device 8 and the organ of vision 26 at a regular interval or continuously, depending on the needs. The positioning of the detection device may be adapted to the kind of training exercise to be performed by the user. For example, the electronic sensors 14, 16, 18 may be positioned at locations different from those illustrated in Figures 1 and 4, as long as they fulfil their function as hereinabove explained. Depending on the kind of sensor, only one sensor may be used to track the input device 8 if desired.
Referring to Figures 1 and 3 again, a display device 28 viewable by the user during performance of the training exercise is provided. The display device may be integrated in a pair of goggles, the helmet 26 as in the illustrated case, or any other suitable structure wearable by the user during the training exercise so as to be viewable at all times during performance of the training exercise. Depending on the training exercise to be performed by the user, the display device 28 may be provided by a computer monitor 30. In the case where it is integrated in the helmet 26 or a pair of goggles, the display device 28 may have a degree of transparency letting the user see the workspace 2.
Figure 2 illustrates an example of images generated on the display device 28.

A computer apparatus 32 connectable with the input device 8, the detection device 12 and the display device 28, is provided. The computer apparatus 32 has a memory with computer readable code embodied therein, for execution by the computer apparatus 32 for selecting a training environment related to the training exercise to be performed from a training database. The computer apparatus 32 also adjusts variables, parameters and controls of the training environment and training exercise. The adjustments may be carried out automatically or manually, depending on the needs of the training exercise.
The computer apparatus 32 monitors use of the input device 8 by the user, monitors the 3D angles and spatial coordinates measured by the detection device and computes an organ of vision-object relation as a function of the 3D angles and spatial coordinates so measured. The computer apparatus 32 further computes a simulated 3D dynamic environment in real time as a function of the training environment selected, so that the simulated 3D dynamic environment reflects effects caused by actions performed by the user on objects in the simulated 3D dynamic environment as monitored from the input device 8 and the detection device 12. The computer apparatus 32 generates images of the simulated 3D dynamic environment in real time on the display device 28 as a function of the organ of vision-object relation, records data indicative of the actions performed by the user and the effects of the actions, and sets user qualification as a function of the recorded data. The recording of the data may be used for providing a replay mode so that the user, a teacher or a third party may see what the user has done during the training exercise.
The simulated 3D dynamic environment may involve a more realistic virtual representation of the tool than the physical object used as the input device 8. The input device 8 may have a stopwatch circuit measuring elapsed time during the training exercise and providing time data to the computer apparatus 32 for timing functions of the training exercise. The input device 8 may be provided with a haptic interface responsive to a collision condition detected by the computer apparatus 32 during the training exercise while computing the simulated 3D dynamic environment, to relay a sense of physical sensation to the user as a function of the collision condition.
Referring to Figure 2, there is shown an example of images generated on the display device 28 during performance of a training exercise. The images may be mathematically generated by the computer apparatus 32. A visual indicator 34, for example a red dot appearing in the images on the display device 28, or a sound indicator responsive to a collision condition detected by the computer apparatus during the training exercise while computing the simulated 3D dynamic environment, may also be provided to relay the collision condition to the user. The images on the display device 28 may be formed of virtual images superimposed over other virtual images as modelled from the computing of the simulated 3D dynamic environment. The display device 28 may provide informative data 36 superimposed over the images as modelled from the computing of the simulated 3D dynamic environment.
The training exercise may be initiated by a user action on the input device 8 or the computer apparatus 32, for example by pressing a start button 38.
Depending on the kind of training environment, the computing of the simulated 3D dynamic environment may involve virtual modelling of phenomena resulting from the effects, and representation of the phenomena in the images generated on the display device. The phenomena may be, for example, changes of states and properties of matter at interfaces and in portions of objects in the simulated 3D dynamic environment.
The phenomena may also be possible defects resulting from a poor performance of the user during the training exercise. The computing of the simulated 3D dynamic environment may involve determining essential variables typical to a skill associated with the training exercise. The essential variables may be trajectory, speed and skill activity angle of objects interacted with during the training exercise. The computing of the simulated 3D dynamic environment may then involve linking the 3D angles and spatial coordinates of the reference points to the essential variables. The variables, parameters and controls may involve a tolerance degree for qualification of the training exercise. The user qualification may be set by pixel analysis of the data recorded. The computing of the simulated 3D dynamic environment may also involve replicating the physical objects in the simulated 3D dynamic environment.

The data used for computing the simulated 3D dynamic environment may consist of data related to a code of conduct, a code of practice, physics law equations, technical standards and specifications for physical activities requiring a training, a qualification and possibly a certification, and training scenarios and environments, for example elements of tests, parameters and basic controls. The disclosed system may be combined with the system disclosed in Canadian patent No. 2,311,685 (Choquet) for third party certification purposes, and with the system disclosed in Canadian patent application No. 2,412,109 (Choquet) for distributed environment simulation purposes, the disclosures of which are incorporated herein by reference. The computer apparatus 32 may thus be provided with a communication link 40 for communication with other network components via a communication network.
The physics law equations may be used to create, deposit or fill virtual matter, to create matter defects, and to move the virtual matter in space. Virtual matter may for example be molten metal, paint, ink or lead, as would be deposited with tools like a welding gun, a brazing gun, a painting gun or a pencil. The virtual matter may also be animal or human virtual skin and muscles, meat, vegetal matter, or metal to be cut. The object used to perform the training exercise may be a real or original or dummy physical object like a pencil, a welding handle, a brazing handle, a lancet/scalpel, a chisel, etc., or simply a mass such as a glove which reproduces weight and inertia. It may also be reproduced virtually, for example a hairdresser's chisel.
The organ of vision-object relation allows virtual matter to be deposited/moved/positioned with a translation motion from a point A to a point B located anywhere in space.
Thus a translation training exercise may be taught as a function of a motion straightness tolerance. The organ of vision-object relation also allows processing of time-related data as provided by the stopwatch, for example an amount of matter deposited. A training exercise may thus be taught as a function of acceleration and speed range tolerances.
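A minimal sketch of how such a translation exercise could be scored against a motion straightness tolerance and a speed range tolerance; the function name, tolerance values and data layout are hypothetical, not taken from the patent.

```python
import numpy as np

def check_translation(points: np.ndarray, straightness_tol: float,
                      speed_range: tuple, dt: float) -> bool:
    """Check a recorded translation from point A to point B against tolerances.

    points: (N, 3) sampled positions of the tool tip, first row A, last row B.
    straightness_tol: maximum allowed deviation from the ideal A-B line.
    speed_range: (min, max) allowed speed magnitudes.
    dt: sampling interval in seconds.
    """
    a, b = points[0], points[-1]
    axis = (b - a) / np.linalg.norm(b - a)
    # Perpendicular distance of each sample from the ideal A-B line.
    rel = points - a
    deviation = np.linalg.norm(rel - np.outer(rel @ axis, axis), axis=1)
    # Instantaneous speeds between consecutive samples.
    speeds = np.linalg.norm(np.diff(points, axis=0), axis=1) / dt
    return (deviation.max() <= straightness_tol
            and speed_range[0] <= speeds.min()
            and speeds.max() <= speed_range[1])
```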
The display device 28 allows seeing the work carried out in real time while making it possible to give properties to the matter (solid, liquid and gas) and to associate an output after processing with the stopwatch. For example, a virtual digital thermography temperature may be displayed on the display device 28 to show heat intensities with color arrangements, to thereby simulate temperature according to time. The measurement unit of the virtual matter may be a pixel. The pixels may be studied to show, simulate, qualify and certify the manual dexterity of the user.
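As a hedged illustration of this thermography rendering idea, a simulated temperature computed for a pixel could be mapped to a display color with a simple ramp such as the following; the temperature bounds and the linear blue-to-red mapping are assumptions for illustration, not the patent's method.

```python
def temperature_to_rgb(temp_c: float, t_min: float = 20.0,
                       t_max: float = 1600.0) -> tuple:
    """Map a simulated temperature to a blue-to-red display color.

    A simple linear ramp: cold pixels render blue, hot pixels red.
    Returns an (R, G, B) tuple with components in 0..255.
    """
    frac = min(max((temp_c - t_min) / (t_max - t_min), 0.0), 1.0)
    return (int(255 * frac), 0, int(255 * (1.0 - frac)))

# Example: ambient metal renders blue, molten metal renders red.
print(temperature_to_rgb(25.0), temperature_to_rgb(1500.0))
```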

The disclosed system includes the capture of three-dimensional data, then allocates properties to material, and also provides visual and sound feedback if required. There are different types of transmitting surfaces and their selection depends on the signals to be used, for example ultrasound, infrared, digital or video imagery, etc., which in turn are selected based on practical and cost considerations and design choices.
Colors, shapes and motions in the images may be used to detect objects. The objects in the workspace 2 may be real or virtual. In the illustrated example, the objects to be welded are made of a real assembly of 2 aluminum plates normalized according to the CSA W47.2 standard. A tool such as the welding gun 8 may be real or virtual, in which case a ballasted glove may be used to collect three-dimensional data related to the body motion of the user. An activation/deactivation function of the stopwatch may be provided on the tool. In the illustrated example, the tool is in the form of a handle for usual production welding in which the activation/deactivation function (trigger) is connected to the computer-implemented process. The handle may be provided with emitting, transmitting or reflecting surfaces depending on the type of detection device 12 used.
Determination of the spatial location of the tool 8 may be achieved by means of a lookup table method, a triangulation method or any other method allowing determination of the spatial location of the tool 8. Preferably, the method will allow detection with six degrees of freedom for spatial location determination.
The lookup table method consists in detecting a space location by a matrix tracking process, for example by using two matrix devices positioned to cover two different planes to capture the 3D location of the scene in the workspace 2. In this case, the tool 8 may be provided with two emitters, transmitters or reflectors, depending on whether the matrix devices are active or passive and on their configuration, to provide position data processed by the computer apparatus 32 to transpose the tool 8 into the virtual environment, for example by applying trigonometry rules.
The triangulation method allows detecting the tool 8 in the workspace 2 with only one detection device. In this case, the tool may be provided with three emitters, transmitters or reflectors positioned triangularly. When the detection device detects the emitters, transmitters or reflectors, it produces X, Y, Z position data and angle data (for example pitch, roll and yaw).
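A minimal sketch of how such a triangular marker arrangement could yield position and orientation data, assuming the detector already reports the three marker coordinates; the frame convention and Euler-angle extraction shown here are one possible choice, not the patented method.

```python
import numpy as np

def tool_pose(p1, p2, p3):
    """Derive tool position and orientation from three markers
    arranged triangularly on the tool, as reported by the detector.

    Returns (origin, R): origin is the triangle centroid, R is a 3x3
    rotation matrix whose columns are the tool's local axes.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    origin = (p1 + p2 + p3) / 3.0
    x = p2 - p1
    x /= np.linalg.norm(x)          # first in-plane axis
    n = np.cross(p2 - p1, p3 - p1)
    n /= np.linalg.norm(n)          # triangle normal
    y = np.cross(n, x)              # completes a right-handed frame
    return origin, np.column_stack([x, y, n])

def yaw_pitch_roll(R):
    """Extract Z-Y-X Euler angles (yaw, pitch, roll) in radians."""
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll
```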
The methods may be used alone or combined together or with other tracking methods. The choice may depend on the access limitations of the workspace 2 and the kind of training exercise to be performed, while seeking optimization for the user visual feedback. In the field of welding, the training tool may be bulky and heavy while in another field like surgery, the training tool may be small and light. Thus, the type of detection devices and the tracking methods will likely be different from case to case, in addition to economic and efficiency issues.
During the training exercise, the emitters, transmitters or reflectors on the tool handle 8 are detected by the receivers, which transmit position data to the computer apparatus for processing. The real coordinates of the emitters, transmitters or reflectors may be computed in real time when the receivers detect them simultaneously.
Once the real positions of the emitters, transmitters or reflectors are known, each displacement of the handle 8 may be reported in real time in a simulator, such as an ARC simulator. The detected positions allow computing three-dimensional information (for example, the final length of an electric arc, the work angle and drag angle) in the computer process implementing an algorithm. Once the computer process is completed, the resulting information may be subjected to a certification process as disclosed in Canadian patent No. 2,412,109 (Choquet). It will then be possible to combine the processed information (for example, the 3D image capture, the image reconstruction for vision feedback, tactile and sound feedback for educational considerations) with the information collected by the system of the aforesaid patent to obtain all the relevant information for achieving a standard welding operation by means of virtual certification now available via a communication network, such as the Internet.
Referring to Figure 5, there is shown a flowchart illustrating a possible procedure implemented by the disclosed system in the case where the training exercise consists in performing a welding operation.

The first step, as depicted by block 44, may consist of a user authentication followed by validation of the user as depicted by block 46. The user authentication may be achieved in various manners, for example through a web cam for visual proof, or a personal identification number and password assigned to the user.
As depicted by block 48, a virtual certification process may be initiated if desired, once the user has been validated by the system. The virtual certification process refers to a database 50 to search for law books and standards to be used for the training exercise.
The training exercise may be selected as a function of these standards and the variables, parameters and controls of the training environment may be adjusted if needed or if desired.
A search of the peripherals connected to the computer apparatus may be achieved, as depicted by block 52. This step may set the electronic receivers forming the detection device into operation to measure the position information according to the case required for the application. It may also set the display device in operation to display the virtual environment and the relevant information regarding the training exercises to the user.
Possible sound equipment may also be set in operation, while detection and setting of the tools used by the user in operation may also be achieved, as depicted by blocks 54, 56 and 58.
The user is now ready to perform the training exercise. As depicted by block 60, the user may position the object, for example the tool 8 provided with the spatial tracking elements of the detection device 12, as needed. The computer apparatus 32 begins creation of the scene activity and, if necessary, starts the stopwatch and the processing algorithm in response, for example, to an action of a finger on a trigger or an action triggered by a combination of signals from the detection device 12 and the computer application, as depicted by block 62. Other triggering events or conditions may be implemented, for example a condition where an electric circuit is closed or opened by appropriate displacement of the tool 8 by the user. Such a triggering condition may be obtained, in the case where the tool 8 forms a welding electrode, by bringing the electrode to an arc striking distance from the object to be welded, causing formation of an arc. Then, the user must find and preserve the appropriate distance to avoid arc extinction, as in GMAW (Gas Metal Arc Welding). A different triggering action, by striking the electrode against the object, may be required as in SMAW (Shielded Metal Arc Welding).
The information (motion) to be processed with respect to the reference material, for example the tool 8, is captured by the computer apparatus 32 from the detection device 12, as depicted by block 64. As hereinabove indicated, a glove provided with virtual or real functions, like a real tool, may be used as the tool 8.
As depicted by block 66, the visual and sound data are processed by the computer apparatus 32 to compute the 3D space location of the reference points with respect to the material used as spatial tracking elements of the detection device for the object/tool 8. Likewise, the 3D space location of the reference points with respect to the user viewpoint is computed using the data, and the organ of vision-object relation is also computed.
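A hedged sketch of one way such an organ of vision-object relation could be computed, reducing it to the distance to the tool and the angle between the gaze direction and the line of sight to the tool; the inputs and the reduction to two scalars are illustrative assumptions, not the patent's formulation.

```python
import numpy as np

def vision_object_relation(eye_pos, gaze_dir, tool_pos):
    """Relate the user's viewpoint to the tracked tool.

    Returns (distance, off_axis_angle): how far the tool is from the
    viewpoint and how far it lies off the gaze direction, in radians.
    """
    eye_pos, gaze_dir, tool_pos = (
        np.asarray(v, dtype=float) for v in (eye_pos, gaze_dir, tool_pos))
    line_of_sight = tool_pos - eye_pos
    distance = np.linalg.norm(line_of_sight)
    cos_angle = np.dot(gaze_dir, line_of_sight) / (
        np.linalg.norm(gaze_dir) * distance)
    return distance, np.arccos(np.clip(cos_angle, -1.0, 1.0))
```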
As depicted by block 68, a position of the reference points in the 3D space is determined by simultaneous processing of the signals produced by the detection device 12, while the material moves tridimensionally, kinematically and dynamically.
In the example case of a welding operation, a hot spot is thermodynamically created virtually in the space or environment. The materials to be joined are assembled with a welding bead produced by translation of this hot spot, fed virtually with metallurgical essential variables enabling fusion.
The work angle, travel angle and the final length data are determined in real time by processing the data captured by the detection device 12, with the reference points.
These E.V. are manageable with the capture of the position in space and its relative location with respect to the viewpoint.
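As an illustration only, these E.V. could be derived from the tracked pose roughly as follows; the angle conventions are simplified (both angles measured from the plate normal) and the names are hypothetical, not the patent's algorithm.

```python
import numpy as np

def welding_variables(tip, axis, travel_dir, surface_point, surface_normal):
    """Estimate three welding essential variables from tracked data.

    tip: XYZ position of the electrode tip.
    axis: unit vector along the electrode, from workpiece toward handle.
    travel_dir: unit vector of torch travel along the joint.
    surface_point, surface_normal: a point on and the normal of the plate.
    Returns (arc_length, work_angle_deg, travel_angle_deg).
    """
    tip, axis, travel_dir, surface_point, surface_normal = (
        np.asarray(v, dtype=float)
        for v in (tip, axis, travel_dir, surface_point, surface_normal))
    surface_normal = surface_normal / np.linalg.norm(surface_normal)
    # Arc length: perpendicular distance from the tip to the plate plane.
    arc_length = abs(np.dot(tip - surface_point, surface_normal))
    # Work angle: tilt of the electrode axis away from the plate normal.
    work_angle = np.degrees(
        np.arccos(np.clip(np.dot(axis, surface_normal), -1.0, 1.0)))
    # Travel (drag/push) angle: tilt of the electrode along the travel direction.
    travel_angle = np.degrees(
        np.arcsin(np.clip(np.dot(axis, travel_dir), -1.0, 1.0)))
    return arc_length, work_angle, travel_angle
```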
As depicted by block 70, the images are generated on the display device 28, so that the user may look at the creation of the bead deposit and correct in real time the final length or the tool angles to thus obtain a weld in conformity with the standards in force. At the same time, when the hand provided with the mass of the tool 8 moves in the real space, the images on the display device 28 reflect a welding bead having all the features of a real weld.

A sound feedback loop may also be provided during the exercise to guide the user in performing a proper transfer mode and for user recognition purposes.
Once the training exercise is finished and the detection is stopped, as again depicted by block 62, the exercise is evaluated by processing the data created and recorded during performance of the exercise. The data may be created as per the certification algorithm as disclosed in Canadian patent application No. 2,311,685 (Choquet), in conformity with a standard identified at the time of the user authentication. The processed data may be validated with the database 50 and with the mathematical equations used to process the size of the bead, its root penetration and the acceptable or unacceptable physical state according to acceptance criteria based on a code of conduct or rule of the art.
The computer-implemented process may be distributed across network components in communication with the computer apparatus 32.
The data structures and code used by the computer apparatus 32 are typically stored on a computer readable storage medium or memory, which may be any device or medium that can store code and/or data for use by a computer system. This includes, but is not limited to, magnetic and optical storage devices such as disk drives, magnetic tape, CDs (compact discs) and DVDs (digital video discs), and computer instruction signals embodied in a transmission medium (with or without a carrier wave upon which the signals are modulated). For example, the transmission medium may include a communications network, such as the Internet. The computer apparatus 32 may be a local or a remote computer such as a server and include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a personal organizer, a device controller, and a computational engine within an appliance.
The computer-implemented process may be embodied in a computer program product comprising a memory having computer readable code embodied therein, for execution by a processor. On a practical level, the computer program product may be embodied in software enabling a computer system to perform the operations, described above in detail, supplied on any one of a variety of media. An implementation of the approach and operations of the invention may be statements written in a programming language.
Such programming language statements, when executed by a computer, cause the computer to act in accordance with the particular content of the statements.
Furthermore, the software that enables a computer system to act in accordance with the invention may be provided in any number of forms including, but not limited to, original source code, assembly code, object code, machine language, compressed or encrypted versions of the foregoing, and any and all equivalents.
The disclosed system may comprise multiple input devices operable by a same user or different users, and may track two or more hands as well as objects manipulated by these hands. The sound, visual or contact feedback may be implemented together or individually in production situations. The visual feedback provided to the user during performance of the training exercise is instantaneous. It allows production monitoring, and allows the user to understand that this equipment is not for spying on him/her but is a companion during phases of difficult access or for other reasons. The sound feedback may be generated from pre-recorded sounds associated with electric arc transfer modes or any other effects related to the training exercise and environment.
For example, the buzzing of a bee is typical of an axial spray mode, frying of a globular mode, and cracking of a short-circuit mode. The contact feedback may be used to show that when a welder touches the virtual plate, an undercut is created, which is a normal defect as soon as there has been a contact with the plate. With the disclosed system, a welder may be provided with a helmet having an LCD display showing real time data while the welder produces real metal in fusion as the welder welds. In addition, a virtual image of the metal in fusion could be superimposed over the real image in the field of vision of the welder to thereby provide the welder with the information required to achieve a standard-complying and earth-friendly welding; with this equipment, the welder will pollute the environment less as he/she will achieve clean welding. Thus, the disclosed system may be used to train while integrating real time production data. The use in production of the various visual, contact and sound feedbacks, together or individually, for monitoring purposes is possible as the user performs the exercise. Thus, the qualification is in real time. For example, a welder knows that by keeping his/her speed, trajectory, angle and distance in conformity, and provided that the electric element is also in conformity, the welding operation is in conformity with the standard. At the level of the sound feedback, it is possible to incorporate a sound into a real welding machine to confirm that the operation is in conformity. At the level of the visual feedback, the speed, trajectory, angle and distance data may be provided on the display device 28 at all times.
The disclosed system may be used for remotely controlled production, namely for e-production or teleproduction purposes.
In the disclosed system, the body motion in a 3D scene environment can be as complete as in reality. The disclosed system may also be used for training in nano-environments with nano-tools, for example for nano-forging and micro joining applications.
While embodiments of this invention have been illustrated in the accompanying drawings and described above, it will be evident to those skilled in the art that changes and modifications may be made therein without departing from the essence of this invention.

Claims (44)

1. A system for training and qualification of a user performing a skill-related training exercise involving body motion in a workspace, comprising:

an input device operable by the user in the workspace when performing the training exercise, the input device being such as to provide the user with a physical feeling of an object normally used to perform the training exercise;

a detection device positioned non-invasively with respect to the workspace, for measuring 3D angles and spatial coordinates of reference points relative to the input device and an organ of vision of the user during performance of the training exercise; a display device viewable by the user during performance of the training exercise;

a computer apparatus connectable with the input device, the detection device and the display device, the computer apparatus having a memory with computer readable code embodied therein, for execution by the computer apparatus for:

selecting a training environment related to the training exercise to be performed from a training database;

adjusting variables, parameters and controls of the training environment and training exercise;

monitoring use of the input device by the user;

monitoring the 3D angles and spatial coordinates measured by the detection device and computing an organ of vision-object relation as a function thereof;

computing a simulated 3D dynamic environment in real time as a function of the training environment selected, the simulated 3D dynamic environment reflecting effects caused by actions performed by the user on objects in the simulated 3D
dynamic environment as monitored from the input device and the detection device;

generating images of the simulated 3D dynamic environment in real time on the display device as a function of the organ of vision-object relation;

recording data indicative of the actions performed by the user and the effects of the actions; and setting user qualification as a function of the recorded data.
2. The system according to claim 1, wherein the input device comprises one of a glove and a suit.
3. The system according to claim 1, wherein the input device comprises a physical object imitating a tool normally used to perform the training exercise.
4. The system according to claim 3, wherein the simulated 3D dynamic environment comprises a more realistic virtual representation of the tool than the physical object used as the input device.
5. The system according to claim 1, wherein the input device comprises a stopwatch circuit measuring elapsed time during the training exercise and providing time data to the computer apparatus for timing functions of the training exercise.
6. The system according to claim 1, wherein the input device comprises a haptic interface responsive to a collision condition detected by the computer apparatus during the training exercise while computing the simulated 3D
dynamic environment, the haptic interface relaying a sense of physical sensation to the user as a function of the collision condition.
7. The system according to claim 1, further comprising at least one of a visual indicator and a sound indicator responsive to a collision condition detected by the computer apparatus during the training exercise while computing the simulated 3D dynamic environment and relaying the collision condition to the user.
8. The system according to claim 7, wherein the images on the display device comprise the visual indicator.
9. The system according to claim 1, wherein the detection device is operational at all times during performance of the training exercise.
10. The system according to claim 1, wherein the detection device comprises emitter-receptor arrangements positioned on the input device and on sides of the workspace, and emitter-receptor arrangements positioned on a side of the workspace opposite to the user and near the organ of vision of the user.
11. The system according to claim 1, wherein the display device is integrated in one of a pair of goggles, a helmet and a structure wearable by the user during the training exercise to be viewable at all times during performance of the training exercise.
12. The system according to claim 1, wherein the images on the display device comprise virtual images superimposed over other virtual images as modelled from the computing of the simulated 3D dynamic environment.
13. The system according to claim 1, wherein the display device has a degree of transparency letting the user see the workspace.
14. The system according to claim 1, wherein the display device comprises informative data superimposed over the images as modelled from the computing of the simulated 3D dynamic environment.
15. The system according to claim 1, wherein the training exercise is initiated by a user action on one of the input device and the computer apparatus.
16. The system according to claim 1, wherein the computing of the simulated 3D dynamic environment comprises virtual modelling of phenomena resulting from the effects, and representation of the phenomena in the images generated on the display device.
17. The system according to claim 16, wherein the phenomena comprise changes of states and properties of matter at interfaces and in portions of objects in the simulated 3D dynamic environment.
18. The system according to claim 16, wherein the phenomena comprise possible defects resulting from a poor performance of the user during the training exercise.
19. The system according to claim 1, wherein the computing of the simulated 3D dynamic environment comprises determining essential variables typical to a skill associated with the training exercise.
20. The system according to claim 19, wherein the essential variables comprise trajectory, speed and skill activity angle of objects interacted with during the training exercise.
21. The system according to claim 19, wherein the computing of the simulated 3D dynamic environment comprises linking the 3D angles and spatial coordinates of the reference points to the essential variables.
22. The system according to claim 1, wherein the variables, parameters and controls comprise a tolerance degree for qualification of the training exercise.
23. The system according to claim 1, wherein the user qualification is set by pixel analysis of the data recorded.
24. The system according to claim 1, wherein the workspace comprises physical objects subjected to the actions performed by the user during the training exercise.
25. The system according to claim 21, wherein the computing of the simulated 3D dynamic environment comprises replicating the physical objects in the simulated 3D dynamic environment.
26. A computer-implemented method for training and qualification of a user performing a skill-related training exercise involving body motion in a workspace, comprising:

selecting a training environment related to the training exercise to be performed from a training database;

adjusting variables, parameters and controls of the training environment and training exercise;

monitoring use of an input device providing the user with a physical feeling of an object normally used to perform the training exercise;

measuring 3D angles and spatial coordinates of reference points relative to the input device and an organ of vision of the user during performance of the training exercise;

computing an organ of vision-object relation as a function of the 3D angles and spatial coordinates;

computing a simulated 3D dynamic environment in real time as a function of the training environment selected, the simulated 3D dynamic environment reflecting effects caused by actions performed by the user on objects in the simulated 3D
dynamic environment as tracked from the input device and the 3D angles and spatial coordinates;

generating images of the simulated 3D dynamic environment in real time on a display device viewable by the user during performance of the training exercise as a function of the organ of vision-object relation;

recording data indicative of the actions performed by the user and the effects of the actions; and setting user qualification as a function of the recorded data.
27. The computer-implemented method according to claim 26, wherein the input device comprises a physical object imitating a tool normally used to perform the training exercise, and the simulated 3D dynamic environment comprises a more realistic virtual representation of the tool than the physical object used as the input device.
28. The computer-implemented method according to claim 26, further comprising measuring elapsed time during the training exercise for timing functions of the training exercise.
29. The computer-implemented method according to claim 26, further comprising detecting a collision condition during the training exercise while computing the simulated 3D dynamic environment, and relaying the collision condition to the user.
30. The computer-implemented method according to claim 26, wherein the display device is integrated in one of a pair of goggles, a helmet and a structure wearable by the user during the training exercise to be viewable at all times during performance of the training exercise.
31. The computer-implemented method according to claim 26, further comprising superimposing virtual images over other virtual images in the images on the display device as modelled from the computing of the simulated 3D dynamic environment.
32. The computer-implemented method according to claim 26 wherein the display device has a degree of transparency letting the user see the workspace.
33. The computer-implemented method according to claim 26, further comprising superimposing informative data over the simulated 3D dynamic environment in the images on the display device.
34. The computer-implemented method according to claim 26, further comprising initiating the training exercise by a user action on one of the input device and a computer apparatus implementing the method.
35. The computer-implemented method according to claim 26, wherein the computing of the simulated 3D dynamic environment comprises virtual modelling of phenomena resulting from the effects, and representation of the phenomena in the images generated on the display device.
36. The computer-implemented method according to claim 35, wherein the phenomena comprise changes of states and properties of matter at interfaces and in portions of objects in the simulated 3D dynamic environment.
37. The computer-implemented method according to claim 35, wherein the phenomena comprise possible defects resulting from a poor performance of the user during the training exercise.
38. The computer-implemented method according to claim 26, wherein the computing of the simulated 3D dynamic environment comprises determining essential variables typical to a skill associated with the training exercise.
39. The computer-implemented method according to claim 38, wherein the essential variables comprise trajectory, speed and skill activity angle of objects interacted with during the training exercise.
40. The computer-implemented method according to claim 38, wherein the computing of the simulated 3D dynamic environment comprises linking the 3D
angles and spatial coordinates of the reference points to the essential variables.
41. The computer-implemented method according to claim 26, wherein the variables, parameters and controls comprise a tolerance degree for qualification of the training exercise.
42. The computer-implemented method according to claim 26, wherein the user qualification is set by pixel analysis of the data recorded.
43. The computer-implemented method according to claim 26, wherein the workspace comprises physical objects subjected to the actions performed by the user during the training exercise.
44. The computer-implemented method according to claim 26, wherein the computing of the simulated 3D dynamic environment comprises replicating the physical objects in the simulated 3D dynamic environment.
CA2554498A 2004-09-27 2005-09-26 Body motion training and qualification system and method Expired - Fee Related CA2554498C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA2554498A CA2554498C (en) 2004-09-27 2005-09-26 Body motion training and qualification system and method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CA002482240A CA2482240A1 (en) 2004-09-27 2004-09-27 Body motion training and qualification system and method
CA2,482,240 2004-09-27
PCT/CA2005/001460 WO2006034571A1 (en) 2004-09-27 2005-09-26 Body motion training and qualification system and method
CA2554498A CA2554498C (en) 2004-09-27 2005-09-26 Body motion training and qualification system and method

Publications (2)

Publication Number Publication Date
CA2554498A1 true CA2554498A1 (en) 2006-04-06
CA2554498C CA2554498C (en) 2016-06-28

Family

ID=36998254

Family Applications (1)

Application Number Title Priority Date Filing Date
CA2554498A Expired - Fee Related CA2554498C (en) 2004-09-27 2005-09-26 Body motion training and qualification system and method

Country Status (1)

Country Link
CA (1) CA2554498C (en)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014149398A1 (en) * 2013-03-15 2014-09-25 Illinois Tool Works Inc. Data storage and analysis for a welding training system
US9101994B2 (en) 2011-08-10 2015-08-11 Illinois Tool Works Inc. System and device for welding training
US9352411B2 (en) 2008-05-28 2016-05-31 Illinois Tool Works Inc. Welding training system
US9368045B2 (en) 2012-11-09 2016-06-14 Illinois Tool Works Inc. System and device for welding training
US9583014B2 (en) 2012-11-09 2017-02-28 Illinois Tool Works Inc. System and device for welding training
US9583023B2 (en) 2013-03-15 2017-02-28 Illinois Tool Works Inc. Welding torch for a welding training system
US9666100B2 (en) 2013-03-15 2017-05-30 Illinois Tool Works Inc. Calibration devices for a welding training system
US9672757B2 (en) 2013-03-15 2017-06-06 Illinois Tool Works Inc. Multi-mode software and method for a welding training system
US9713852B2 (en) 2013-03-15 2017-07-25 Illinois Tool Works Inc. Welding training systems and devices
US9724787B2 (en) 2014-08-07 2017-08-08 Illinois Tool Works Inc. System and method of monitoring a welding environment
US9724788B2 (en) 2014-01-07 2017-08-08 Illinois Tool Works Inc. Electrical assemblies for a welding system
US9751149B2 (en) 2014-01-07 2017-09-05 Illinois Tool Works Inc. Welding stand for a welding system
US9757819B2 (en) 2014-01-07 2017-09-12 Illinois Tool Works Inc. Calibration tool and method for a welding system
US9862049B2 (en) 2014-06-27 2018-01-09 Illinois Tool Works Inc. System and method of welding system operator identification
US9875665B2 (en) 2014-08-18 2018-01-23 Illinois Tool Works Inc. Weld training system and method
US9937578B2 (en) 2014-06-27 2018-04-10 Illinois Tool Works Inc. System and method for remote welding training
US10056010B2 (en) 2013-12-03 2018-08-21 Illinois Tool Works Inc. Systems and methods for a weld training system
US10105782B2 (en) 2014-01-07 2018-10-23 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US10170019B2 (en) 2014-01-07 2019-01-01 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US10204406B2 (en) 2014-11-05 2019-02-12 Illinois Tool Works Inc. System and method of controlling welding system camera exposure and marker illumination
US10210773B2 (en) 2014-11-05 2019-02-19 Illinois Tool Works Inc. System and method for welding torch display
US10239147B2 (en) 2014-10-16 2019-03-26 Illinois Tool Works Inc. Sensor-based power controls for a welding system
US10307853B2 (en) 2014-06-27 2019-06-04 Illinois Tool Works Inc. System and method for managing welding data
US10373304B2 (en) 2014-11-05 2019-08-06 Illinois Tool Works Inc. System and method of arranging welding device markers
US10373517B2 (en) 2015-08-12 2019-08-06 Illinois Tool Works Inc. Simulation stick welding electrode holder systems and methods
US10402959B2 (en) 2014-11-05 2019-09-03 Illinois Tool Works Inc. System and method of active torch marker control
US10417934B2 (en) 2014-11-05 2019-09-17 Illinois Tool Works Inc. System and method of reviewing weld data
US10427239B2 (en) 2015-04-02 2019-10-01 Illinois Tool Works Inc. Systems and methods for tracking weld training arc parameters
US10438505B2 (en) 2015-08-12 2019-10-08 Illinois Tool Works Welding training system interface
US10490098B2 (en) 2014-11-05 2019-11-26 Illinois Tool Works Inc. System and method of recording multi-run data
US10593230B2 (en) 2015-08-12 2020-03-17 Illinois Tool Works Inc. Stick welding electrode holder systems and methods
US10596650B2 (en) 2012-02-10 2020-03-24 Illinois Tool Works Inc. Helmet-integrated weld travel speed sensing system and method
US10657839B2 (en) 2015-08-12 2020-05-19 Illinois Tool Works Inc. Stick welding electrode holders with real-time feedback features
US10665128B2 (en) 2014-06-27 2020-05-26 Illinois Tool Works Inc. System and method of monitoring welding information
US10913126B2 (en) 2014-01-07 2021-02-09 Illinois Tool Works Inc. Welding software for detection and control of devices and for analysis of data
US11014183B2 (en) 2014-08-07 2021-05-25 Illinois Tool Works Inc. System and method of marking a welding workpiece
US11090753B2 (en) 2013-06-21 2021-08-17 Illinois Tool Works Inc. System and method for determining weld travel speed
US11247289B2 (en) 2014-10-16 2022-02-15 Illinois Tool Works Inc. Remote power supply parameter adjustment
US11288978B2 (en) 2019-07-22 2022-03-29 Illinois Tool Works Inc. Gas tungsten arc welding training systems
US11776423B2 (en) 2019-07-22 2023-10-03 Illinois Tool Works Inc. Connection boxes for gas tungsten arc welding training systems

Cited By (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11749133B2 (en) 2008-05-28 2023-09-05 Illinois Tool Works Inc. Welding training system
US9352411B2 (en) 2008-05-28 2016-05-31 Illinois Tool Works Inc. Welding training system
US11423800B2 (en) 2008-05-28 2022-08-23 Illinois Tool Works Inc. Welding training system
US10748442B2 (en) 2008-05-28 2020-08-18 Illinois Tool Works Inc. Welding training system
US9101994B2 (en) 2011-08-10 2015-08-11 Illinois Tool Works Inc. System and device for welding training
US10096268B2 (en) 2011-08-10 2018-10-09 Illinois Tool Works Inc. System and device for welding training
US11590596B2 (en) 2012-02-10 2023-02-28 Illinois Tool Works Inc. Helmet-integrated weld travel speed sensing system and method
US11612949B2 (en) 2012-02-10 2023-03-28 Illinois Tool Works Inc. Optical-based weld travel speed sensing system
US10596650B2 (en) 2012-02-10 2020-03-24 Illinois Tool Works Inc. Helmet-integrated weld travel speed sensing system and method
US9583014B2 (en) 2012-11-09 2017-02-28 Illinois Tool Works Inc. System and device for welding training
US10417935B2 (en) 2012-11-09 2019-09-17 Illinois Tool Works Inc. System and device for welding training
US9368045B2 (en) 2012-11-09 2016-06-14 Illinois Tool Works Inc. System and device for welding training
US9713852B2 (en) 2013-03-15 2017-07-25 Illinois Tool Works Inc. Welding training systems and devices
US9728103B2 (en) 2013-03-15 2017-08-08 Illinois Tool Works Inc. Data storage and analysis for a welding training system
US9672757B2 (en) 2013-03-15 2017-06-06 Illinois Tool Works Inc. Multi-mode software and method for a welding training system
US9666100B2 (en) 2013-03-15 2017-05-30 Illinois Tool Works Inc. Calibration devices for a welding training system
US10482788B2 (en) 2013-03-15 2019-11-19 Illinois Tool Works Inc. Welding torch for a welding training system
WO2014149398A1 (en) * 2013-03-15 2014-09-25 Illinois Tool Works Inc. Data storage and analysis for a welding training system
US9583023B2 (en) 2013-03-15 2017-02-28 Illinois Tool Works Inc. Welding torch for a welding training system
CN105051801A (en) * 2013-03-15 2015-11-11 伊利诺斯工具制品有限公司 Data storage and analysis for a welding training system
US11090753B2 (en) 2013-06-21 2021-08-17 Illinois Tool Works Inc. System and method for determining weld travel speed
US10056010B2 (en) 2013-12-03 2018-08-21 Illinois Tool Works Inc. Systems and methods for a weld training system
US11127313B2 (en) 2013-12-03 2021-09-21 Illinois Tool Works Inc. Systems and methods for a weld training system
US10913126B2 (en) 2014-01-07 2021-02-09 Illinois Tool Works Inc. Welding software for detection and control of devices and for analysis of data
US10105782B2 (en) 2014-01-07 2018-10-23 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US9724788B2 (en) 2014-01-07 2017-08-08 Illinois Tool Works Inc. Electrical assemblies for a welding system
US10170019B2 (en) 2014-01-07 2019-01-01 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US11241754B2 (en) 2014-01-07 2022-02-08 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US9751149B2 (en) 2014-01-07 2017-09-05 Illinois Tool Works Inc. Welding stand for a welding system
US11676509B2 (en) 2014-01-07 2023-06-13 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US9757819B2 (en) 2014-01-07 2017-09-12 Illinois Tool Works Inc. Calibration tool and method for a welding system
US10964229B2 (en) 2014-01-07 2021-03-30 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US10307853B2 (en) 2014-06-27 2019-06-04 Illinois Tool Works Inc. System and method for managing welding data
US9862049B2 (en) 2014-06-27 2018-01-09 Illinois Tool Works Inc. System and method of welding system operator identification
US10839718B2 (en) 2014-06-27 2020-11-17 Illinois Tool Works Inc. System and method of monitoring welding information
US9937578B2 (en) 2014-06-27 2018-04-10 Illinois Tool Works Inc. System and method for remote welding training
US10665128B2 (en) 2014-06-27 2020-05-26 Illinois Tool Works Inc. System and method of monitoring welding information
US11014183B2 (en) 2014-08-07 2021-05-25 Illinois Tool Works Inc. System and method of marking a welding workpiece
US9724787B2 (en) 2014-08-07 2017-08-08 Illinois Tool Works Inc. System and method of monitoring a welding environment
US9875665B2 (en) 2014-08-18 2018-01-23 Illinois Tool Works Inc. Weld training system and method
US10861345B2 (en) 2014-08-18 2020-12-08 Illinois Tool Works Inc. Weld training systems and methods
US11475785B2 (en) 2014-08-18 2022-10-18 Illinois Tool Works Inc. Weld training systems and methods
US11247289B2 (en) 2014-10-16 2022-02-15 Illinois Tool Works Inc. Remote power supply parameter adjustment
US10239147B2 (en) 2014-10-16 2019-03-26 Illinois Tool Works Inc. Sensor-based power controls for a welding system
US10417934B2 (en) 2014-11-05 2019-09-17 Illinois Tool Works Inc. System and method of reviewing weld data
US10210773B2 (en) 2014-11-05 2019-02-19 Illinois Tool Works Inc. System and method for welding torch display
US10402959B2 (en) 2014-11-05 2019-09-03 Illinois Tool Works Inc. System and method of active torch marker control
US10490098B2 (en) 2014-11-05 2019-11-26 Illinois Tool Works Inc. System and method of recording multi-run data
US11127133B2 (en) 2014-11-05 2021-09-21 Illinois Tool Works Inc. System and method of active torch marker control
US10373304B2 (en) 2014-11-05 2019-08-06 Illinois Tool Works Inc. System and method of arranging welding device markers
US10204406B2 (en) 2014-11-05 2019-02-12 Illinois Tool Works Inc. System and method of controlling welding system camera exposure and marker illumination
US11482131B2 (en) 2014-11-05 2022-10-25 Illinois Tool Works Inc. System and method of reviewing weld data
US10427239B2 (en) 2015-04-02 2019-10-01 Illinois Tool Works Inc. Systems and methods for tracking weld training arc parameters
US11462124B2 (en) 2015-08-12 2022-10-04 Illinois Tool Works Inc. Welding training system interface
US10438505B2 (en) 2015-08-12 2019-10-08 Illinois Tool Works Welding training system interface
US11594148B2 (en) 2015-08-12 2023-02-28 Illinois Tool Works Inc. Stick welding electrode holder systems and methods
US11081020B2 (en) 2015-08-12 2021-08-03 Illinois Tool Works Inc. Stick welding electrode with real-time feedback features
US10657839B2 (en) 2015-08-12 2020-05-19 Illinois Tool Works Inc. Stick welding electrode holders with real-time feedback features
US10373517B2 (en) 2015-08-12 2019-08-06 Illinois Tool Works Inc. Simulation stick welding electrode holder systems and methods
US10593230B2 (en) 2015-08-12 2020-03-17 Illinois Tool Works Inc. Stick welding electrode holder systems and methods
US11288978B2 (en) 2019-07-22 2022-03-29 Illinois Tool Works Inc. Gas tungsten arc welding training systems
US11776423B2 (en) 2019-07-22 2023-10-03 Illinois Tool Works Inc. Connection boxes for gas tungsten arc welding training systems

Also Published As

Publication number Publication date
CA2554498C (en) 2016-06-28

Similar Documents

Publication Publication Date Title
CA2554498C (en) Body motion training and qualification system and method
US8512043B2 (en) Body motion training and qualification system and method
Lavrentieva et al. Use of simulators together with virtual and augmented reality in the system of welders’ vocational training: past, present, and future
US9218745B2 (en) Virtual simulator method and system for neuromuscular training and certification via a communication network
US10629093B2 (en) Systems and methods providing enhanced education and training in a virtual reality environment
KR970005193B1 (en) Interactive aircraft training system and method
JP6687543B2 (en) System and method for hand welding training
US9773429B2 (en) System and method for manual welder training
US9336686B2 (en) Tablet-based welding simulator
CA2549553A1 (en) Virtual simulator method and system for neuromuscular training and certification via a communication network
CN105190724A (en) Systems and methods providing enhanced education and training in a virtual reality environment
WO2020179128A1 (en) Learning assist system, learning assist device, and program
EP3557560A1 (en) Simulated welding training supporting real-world applications
CN109144273A (en) A kind of virtual fire-fighting experiential method based on VR technology
CN212070747U (en) Intelligent virtual welding training device
Ye et al. Robot-assisted immersive kinematic experience transfer for welding training
KR102211108B1 (en) VR-based clean verification procedure training system of biological safety cabinet for GMP environment monitoring
US11538358B2 (en) Method of training for welding through virtual reality
Chen et al. Augmenting Embodied Learning in Welding Training: The Co-Design of an XR-and tinyML-Enabled Welding System for Creative Arts and Manufacturing Training
Adam Towards more realism: Improving immersion of a virtual human-robot working cell and discussing the comparability with its real-world representation
Ismail et al. VR Welding Kit: Welding Training Simulation in Mobile Virtual Reality using Multiple Marker Tracking Method
White Impact of Visualization Augmentation on Welder Training: A Study with the Simulated MIG Lab

Legal Events

Date Code Title Description
EEER Examination request
MKLA Lapsed

Effective date: 20180926