WO2014047944A1 - system and method for improving manufacturing production - Google Patents

system and method for improving manufacturing production

Info

Publication number
WO2014047944A1
WO2014047944A1 (application PCT/CN2012/082496, CN2012082496W)
Authority
WO
WIPO (PCT)
Prior art keywords
worker
activity data
module
data
activity
Prior art date
Application number
PCT/CN2012/082496
Other languages
French (fr)
Inventor
Sheng Lai
Zhaohui Du
Minyue Chew
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft
Priority to PCT/CN2012/082496
Publication of WO2014047944A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1123 Discriminating type of movement, e.g. walking or running
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/20 Workers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6823 Trunk, e.g., chest, back, abdomen, hip
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6824 Arm or wrist
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6828 Leg

Definitions

  • Embodiments of the invention are directed to improving manufacturing production, and, in particular, improving manufacturing production with Manufacturing Execution Systems.
  • an MES is an information system that manages manufacturing operations in a factory.
  • An MES typically consists of a set of integrated software and hardware components that provide functions for managing production activities from job order launch to finished products. Using current and accurate data, an MES can initiate, guide, respond to, and report on production activities as they occur.
  • In addition to such production activities, some MES also offer personnel management functionality.
  • The SIMATIC IT system, a product of Siemens, includes a "Personnel Manager" function module that assigns shifts to workers, keeps track of workers' activities (including breaks), and integrates personnel data with production data (for example, to check the number of personnel required for a task, or to log the equipment, material lot, batch or order on which each person worked).
  • workers represent one essential resource in the manufacturing system
  • the Personnel Manager module serves as a resource for personnel resource information management.
  • personnel data is collected and stored mainly for achieving efficient allocation of human resources; the MES provides the same function for other key resources, such as equipment and materials, in a manufacturing process.
  • Such MES do not include information on the detailed real-time, or after the fact, physical activities of human operators during their working shift, and therefore cannot determine whether a human operator performs a designated task appropriately, nor judge whether a human operator is an effective resource allocation for a certain production task.
  • What is needed is an MES system that can include information related to the detailed real-time activities of human operators during their working shift, and thus determine whether a human operator performs a designated task appropriately, or determine whether a human operator is an effective resource allocation for a certain production task.
  • Embodiments of the invention are directed to this and other needs. Embodiments are directed to systems and methods that can include detailed real-time (or after-the-fact) activities of human operators into production, manufacturing and MES systems.
  • a system for creating activity data of a worker can include a sensor module configured to sense the motions of a worker and generate worker acceleration data related to the sensed motions of the worker, and an analysis module configured to analyze the worker acceleration data, and generate worker activity data from the worker acceleration data.
  • the sensor module can include one or more inertial sensors, and the sensor module can be disposed on the body of the worker.
  • Embodiments can also include a storage module, wherein the worker activity data is stored in the storage module.
  • Embodiments can also include a comparison module, configured to compare the worker activity data to benchmark activity data, wherein the benchmark activity data is previously stored in a storage module.
  • Embodiments can also include a match module, configured to determine if the worker activity data matches the benchmark activity data, and an acceptance module, configured to accept the worker activity data when the match module determines that the worker activity data matches the benchmark activity data. Here, the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, the worker is a trainee worker attempting to perform the task, and acceptance of the activity data by the acceptance module signifies that the trainee has sufficiently performed the task.
  • Embodiments can also include an MES module, wherein the worker activity data is used by the MES module to designate the worker as a production resource.
  • Embodiments can also include a match module, configured to determine if the worker activity data matches the benchmark activity data, and an alert module, configured to generate an alert when the match module determines that the worker activity data does not match the benchmark activity data. Here, the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, the worker is attempting to perform the task, and the alert module generates the alert to signify that the task has not been performed properly by the worker.
  • Embodiments of the invention can be directed to a method of creating activity data of a worker, with the method including sensing, with a sensor module, the motions of a worker and generating worker acceleration data related to the sensed motions of the worker; and analyzing the worker acceleration data, and generating worker activity data from the worker acceleration data, wherein the sensor module includes one or more inertial sensors, and wherein the sensor module is disposed on a body of the worker.
  • Embodiments can also include storing the worker activity data.
  • Embodiments can also include comparing the worker activity data to benchmark activity data, wherein the benchmark activity data is previously stored.
  • Embodiments can also include determining if the worker activity data matches the benchmark activity data, and accepting the worker activity data when the worker activity data is determined to match the benchmark activity data, wherein the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, and the worker is a trainee worker attempting to perform the task; when the activity data is accepted, it has been determined that the trainee has sufficiently performed the task.
  • Embodiments can also include the worker activity data being used by an MES module to designate the worker as a production resource.
  • Embodiments can also include determining if the worker activity data matches the benchmark activity data, and generating an alert when the worker activity data is determined not to match the benchmark activity data, wherein the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, and the worker is attempting to perform the task; the alert is generated to signify that the task has not been performed properly by the worker.
  • FIG. 1 is a schematic view of an integration of a motion capture and recognition system with a computer system, including an MES, showing various operation modules, in accordance with embodiments of the invention.
  • FIG. 2 is a summary of types of motion and activity, as described in accordance with embodiments of the invention.
  • FIG. 3 is a graphical representation of various motions in relation to acceleration levels, in accordance with embodiments of the invention.
  • FIG. 4 is a flow diagram showing a process of building a worker's semantic activity model, in accordance with certain embodiments.
  • FIG. 5 is a flow diagram showing a process of training and evaluating of worker skills in a computer system, including an MES, in accordance with certain embodiments.
  • FIG. 6 is a flow diagram showing a process of on-line operations governance in a computer system, including an MES, in accordance with certain embodiments.
  • Embodiments of the invention provide systems and methods to acquire information related to the detailed real-time physical activities of human operators (or workers) and integrate it into an MES and/or other computer system for optimizing a production process.
  • Referring to FIG. 1, there is shown a schematic view depicting the integration of a motion capture and recognition system with a computer system, including an MES.
  • A worker (or operator) works on a manufacturing process such as, for example, an assembly line handling engines at an automobile assembly plant. Other manufacturing processes could also be employed.
  • One or more sensor modules 13 are attached to the worker at, for example, his or her wrists.
  • sensor module or modules 13 could be attached to other parts of the worker's body, such as, for example, at the worker's legs, torso and/or head.
  • Sensor module 13 can include inertial sensors (e.g., accelerometers or gyroscopes), an A/D converter, a radio frequency (RF) transmitter and other components, as would be known to one of skill in the art.
  • the inertial sensors measure the worker's motions and generate acceleration data accordingly.
  • the acceleration data is then wirelessly broadcast to, and received by, a wireless detector module 14.
  • a wired connection could also be used.
  • embodiments of the invention are described related to the use of acceleration data, other types of motion and/or positional data could also or alternatively be tracked.
  • the detector module 14 converts the wireless signals into industrially compatible wired signals, and communicatively connects (via either wired or wireless connection) with an industrial computer system 15.
  • Computer system 15 can include one or more computing devices (or computers), which are connected to and communicate with each other via an industrial Ethernet, or by other communication means.
  • Computer system 15 includes various modules which can be implemented via software and/or hardware on the one or more computing devices. Multiple modules can be implemented on a single computing device, and/or a single module can be implemented by multiple computing devices.
  • One or more processor modules 120 can each include a central processing unit (CPU) and can perform calculations and control operations, as is known to those of skill in the art.
  • An MES module 190 is configured to perform MES functions for managing production activities, including those as described above.
  • Storage module 140 can include one or more storage devices, including flexible disks, hard disks, optical disks, magneto-optical disks, CD-ROM, CD-R, magnetic tape, non-volatile memory card, ROM and RAM, as are known by those of skill in the art.
  • Computer system 15 can process the received data from the detection module 14 with pattern recognition algorithms, by way of various modules, to generate time-related motion sequences, and can use this information as described in further detail below.
  • the acquired motion activity data from the worker is analyzed by an analysis module 130, as described in further detail below.
  • a worker's motion and activity 200 can be differentiated by its duration and complexity. On a smaller time scale (i.e., seconds), brief and distinct body movements 220 such as moving left, moving upper right, or moving around are referred to herein as “gestures” or “motions.”
  • a "low-level activity” 240 is a sequence of movements, such as drilling, conveying and wrenching, or a distinct posture, such as walking, standing and sitting. Such a low-level activity typically lasts for several minutes.
  • a collection of low- level activities to fulfill a certain manufacturing production task is referred to herein as a "higher-level” or “semantic” activity 260.
  • Such semantic activity is more complicated than the simple gesture or motion and low-level activity, and can last as long as a few hours.
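The three-level hierarchy above (gesture/motion, low-level activity, semantic activity) can be pictured as a simple data model. The class and field names below are hypothetical illustrations, not part of the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Motion:
    """A 'gesture': a brief, distinct body movement lasting seconds."""
    label: str  # e.g. "move_left", "move_upper_right"

@dataclass
class LowLevelActivity:
    """A sequence of movements or a posture, typically lasting minutes."""
    label: str  # e.g. "drilling", "walking"
    motions: List[Motion] = field(default_factory=list)

@dataclass
class SemanticActivity:
    """A collection of low-level activities fulfilling one production task;
    can last as long as a few hours."""
    task: str
    activities: List[LowLevelActivity] = field(default_factory=list)

# Hypothetical example task composed from the two lower levels:
drilling = LowLevelActivity("drilling", [Motion("hold_arm"), Motion("press_down")])
task = SemanticActivity("drill hole in plate", [drilling])
print(task.task, len(task.activities))
```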
  • Embodiments of the invention can be used to integrate semantic activity into a computer system, including an MES, to produce improved manufacturing efficiency.
  • Acceleration signals as would be received from sensor module 13, attached to a worker, are depicted.
  • the acceleration variation corresponding to different worker (i.e., human) motions along with time is recorded, with the x-axis representing time and the y-axis representing acceleration.
  • the whole time series of acceleration signals relates to a specific worker's semantic activity, such as, for example, drilling a hole on a plate with a drilling machine, and can be divided into five sessions, or five individual motions, 310, 320, 330, 340 and 350, such as putting a plate on the machine platform, holding the machine arm, drilling a hole on a plate, as well as other activities.
  • the motions are segmented based on the variance of the acceleration signals, that is, a measure of how far the set of acceleration signals are spread out from each other.
  • the curve 360 is the time series of original acceleration signals
  • the curve 370 is the calculated variance series corresponding to the acceleration curve 360. If the variance is very small (i.e., smaller than a certain predetermined threshold) during a certain time period, it can be determined that the acceleration signals did not significantly change during this period, and thus that the worker was static and did not perform any significant motions. Conversely, if the variance exceeds a second, larger predetermined threshold, it can be inferred that the corresponding acceleration signals were generated by significant human motions. Thus, the static periods between these motions are used to segment the whole acceleration series into separate motion sessions.
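The variance-based segmentation described above can be sketched as follows. The window size and the two thresholds are illustrative assumptions, not values from the patent:

```python
from statistics import pvariance

def segment_motions(accel, window=3, static_thresh=0.05, active_thresh=0.5):
    """Split an acceleration series into motion sessions using rolling variance.

    A motion session opens when local variance exceeds `active_thresh` and
    closes at the next static period (variance below `static_thresh`).
    """
    n = len(accel)
    # Variance over a sliding window centered on each sample:
    variance = [pvariance(accel[max(0, i - window):i + window + 1]) for i in range(n)]
    segments, start = [], None
    for i, v in enumerate(variance):
        if v > active_thresh and start is None:
            start = i                      # significant motion begins
        elif v < static_thresh and start is not None:
            segments.append((start, i))    # static gap ends the motion
            start = None
    if start is not None:
        segments.append((start, n))
    return segments

# Two bursts of simulated motion separated by static pauses:
signal = [0.0] * 10 + [1.0, -1.0] * 10 + [0.0] * 10 + [2.0, -2.0] * 10 + [0.0] * 10
print(segment_motions(signal))  # → [(10, 33), (38, 63)]: two motion sessions
```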
  • the semantic activity model corresponds to a collection of worker's activity data.
  • the worker performs a manufacturing task and creates a series of motions, and the sensor module 13 worn by the worker senses the motions, and generates acceleration signals that are transmitted to a detector module 14.
  • the detector module 14 converts the acceleration signals into industrially compatible signals. Alternatively, in some embodiments, functions of detector module 14 can be performed by sensor module 13.
  • the acceleration signals are transmitted to computer system 15.
  • the analysis module 130 processes and analyzes the acceleration signals, and the movements of the worker are abstracted, modeled and categorized as certain types of motions. Such modeling can be based on the use of known analysis techniques, such as, for example, the hidden Markov model and the ant colony model.
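The text names hidden Markov models and ant colony models for this modeling step. As a deliberately simplified stand-in (not the patented method), each segmented motion can be reduced to a small feature vector, and an activity model becomes the ordered list of such vectors:

```python
from statistics import fmean, pvariance

def motion_features(samples):
    """Reduce one motion session's acceleration samples to a feature vector:
    (mean, variance, duration). A toy stand-in for HMM-based modeling."""
    return (fmean(samples), pvariance(samples), len(samples))

def activity_model(segments, accel):
    """Build a 'semantic activity model' as the ordered per-motion feature
    vectors of an acceleration series already split into segments."""
    return [motion_features(accel[s:e]) for s, e in segments]

accel = [0.1, 1.2, -0.9, 1.1, -1.0, 0.2]
model = activity_model([(1, 5)], accel)
print(model)  # one motion segment reduced to (mean, variance, length)
```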
  • the worker's semantic activity model and worker activity data are stored in the storage module 140, so that they can be subsequently used for future production optimization applications, in combination with functions performed by the various modules of computer system 15, including MES module 190, or for other purposes, such as for designing intuitive and friendly human machine interfaces ("HMI”) to control industrial machines.
  • a stored worker's semantic activity model can be integrated into the functions of the computer system 15, including MES module 190 (or another MES) in several ways, to improve production efficiency and/or safety.
  • one such method of integration includes a method 500 to improve the manufacturing process efficiency of a worker, such as a worker being trained to work on a production line (trainee worker).
  • Embodiments of this method relate to the training and evaluation of a worker's skills using semantic activity information (worker activity data).
  • a sensor module 13 is attached to the body of a worker in training.
  • In step S520, the trainee mimics a standard manufacturing activity and creates a series of motions to finish a certain production task, such as, for example, drilling a hole on a plate or cutting a bushing piece.
  • motions performed by the worker are sensed by sensor module 13, worn by the worker, and generated acceleration signals are transmitted to a detector module 14.
  • the detector module 14 converts the acceleration signals into industrially compatible signals.
  • these acceleration signals are then transmitted to computer system 15.
  • analysis module 130 processes and analyzes the received acceleration signals to abstract the features representing the worker's activity, and to store a semantic activity model of the worker's activity (worker activity data) in storage module 140.
  • Such features can be obtained by known analysis techniques, such as the hidden Markov model and the ant colony model.
  • the details can be found in books or articles, such as Gernot A. Fink, Markov Models for Pattern Recognition: From Theory to Applications, November 2010, and Christoph Amma et al., Airwriting Recognition using Wearable Motion Sensors.
  • a predefined worker's semantic activity model is used as a standard manufacturing activity benchmark (benchmark activity data).
  • the features of the trainee worker's activity model, obtained in step S560, are compared with those of the benchmark activity data by the comparison module 150, as shown in step S570.
  • an evaluation is performed by the match module 160 to determine whether the trainee worker's activity model suitably matches the predefined standard activity model. If yes, the flow proceeds to step S590, and the trainee worker is accepted, by the acceptance module 170, as a qualified personnel resource for the particular performed manufacturing process.
  • the trainee worker can be allocated by the MES module 190 (or other module) as an appropriate personnel resource for certain designated manufacturing processes. If the trainee's activity model does not suitably match the predefined standard activity model, the trainee worker is not accepted by the acceptance module 170, and the method flow proceeds back to step S520, where the trainee worker is required to mimic the standard activity again.
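The training loop of FIG. 5 (steps S520 through S590) might be sketched as below. The feature-distance metric, acceptance threshold, and attempt limit are illustrative assumptions; the patent leaves the matching rule open:

```python
def feature_distance(a, b):
    """Euclidean distance between two equal-length activity feature models
    (a hypothetical matching rule)."""
    return sum((x - y) ** 2 for fa, fb in zip(a, b) for x, y in zip(fa, fb)) ** 0.5

def evaluate_trainee(capture_attempt, benchmark, threshold=0.5, max_attempts=3):
    """Mimic steps S520-S590: the trainee repeats the task until the captured
    activity model matches the benchmark closely enough.

    Returns the attempt number on acceptance, or None if never qualified."""
    for attempt in range(1, max_attempts + 1):
        model = capture_attempt()                           # S520-S560: sense + abstract
        if feature_distance(model, benchmark) <= threshold: # S570-S580: compare + match
            return attempt                                  # S590: accepted as qualified
    return None

benchmark = [(0.1, 1.1, 4), (0.0, 0.9, 6)]
attempts = iter([[(0.9, 2.0, 4), (0.5, 0.4, 6)],   # poor first try
                 [(0.1, 1.0, 4), (0.0, 0.9, 6)]])  # close second try
print(evaluate_trainee(lambda: next(attempts), benchmark))  # → 2
```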
  • the skills of a trainee worker can be efficiently evaluated. An expert worker's (operator's) activities can be recorded, when the expert worker performs a certain production task. Those recorded activities can then be analyzed and abstracted and stored as a standard semantic activity model as a benchmark for the production task.
  • a method 600 for improving manufacturing process safety by using worker semantic activity information (worker activity data).
  • a sensor module(s) 13 is installed on the body of a worker who is engaged in a certain production task.
  • the worker performs a series of motions as he completes the production task.
  • the sensor module 13 senses these motions and generates a series of acceleration signals, and generated acceleration signals are transmitted to a detector module 14.
  • the detector module 14 converts the acceleration signals into industrially compatible signals.
  • the acceleration signals are transmitted to computer system 15.
  • the analysis module 130 of computer system 15 processes and analyzes the received acceleration signals, abstracts the features and creates an activity model of the worker's activity.
  • the features can be obtained by known analysis techniques, such as the hidden Markov model and the ant colony model, which are mentioned above.
  • a predefined worker's semantic activity model is loaded into the comparison module 150 as the benchmark of the standard activity for the particular production task (benchmark activity data).
  • the worker's semantic activity model is compared by the comparison module 150 with that of the benchmark semantic activity model through features.
  • In step S680, an evaluation is performed by the comparison module 150 to determine whether the worker's semantic activity model suitably matches the benchmark semantic activity model, or in other words, whether the worker performs the right manufacturing activities for this production task. If yes, the flow proceeds to step S690a, no action is taken, and the worker's activities continue to be sensed and monitored. If no, the flow proceeds to step S690b, and alert module 180 issues a warning that an incorrect activity has been performed by the worker, signaling that remedial measures, such as switching off critical machines, must be taken to avoid imminent accidents. Alternatively, portions of a manufacturing line can be automatically shut down, or, in certain circumstances, the production line can be allowed to continue running, with the incorrect activity being logged for later review.
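The evaluation in steps S680 through S690b could look roughly like this sketch. The distance metric, threshold, and return shape are assumptions for illustration:

```python
def monitor_activity(worker_model, benchmark, threshold=0.5):
    """One pass of steps S680-S690: returns ("ok", None) when the worker's
    activity model matches the benchmark, or ("alert", message) otherwise."""
    distance = sum((x - y) ** 2
                   for fa, fb in zip(worker_model, benchmark)
                   for x, y in zip(fa, fb)) ** 0.5
    if distance <= threshold:
        return ("ok", None)  # S690a: no action; keep monitoring
    # S690b: warn so that remedial measures can be taken
    return ("alert", f"incorrect activity (distance {distance:.2f}); "
                     "take remedial measures, e.g. switch off critical machines")

benchmark = [(0.0, 1.0, 5)]
print(monitor_activity([(0.0, 1.1, 5)], benchmark))     # close match → ("ok", None)
print(monitor_activity([(2.0, 3.0, 5)], benchmark)[0])  # mismatch → "alert"
```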
  • computer system 15 can set a predefined semantic activity model as standard operations which must be executed by workers when they perform a certain production task.
  • By on-line sensing and recognizing workers' motions, and comparing them with a standard benchmark semantic activity model, it can be determined whether a certain task is being performed correctly. Specifically, it can be determined whether all the necessary operations have been executed correctly (e.g., no omissions or incorrect operations), and whether the correct sequence of operations is preserved (e.g., no incorrect ordering of operations).
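The completeness and ordering check described here can be sketched as an in-order comparison of recognized operation labels against the standard sequence. The operation names are hypothetical:

```python
def check_operations(performed, standard):
    """Verify the two conditions in the text: every required operation was
    executed (no omissions) and in the required order (no reordering).
    Extra repetitions of a correct operation are tolerated in this sketch."""
    it = iter(performed)
    # Membership tests consume `it` in order, so an out-of-order or missing
    # operation shows up in `missing`:
    missing = [op for op in standard if op not in it]
    return ("ok", []) if not missing else ("abnormal", missing)

standard = ["clamp_plate", "lower_drill", "drill_hole", "raise_drill"]
print(check_operations(["clamp_plate", "lower_drill", "drill_hole", "raise_drill"],
                       standard))  # correct sequence → ("ok", [])
print(check_operations(["lower_drill", "clamp_plate", "drill_hole"],
                       standard))  # swapped + omitted steps → ("abnormal", [...])
```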
  • the quick detection of any abnormal operation can be the basis of taking appropriate remedial measures to avoid uncontrollable risks and improve manufacturing safety.
  • embodiments of the invention can provide for the integration of a worker's motions and semantic activity information, acquired by a motion capture and recognition system based on inertial sensors (e.g., accelerometers and/or gyroscopes).
  • Embodiments of the invention use inexpensive inertial sensors to detect acceleration signals that correspond to workers' motions, unlike costly vision detection solutions based on cameras.
  • the inertial sensors can be very compact, so a worker can wear these sensors on the body, and move and perform tasks freely.
  • the acceleration detection will not be affected by moving backgrounds and varying lighting conditions, which are still the main technical challenges for vision-based motion recognition systems.
  • Embodiments of the invention can improve a worker's operation consistency and efficiency.
  • Embodiments of the invention can provide an interactive way to help a trainee worker to practice a standard operation activity.
  • the trainee worker or a supervisor can evaluate the training process, and keep improving the trainee worker's operation skill by reducing the gap between the trainee worker's motions and the standard benchmark activity model.
  • By improving the trainee worker's skills, operation consistency can be optimized, and the operation efficiency of a production process can be enhanced.
  • embodiments of the invention provide a way to maintain a safe production environment by monitoring a worker's operation activities, and reacting to misoperations more quickly.
  • Workers' real-time semantic activity information is obtained by sensing and recognition of the workers' motions.
  • the worker's semantic activity is compared with a standard benchmark activity model to determine whether a specific operation task is performed correctly (i.e., with no omitted steps and in a correct order of operation steps). Accordingly, even if incorrect operations are performed, they can be detected quickly, and appropriate corrective actions will be taken to avoid uncontrolled accidents and preserve the safety of the manufacturing system.
  • embodiments of the invention can play an important role in realizing a human-centered manufacturing execution system (HcMES) to coordinate human and machine activities, and achieve optimal automatic manufacturing production.
  • embodiments of computer system 15 can include one or more computers.
  • described modules and functions can be performed on a single computing device, or by multiple computing devices. Multiple modules and/or functions can be included and/or performed by a single computing device or by multiple computing devices. Further, a single module and/or function can be included and/or performed by multiple computing devices.
  • Embodiments of the invention can be implemented by software, hardware, or a combination of software and hardware.
  • Embodiments of the invention can also be implemented by a recording medium containing program code of software that implements the functions of the above embodiments, whereby the recording medium is supplied to a system or apparatus, whose computing device (including a processor such as a CPU) then reads the program code out of the recording medium and executes it.
  • the program code itself read out of the computer-readable recording medium will implement the functions of the above embodiments, and the recording medium which stores the program code will constitute the present invention.
  • Recording media used to supply the program code include, for example, a flexible disk, hard disk, optical disk, magneto-optical disk, CD-ROM, CD-R, magnetic tape, non-volatile memory card, and ROM.
  • Such a recording medium could take a transitory and/or non-transitory form.
  • the functions of the above embodiments can be implemented not only by the program code read out and executed by a computing device, but also by part or all of the actual processing executed according to instructions from the program code by an OS (operating system) running on the computer.
  • the functions of the above embodiments can also be implemented by part or all of the actual processing executed by a CPU or the like contained in a function expansion card inserted in a computing device, or a function expansion unit connected to the computing device if the processing is performed according to instructions from the program code that has been read out of the recording medium and written into memory on the function expansion card or unit.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Dentistry (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Physiology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Factory Administration (AREA)

Abstract

Aspects of the invention are related to systems and methods for creating activity data of a worker. The invention can include a sensor module configured to sense the motions of a worker and generate worker acceleration data related to the sensed motions of the worker, and an analysis module configured to analyze the worker acceleration data and generate worker activity data from the worker acceleration data. The sensor module can include one or more inertial sensors, and can be positioned on a body of the worker. The invention can also include a storage module, wherein the worker activity data is stored in the storage module. Further, the invention can also include a comparison module, configured to compare the worker activity data to benchmark activity data, wherein the benchmark activity data is previously stored in a storage module.

Description

SYSTEM AND METHOD FOR IMPROVING MANUFACTURING PRODUCTION
Background Of The Invention
Field of the Invention
[001] Embodiments of the invention are directed to improving manufacturing production, and, in particular, improving manufacturing production with Manufacturing Execution Systems.
Description of Related Art
[002] In the current business environment, manufacturers are faced with more and more complex challenges to remain competitive and increase their profitability. For this reason, many manufacturers employ manufacturing execution systems ("MES") to address these challenges. Generally, an MES is an information system that manages manufacturing operations in a factory. An MES typically consists of a set of integrated software and hardware components that provide functions for managing production activities from job order launch to finished products. Using current and accurate data, an MES can initiate, guide, respond to, and report on production activities as they occur.
[003] In addition to such production activities, some MES also offer personnel management functionality. For example, the SIMATIC IT system, a product of Siemens, includes a "Personnel Manager" function module that assigns shifts to workers, keeps track of workers' activities (including breaks), and integrates personnel data with production data (for example, to check the number of personnel required for a task, or to log the equipment, material lot, batch or order on which each person worked). In such MES, workers represent one essential resource in the manufacturing system, and the Personnel Manager module serves as a resource for personnel resource information management. With the Personnel Manager module, personnel data is collected and stored mainly for achieving efficient allocation of human resources, a function that the MES also provides for other key resources in a manufacturing process, such as equipment and materials.
[004] Such MES, however, do not include information on the detailed real-time, or after the fact, physical activities of human operators during their working shift, and therefore cannot determine whether a human operator performs a designated task appropriately, nor judge whether a human operator is an effective resource allocation for a certain production task.
[005] Thus, there is a need for an MES system that can include information related to the detailed real-time activities of human operators during their working shift, and thus determine whether a human operator performs a designated task appropriately, or determine whether a human operator is an effective resource allocation for a certain production task.
Summary of the Invention
[006] Embodiments of the invention are directed to this and other needs. Embodiments are directed to systems and methods that can integrate the detailed real-time (or after-the-fact) activities of human operators into production, manufacturing and MES systems.
[007] In some embodiments, a system for creating activity data of a worker can include a sensor module configured to sense the motions of a worker and generate worker acceleration data related to the sensed motions of the worker, and an analysis module configured to analyze the worker acceleration data and generate worker activity data from the worker acceleration data. The sensor module can include one or more inertial sensors, and the sensor module can be disposed on the body of the worker.
[008] Embodiments can also include a storage module, wherein the worker activity data is stored in the storage module.
[009] Embodiments can also include a comparison module, configured to compare the worker activity data to benchmark activity data, wherein the benchmark activity data is previously stored in a storage module.
[0010] Embodiments can also include a match module, configured to determine if the worker activity data matches the benchmark activity data, and an acceptance module, configured to accept the worker activity data when the worker activity data is determined, by the match module, to match the benchmark activity data; wherein the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, and the worker is a trainee worker attempting to perform the task; and wherein, when the activity data is accepted by the acceptance module, the acceptance module has determined that the trainee has sufficiently performed the task.
[0011] Embodiments can also include an MES module, wherein the worker activity data is used by the MES module to designate the worker as a production resource.
[0012] Embodiments can also include a match module, configured to determine if the worker activity data matches the benchmark activity data, and an alert module, configured to generate an alert when the worker activity data is determined, by the match module, not to match the benchmark activity data; wherein the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, and the worker is attempting to perform the task; and wherein the alert module generates the alert to signify that the task has not been performed properly by the worker.
[0013] Embodiments of the invention can be directed to a method of creating activity data of a worker, with the method including sensing, with a sensor module, the motions of a worker and generating worker acceleration data related to the sensed motions of the worker; and analyzing the worker acceleration data, and generating worker activity data from the worker acceleration data, wherein the sensor module includes one or more inertial sensors, and wherein the sensor module is disposed on a body of the worker.
[0014] Embodiments can also include storing the worker activity data.
[0015] Embodiments can also include comparing the worker activity data to benchmark activity data, wherein the benchmark activity data is previously stored.
[0016] Embodiments can also include determining if the worker activity data matches the benchmark activity data and accepting the worker activity data when the worker activity data is determined to match the benchmark activity data, wherein the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, and the worker is a trainee worker attempting to perform the task and wherein, when the activity data is accepted, it has been determined that the trainee has sufficiently performed the task.
[0017] Embodiments can also include the worker activity data being used by an MES module to designate the worker as a production resource.
[0018] Embodiments can also include determining if the worker activity data matches the benchmark activity data and generating an alert when the worker activity data is determined not to match the benchmark activity data, wherein the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, and the worker is attempting to perform the task, and wherein the alert is generated to signify that the task has not been performed properly by the worker.
[0019] By way of embodiments of the invention, as described in further detail below, detailed real-time activities of human operators can be integrated into production, manufacturing and MES systems.
Brief Description of the Drawings
[0020] The invention will be more readily understood from the detailed description of exemplary embodiments presented below considered in conjunction with the attached drawings, of which:
[0021] FIG. 1 is a schematic view of an integration of a motion capture and recognition system with a computer system, including an MES, showing various operation modules, in accordance with embodiments of the invention;
[0022] FIG. 2 is a summary of types of motion and activity, as described in accordance with embodiments of the invention;
[0023] FIG. 3 is a graphical representation of various motions in relation to acceleration levels, in accordance with embodiments of the invention;
[0024] FIG. 4 is a flow diagram showing a process of building a worker's semantic activity model, in accordance with certain embodiments;
[0025] FIG. 5 is a flow diagram showing a process of training and evaluating of worker skills in a computer system, including an MES, in accordance with certain embodiments; and
[0026] FIG. 6 is a flow diagram showing a process of on-line operations governance in a computer system, including an MES, in accordance with certain embodiments.
[0027] It is to be understood that the attached drawings are for purposes of illustrating, and not limiting, the concepts of the invention.
Detailed Description of Embodiments of the Invention
[0028] Embodiments of the invention provide systems and methods to acquire information related to the detailed real-time physical activities of human operators (or workers) and integrate it into an MES and/or other computer system for optimizing a production process.
[0029] With reference to FIG. 1, there is shown a schematic view depicting the integration of a motion capture and recognition system with a computer system, including an MES. A worker (or operator) works on a manufacturing process, such as, for example, an assembly line handling engines at an automobile assembly plant. Other manufacturing processes could also be employed. One or more sensor modules 13 are attached to the worker at, for example, his or her wrists. In other embodiments, sensor module or modules 13 could be attached to other parts of the worker's body, such as, for example, the worker's legs, torso and/or head. Sensor module 13 can include inertial sensors (e.g., accelerometers or gyroscopes), an A/D converter, a radio frequency (RF) transmitter and other components, as would be known to one of skill in the art. The inertial sensors measure the worker's motions and generate acceleration data accordingly. The acceleration data is then wirelessly broadcast to, and received by, a wireless detector module 14. In alternate embodiments, a wired connection could also be used. Also, while embodiments of the invention are described in relation to the use of acceleration data, other types of motion and/or positional data could also or alternatively be tracked. The detector module 14 converts the wireless signals into industrially compatible wired signals, and communicatively connects (via either a wired or wireless connection) with an industrial computer system 15. Computer system 15 can include one or more computing devices (or computers), which are connected to and communicate with each other via an industrial Ethernet, or by other communication means.
Computer system 15 includes various modules which can be implemented via software and/or hardware on the one or more computing devices. Multiple modules can be implemented on a single computing device, and/or a single module can be implemented by multiple computing devices. One or more processor modules 120 can each include a central processing unit (CPU) and can perform calculations and control operations, as is known to those of skill in the art. An MES module 190 is configured to perform MES functions for managing production activities, including those as described above. The functions of other modules of computer system 15 are described below. While processor module 120 and MES module 190 are depicted as being distinct from the other modules, in some embodiments, the functions of the various modules of computer system 15 can overlap, or work in conjunction with each other. Storage module 140 can include one or more storage devices, including flexible disks, hard disks, optical disks, magneto-optical disks, CD-ROM, CD-R, magnetic tape, non-volatile memory card, ROM and RAM, as are known by those of skill in the art.
[0030] Computer system 15 can process the data received from the detector module 14 with pattern recognition algorithms, by way of various modules, to generate time-related motion sequences, and can use this information as described in further detail below. The acquired motion activity data from the worker (worker activity data) is analyzed by an analysis module 130, as described in further detail below.
[0031] Different types of motion and activity can be defined and categorized as shown in FIG. 2. A worker's motion and activity 200 can be differentiated by its duration and complexity. On a smaller time scale (i.e., seconds), brief and distinct body movements 220, such as moving left, moving upper right, or moving around, are referred to herein as "gestures" or "motions." A "low-level activity" 240, as referred to herein, is a sequence of movements, such as drilling, conveying and wrenching, or a distinct posture, such as walking, standing and sitting. Such a low-level activity typically lasts for several minutes. In turn, a collection of low-level activities that fulfills a certain manufacturing production task, such as working with a drilling machine to drill a hole in a plate, or cutting a bushing piece with a gas flame machine, is referred to herein as a "higher-level" or "semantic" activity 260. Such semantic activity is more complicated than a simple gesture or motion or a low-level activity, and can last as long as a few hours. Embodiments of the invention can be used to integrate semantic activity into a computer system, including an MES, to produce improved manufacturing efficiency.
[0032] With reference to FIG. 3, and continued reference to FIG. 1, there is shown a graphical representation 300 of worker motion and activity over time. Acceleration signals as would be received from sensor module 13, attached to a worker, are depicted. The acceleration variation corresponding to different worker (i.e., human) motions over time is recorded, with the x-axis representing time and the y-axis representing acceleration.
The whole time series of acceleration signals relates to a specific worker's semantic activity, such as, for example, drilling a hole in a plate with a drilling machine, and can be divided into five sessions, or five individual motions, 310, 320, 330, 340 and 350, such as putting a plate on the machine platform, holding the machine arm, drilling a hole in the plate, as well as other activities. The motions are segmented based on the variance of the acceleration signals, that is, a measure of how far the acceleration signals are spread out from their mean. The curve 360 is the time series of original acceleration signals, and the curve 370 is the calculated variance series corresponding to the acceleration curve 360. If the variance is very small (i.e., smaller than a certain predetermined threshold) during a certain time period, it can be determined that during this period the acceleration signals did not significantly change, and thus that the worker was static and did not perform any significant motions. Conversely, if the variance exceeds a larger predetermined threshold, it can be inferred that the corresponding acceleration signals were generated by significant human motions. Thus, the static periods between these motions are used to segment the whole acceleration series into separate motion sessions.
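The variance-based segmentation just described can be sketched in a few lines of code. This is a minimal illustration, not the patent's implementation: the window size and the two variance thresholds are hypothetical values, and a real system would tune them to the sensor's sampling rate and noise floor.

```python
# Sketch of variance-based motion segmentation. Window size and
# thresholds are illustrative assumptions, not values from the patent.
from statistics import pvariance

def segment_motions(signal, window=5, static_thresh=0.01, motion_thresh=0.05):
    """Split an acceleration series into motion sessions.

    Computes a sliding-window variance (curve 370 in FIG. 3); values
    above ``motion_thresh`` mark significant motion, and runs whose
    variance falls below ``static_thresh`` are static gaps that end a
    session. Returns a list of (start, end) index pairs.
    """
    n = len(signal)
    variance = [pvariance(signal[max(0, i - window):i + 1]) if i > 0 else 0.0
                for i in range(n)]
    sessions, start = [], None
    for i, v in enumerate(variance):
        if v > motion_thresh and start is None:
            start = i                      # motion begins
        elif v < static_thresh and start is not None:
            sessions.append((start, i))    # static gap ends the session
            start = None
    if start is not None:
        sessions.append((start, n))
    return sessions

# A flat signal with two bursts of oscillation yields two motion sessions.
signal = [0.0] * 20 + [1.0, -1.0] * 10 + [0.0] * 20 + [2.0, -2.0] * 10 + [0.0] * 20
print(segment_motions(signal))
```

The static gaps act as delimiters, so the number of detected sessions depends only on how many distinct bursts of motion the worker produced, not on their amplitude.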
[0033] With reference to FIG. 4, and continuing reference to FIG. 1, there is shown a method 400 of processing acceleration data relating to worker (i.e., human) motion and activity data, and building a worker's semantic activity model. The semantic activity model corresponds to a collection of the worker's activity data. First, at step S401, the worker performs a manufacturing task and creates a series of motions, and the sensor module 13 worn by the worker senses the motions and generates acceleration signals that are transmitted to a detector module 14. The detector module 14 converts the acceleration signals into industrially compatible signals. Alternatively, in some embodiments, functions of detector module 14 can be performed by sensor module 13. Next, at step S410, the acceleration signals are transmitted to computer system 15. Then, at step S420, the analysis module 130 processes and analyzes the acceleration signals, and the movements of the worker are abstracted, modeled and categorized as certain types of motions. Such modeling can be based on the use of known analysis techniques, such as, for example, the hidden Markov model and the ant colony model. Finally, at steps S430 and S440, the worker's semantic activity model and worker activity data are stored in the storage module 140, so that they can subsequently be used for future production optimization applications, in combination with functions performed by the various modules of computer system 15, including MES module 190, or for other purposes, such as for designing intuitive and friendly human machine interfaces ("HMI") to control industrial machines.
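The abstraction step (S420) could, in its simplest form, summarize each segmented motion session as a feature vector. The patent points to hidden Markov models and the ant colony model; the sketch below deliberately substitutes a much simpler hypothetical feature set (duration, mean absolute acceleration, variance) purely to illustrate how a "semantic activity model" might be represented and stored.

```python
# Minimal stand-in for step S420: each segmented session is reduced to
# a small feature vector. The feature choice (duration, mean absolute
# acceleration, variance) is an illustrative assumption; the patent
# suggests richer models such as hidden Markov models.
from statistics import mean, pvariance

def session_features(signal, session):
    start, end = session
    window = signal[start:end]
    return {
        "duration": end - start,
        "mean_abs": mean(abs(x) for x in window),
        "variance": pvariance(window),
    }

def build_activity_model(signal, sessions):
    """A semantic activity model: an ordered list of per-motion feature vectors."""
    return [session_features(signal, s) for s in sessions]
```

An ordered list of such vectors preserves both what each motion looked like and the sequence in which the motions occurred, which is what the comparison steps below rely on.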
[0034] A stored worker's semantic activity model (worker activity data) can be integrated into the functions of the computer system 15, including MES module 190 (or another MES), in several ways, to improve production efficiency and/or safety. With reference to FIG. 5, and continued reference to FIG. 1, one such method of integration includes a method 500 to improve the manufacturing process efficiency of a worker, such as a worker being trained to work on a production line (trainee worker). Embodiments of this method relate to the training and evaluation of a worker's skills using semantic activity information (worker activity data). At step S510, as described above, a sensor module 13 is attached to the body of a worker in training. Next, at step S520, the trainee mimics a standard manufacturing activity and creates a series of motions to finish a certain production task, such as, for example, drilling a hole in a plate or cutting a bushing piece. At step S530, motions performed by the worker are sensed by sensor module 13, worn by the worker, and the generated acceleration signals are transmitted to a detector module 14. The detector module 14 converts the acceleration signals into industrially compatible signals. At step S540, these acceleration signals are then transmitted to computer system 15. At step S550, analysis module 130 processes and analyzes the received acceleration signals to abstract the features representing the worker's activity, and to store a semantic activity model of the worker's activity (worker activity data) at storage module 140. Such features can be obtained by known analysis techniques, such as the hidden Markov model and the ant colony model. Details can be found in books or articles, such as Gernot A. Fink, Markov Models for Pattern Recognition: From Theory to Applications, November 2010, and Christoph Amma, et al., Airwriting Recognition using Wearable Motion Sensors.
[0035] At step S560, a predefined worker's semantic activity model, previously stored at storage module 140, is used as a standard manufacturing activity benchmark (benchmark activity data). At step S570, the features of the trainee worker's activity model are compared by the comparison module 150 with those of the benchmark activity data obtained at step S560. At step S580, an evaluation is performed by the match module 160 to determine whether the trainee worker's activity model suitably matches the predefined standard activity model. If yes, the flow proceeds to step S590, and the trainee worker is accepted, by the acceptance module 170, as a qualified personnel resource for the particular performed manufacturing process. After such acceptance, the trainee worker can be allocated by the MES module 190 (or other module) as an appropriate personnel resource for certain designated manufacturing processes. If the trainee's activity model does not suitably match the predefined standard activity model, the trainee worker is not accepted by the acceptance module 170, and the method flow proceeds back to step S520, where the trainee worker is required to mimic the standard activity again.
[0036] Thus, by way of embodiments of the invention, the skills of a trainee worker can be efficiently evaluated. An expert worker's (operator's) activities can be recorded when the expert worker performs a certain production task. Those recorded activities can then be analyzed, abstracted and stored as a standard semantic activity model, serving as a benchmark for the production task. When other workers, such as, for example, trainee workers, perform the same production task, their motions can be sensed, analyzed and compared with the standard semantic activity model.
Therefore, workers can be trained to perform the production task as close to optimally as possible by benchmarking their motions with the benchmark activity model of the expert worker, and their operation skills can be evaluated by quantifying the difference between their motions and the benchmark semantic activity model. The information resulting from such evaluation and quantification can be used by the MES module (or another MES or other module or computer system) to improve the manufacturing process efficiency.
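As a rough illustration of the comparison and match evaluation (steps S570-S580), a trainee's activity model could be scored against the benchmark with a normalized per-feature distance and an acceptance threshold. Both the distance measure and the threshold value below are assumptions made for illustration, not the patent's method; an HMM-based system would use a likelihood score instead.

```python
# Hypothetical sketch of the comparison module (S570) and match module
# (S580): a normalized per-feature distance over matched motion
# sessions, with an illustrative acceptance threshold.
def model_distance(worker_model, benchmark_model):
    """Average normalized feature difference across matched sessions."""
    if len(worker_model) != len(benchmark_model):
        return float("inf")  # an omitted or extra motion cannot match
    total = 0.0
    for w, b in zip(worker_model, benchmark_model):
        for key in b:
            denom = abs(b[key]) or 1.0   # normalize by the benchmark value
            total += abs(w[key] - b[key]) / denom
    return total / (len(benchmark_model) * len(benchmark_model[0]))

def matches_benchmark(worker_model, benchmark_model, threshold=0.2):
    """True if the trainee's motions are close enough to the expert's."""
    return model_distance(worker_model, benchmark_model) <= threshold
```

The same distance value also quantifies the skill gap mentioned above: a supervisor can track it across training attempts to see whether the trainee is converging on the expert's motions.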
[0037] With reference to FIG. 6, and continued reference to FIG. 1, there is shown a method 600 for improving manufacturing process safety by using worker semantic activity information (worker activity data). At step S610, one or more sensor modules 13 are installed on the body of a worker who is engaged in a certain production task. At step S620, the worker performs a series of motions as he completes the production task. At step S630, the sensor module 13 senses these motions and generates a series of acceleration signals, and the generated acceleration signals are transmitted to a detector module 14. The detector module 14 converts the acceleration signals into industrially compatible signals. At step S640, the acceleration signals are transmitted to computer system 15. At step S650, the analysis module 130 of computer system 15 processes and analyzes the received acceleration signals, abstracts the features and creates an activity model of the worker's activity. The features can be obtained by known analysis techniques, such as the hidden Markov model and the ant colony model, which are mentioned above. Next, at step S660, a predefined worker's semantic activity model is loaded into the comparison module 150 as the benchmark of the standard activity for the particular production task (benchmark activity data). At step S670, the worker's semantic activity model is compared by the comparison module 150 with the benchmark semantic activity model through their features. At step S680, an evaluation is performed by the comparison module 150 to determine whether the worker's semantic activity model suitably matches the benchmark semantic activity model, or, in other words, whether the worker performs the right manufacturing activities for this production task. If yes, the flow proceeds to step S690a, no action is taken, and the worker's activities continue to be sensed and monitored.
If no, the flow proceeds to step S690b, and alert module 180 issues a warning that an incorrect activity has been performed by the worker, and alerts that remedial measures, such as switching off critical machines, must be taken to avoid imminent accidents. Alternatively, portions of a manufacturing line can be automatically shut down, or, in certain circumstances, the production line can be allowed to continue running, with the incorrect activity being logged for later review.
[0038] Thus, by way of method 600, computer system 15 can set a predefined semantic activity model as the standard operations which must be executed by workers when they perform a certain production task. By on-line sensing and recognition of workers' motions, and comparison of them with a standard benchmark semantic activity model, it can be determined whether a certain task is being performed correctly. Specifically, it can be determined whether all the necessary operations have been executed correctly (e.g., no omissions or incorrect operations), and whether the correct sequence of operations is preserved (e.g., no incorrect order of operations). The quick detection of any abnormal operation can be the basis for taking appropriate remedial measures to avoid uncontrollable risks and improve manufacturing safety.
[0039] As described above, embodiments of the invention can provide for the integration, into an MES and/or other computer system, of a worker's motions and semantic activity information, acquired by a motion capture and recognition system based on inertial sensors (e.g., accelerometers and/or gyroscopes).
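The on-line correctness check just described — no omitted operations and no out-of-order operations — can be sketched as a comparison of a recognized operation sequence against the standard sequence. The operation names below are hypothetical examples, and real recognized operations would come from the activity-model matching described earlier.

```python
# Sketch of the operation-sequence check: detect omitted steps and
# order violations relative to a standard sequence. Operation names
# are hypothetical examples.
def check_operations(performed, standard):
    """Return a list of problems: omitted steps and order violations."""
    problems = []
    for step in standard:
        if step not in performed:
            problems.append(f"omitted: {step}")
    # The steps that were performed must appear in the standard order.
    expected = [s for s in standard if s in performed]
    observed = [s for s in performed if s in standard]
    if observed != expected:
        problems.append("incorrect order of operations")
    return problems

standard = ["mount plate", "hold machine arm", "drill hole", "remove plate"]
print(check_operations(["mount plate", "drill hole", "hold machine arm"], standard))
```

An empty result would correspond to step S690a (no action taken); any reported problem would correspond to step S690b, triggering the alert module.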
[0040] By way of embodiments of the invention, a cost-saving and reliable solution for acquiring workers' semantic activity information is provided. Embodiments of the invention use inexpensive inertial sensors to detect acceleration signals that correspond to workers' motions, unlike costly vision detection solutions based on cameras. In addition, the inertial sensors can be very compact, so a worker can wear these sensors on the body, and move and perform tasks freely. The acceleration detection will not be affected by moving backgrounds and varying lighting conditions, which are still the main technical challenges for vision-based motion recognition systems.
[0041] By acquiring a worker's semantic activity information by a motion capture and recognition system, embodiments of the invention can improve a worker's operation consistency and efficiency. Embodiments of the invention can provide an interactive way to help a trainee worker to practice a standard operation activity. By comparing the trainee worker's sensed motions with a standard benchmark activity model, the trainee worker or a supervisor can evaluate the training process, and keep improving the trainee worker's operation skill by reducing the gap between the trainee worker's motions and the standard benchmark activity model. By improving the trainee worker's skills, their operation consistency can be optimized, and operation efficiency of a production process can be enhanced.
[0042] Further, embodiments of the invention provide a way to maintain a safe production environment by monitoring a worker's operation activities, and reacting to misoperations more quickly. Workers' real-time semantic activity information is obtained by sensing and recognition of the workers' motions. The worker's semantic activity is compared with a standard benchmark activity model to determine whether a specific operation task is performed correctly (i.e., with no omitted steps and in a correct order of operation steps). Accordingly, even if incorrect operations are performed, they can be detected quickly, and appropriate corrective actions will be taken to avoid uncontrolled accidents and preserve the safety of the manufacturing system.
[0043] Hence, embodiments of the invention can play an important role in realizing a human-centered manufacturing execution system (HcMES) to coordinate human and machine activities, and achieve optimal automatic manufacturing production.
[0044] As described above, embodiments of computer system 15 can include one or more computers. In addition, described modules and functions can be performed on a single computing device, or by multiple computing devices. Multiple modules and/or functions can be included and/or performed by a single computing device or by multiple computing devices. Further, a single module and/or function can be included and/or performed by multiple computing devices.
[0045] Further, the described modules can be implemented by software, hardware, or a combination of software and hardware. Embodiments of the invention can also be implemented as a recording medium containing program code of software that implements the functions of the above embodiments, whereby the recording medium is supplied to a system or apparatus, whose computing device (including a processor such as a CPU) then reads the program code out of the recording medium and executes it.
[0046] In such case, the program code itself read out of the computer-readable recording medium (storage medium) will implement the functions of the above embodiments, and the recording medium which stores the program code will constitute the present invention. Examples of the recording medium used to supply the program code include, for example, a flexible disk, hard disk, optical disk, magneto-optical disk, CD-ROM, CD-R, magnetic tape, non-volatile memory card, and ROM. Such a recording medium could take a transitory and/or non-transitory form.
[0047] Also, the functions of the above embodiments can be implemented not only by the program code read out and executed by a computing device, but also by part or all of the actual processing executed according to instructions from the program code by an OS (operating system) running on the computer.
[0048] Furthermore, the functions of the above embodiments can also be implemented by part or all of the actual processing executed by a CPU or the like contained in a function expansion card inserted in a computing device, or a function expansion unit connected to the computing device if the processing is performed according to instructions from the program code that has been read out of the recording medium and written into memory on the function expansion card or unit.
[0049] Reference has been made above in detail to specific embodiments of the invention, including the best modes contemplated by the inventors for carrying out the invention. Examples of these specific embodiments are illustrated in the accompanying drawings. While the invention is described in conjunction with these specific embodiments, it will be understood that it is not intended to limit the invention to the described embodiments. On the contrary, it is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims. In the above description, specific details are set forth in order to provide a thorough understanding of embodiments of the invention. Embodiments of the invention may be practiced without some or all of these specific details. In addition, well known features may not have been described in detail to avoid unnecessarily obscuring the invention. Further, portions of different embodiments and/or drawings can be combined, as would be understood by one of skill in the art.
[0050] Thus, it is to be understood that the exemplary embodiments are merely illustrative of the invention and that many variations of the above-described embodiments can be devised by one skilled in the art without departing from the scope of the invention. It is therefore intended that all such variations be included within the scope of the following claims and their equivalents.

Claims

What is claimed is:
1. A system for creating activity data of a worker, the system comprising:
an analysis module configured to analyze received worker acceleration data, and generate worker activity data from the worker acceleration data;
wherein the worker acceleration data is received from a sensor module configured to sense the motions of a worker and generate worker acceleration data related to the sensed motions of the worker;
wherein the sensor module includes one or more inertial sensors, and wherein the sensor module is disposed on a body of the worker.
2. The system of Claim 1, further comprising the sensor module.
3. The system of Claim 1 or Claim 2, further comprising:
a comparison module, configured to compare the worker activity data to benchmark activity data to determine if the worker activity data matches the benchmark activity data;
wherein the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, and the worker is attempting to perform the task.
4. The system of Claim 3, further comprising:
an acceptance module, configured to accept the worker activity data when the worker activity data is determined to match the benchmark activity data.
5. The system of Claim 4, further comprising an MES module, wherein the worker activity data is used by the MES module to designate the worker as a production resource.
6. The system of Claim 3, further comprising:
an alert module, configured to generate an alert when the worker activity data is determined not to match the benchmark activity data;
wherein, the alert module generates the alert to signify that the task has not been performed properly by the worker.
7. The system of Claim 6, wherein the alert module generates an alert that stops a portion of a production line.
8. The system of Claim 1, wherein the one or more inertial sensors comprise at least one of an accelerometer and a gyroscope.
9. The system of Claim 1 or Claim 3, wherein the activity data corresponds to a semantic activity model.
10. The system of Claim 1, further comprising a storage module, wherein the worker activity data is stored in the storage module.
11. A method of creating activity data of a worker, the method comprising:
analyzing received worker acceleration data, and generating worker activity data from the worker acceleration data;
wherein the worker acceleration data is received from a sensor module that senses motions of a worker and generates worker acceleration data related to the sensed motions of the worker;
wherein the sensor module includes one or more inertial sensors, and wherein the sensor module is disposed on a body of the worker.
12. The method of Claim 11, further comprising: sensing, with a sensor module, the motions of a worker and generating worker acceleration data related to the sensed motions of the worker.
13. The method of Claim 11 or Claim 12, further comprising:
comparing the worker activity data to benchmark activity data to determine if the worker activity data matches the benchmark activity data,
wherein the benchmark activity data is activity data previously generated from the motions of an expert worker while performing a task, and the worker is attempting to perform the task.
14. The method of Claim 13, further comprising:
accepting the worker activity data when the worker activity data is determined to match the benchmark activity data.
15. The method of Claim 14, wherein the worker activity data is used by an MES module to designate the worker as a production resource.
16. The method of Claim 13, further comprising:
generating an alert when the worker activity data is determined not to match the benchmark activity data;
wherein, the alert is generated to signify that the task has not been performed properly by the worker.
17. The method of Claim 16, wherein the generated alert stops a portion of a production line.
18. The method of Claim 11, wherein the one or more inertial sensors comprise at least one of an accelerometer and a gyroscope.
19. The method of Claim 11 or Claim 13, wherein the activity data corresponds to a semantic activity model.
20. The method of Claim 11, further comprising: storing the worker activity data.
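The flow recited in the claims — sense acceleration, derive activity data, compare it against an expert benchmark, then accept or alert (claims 1/11, 3/13, 4/14, and 6/16) — can be sketched in Python. This is an illustrative toy, not the patented implementation: the threshold classifier, the label-agreement metric, and all names (`classify_activity`, `matches_benchmark`, `min_agreement`) are hypothetical.

```python
def classify_activity(accel_samples, threshold=1.5):
    """Toy analysis step (claims 1/11): label each acceleration
    magnitude sample as 'move' or 'idle' by a fixed threshold."""
    return ["move" if abs(a) > threshold else "idle" for a in accel_samples]

def matches_benchmark(worker_activity, benchmark_activity, min_agreement=0.9):
    """Toy comparison step (claims 3/13): the worker's activity
    sequence matches when at least min_agreement of its labels
    agree with an expert benchmark of the same length."""
    if len(worker_activity) != len(benchmark_activity):
        return False
    agree = sum(w == b for w, b in zip(worker_activity, benchmark_activity))
    return agree / len(benchmark_activity) >= min_agreement

# Expert benchmark recorded earlier; worker attempt sensed now.
benchmark = classify_activity([0.1, 2.0, 2.2, 0.2, 1.9, 0.1])
worker    = classify_activity([0.2, 2.1, 2.0, 0.1, 1.8, 0.3])

if matches_benchmark(worker, benchmark):
    print("accept: worker activity matches benchmark")  # claims 4/14
else:
    print("alert: task not performed properly")         # claims 6/16
```

A production analysis module would of course use a far richer representation than a single threshold — the claims themselves refer to a semantic activity model (claims 9 and 19).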
PCT/CN2012/082496 2012-09-29 2012-09-29 system and method for improving manufacturing production WO2014047944A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/082496 WO2014047944A1 (en) 2012-09-29 2012-09-29 system and method for improving manufacturing production

Publications (1)

Publication Number Publication Date
WO2014047944A1 true WO2014047944A1 (en) 2014-04-03

Family

ID=50386906

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/082496 WO2014047944A1 (en) 2012-09-29 2012-09-29 system and method for improving manufacturing production

Country Status (1)

Country Link
WO (1) WO2014047944A1 (en)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102334086A (en) * 2009-01-26 2012-01-25 泽罗技术(2009)有限公司 Device and method for monitoring an object's behavior
EP2236081A1 (en) * 2009-04-02 2010-10-06 Tanita Corporation Body movement detecting apparatus and body movement detecting method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107550473A (en) * 2016-06-30 2018-01-09 欧姆龙株式会社 Abnormality processing system
CN110599018A (en) * 2019-08-30 2019-12-20 苏州浪潮智能科技有限公司 Production task configuration system and method based on MES system
CN110599018B (en) * 2019-08-30 2022-06-03 苏州浪潮智能科技有限公司 Production task configuration system and method based on MES system

Similar Documents

Publication Publication Date Title
US11592800B2 (en) Abnormality detector of a manufacturing machine using machine learning
US11442429B2 (en) Systems and methods for advance anomaly detection in a discrete manufacturing process with a task performed by a human-robot team
US20210216773A1 (en) Personal protective equipment system with augmented reality for safety event detection and visualization
US11538281B2 (en) Worker task performance safely
EP3270243B1 (en) Methods and systems for context based operator assistance for control systems
US20190164110A1 (en) Worker management device
CN110790105B (en) Elevator door system diagnosis and decline time prediction method and diagnosis and prediction system
WO2018158622A1 (en) Work management apparatus, method, and program
JP7243979B2 (en) Robot interference determination device, robot interference determination method, robot control device, robot control system, human motion prediction device, and human motion prediction method
US11472028B2 (en) Systems and methods automatic anomaly detection in mixed human-robot manufacturing processes
WO2018076992A1 (en) Production-line monitoring system and method
US20200000414A1 (en) Monitors for movements of workers
CN112106084A (en) Personal protective device and security management system for comparative security event evaluation
CN111308925A (en) Information processing apparatus, production instruction support method, and computer program
CN108347433B System for detecting intrusion into a communication environment, and intrusion detection method
EP4038557A1 (en) Method and system for continuous estimation and representation of risk
WO2014047944A1 (en) system and method for improving manufacturing production
JP7520711B2 (en) Work behavior recognition system and work behavior recognition method
US11669084B2 (en) Controller and control system
Huber et al. Addressing Worker Safety and Accident Prevention with AI
Lee et al. UAV Pilot Status Identification Algorithm Using Image Recognition and Biosignals
WO2021166320A1 (en) Moving body abnormal state monitoring system
US20210374639A1 (en) System and method for management and support of workplace
Guidolin et al. Trust the Robot! Enabling Flexible Collaboration With Humans via Multi-Sensor Data Integration
US20240329400A1 (en) System that performs mass production process analysis with mixed reality glasses with eye tracking and accelerometer

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 12885477

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12885477

Country of ref document: EP

Kind code of ref document: A1