WO2021006183A1 - Work classification system and work classification program - Google Patents
Work classification system and work classification program
- Publication number
- WO2021006183A1 (PCT/JP2020/026072)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- work
- worker
- equipment
- area
- classification
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Definitions
- The present invention relates to a work classification system and a work classification program.
- In the technique of Patent Document 1, the posture or motion of the worker is determined, and work motions directly related to the work and non-work motions not directly related to it are extracted from the determined posture or motion. A burden value is then calculated for each extracted motion, and load evaluation information for the non-work motions is presented from the result; this information is used for purposes such as equipment layout, line design, line improvement, and safety improvement.
- In the technique of Patent Document 2, alarm information indicating that a worker is performing non-routine work and the worker's work record information are acquired, the two are associated with each other, and work is allocated to each worker according to the number of workers available for routine work. In this way, work is managed by distinguishing non-routine work from routine work.
- In the technique of Patent Document 3, a first worker image showing the worker at the start of work and a second worker image showing the worker at the end of work are acquired, and the work time required for one work item is measured. The work item is recognized from the second worker image, and information on the worker and the implementation status of each work item is displayed based on the measured work time and the recognition result. Because the worker is identified from the images, the working time acquired, and the work item recognized automatically, the worker does not need to enter information by operating a terminal, so a decrease in work efficiency is avoided.
- In Patent Document 3, however, the start and end of work are determined from images captured when the worker enters and exits the work area, so the work can be classified only in units of entry to and exit from the work area. This technique, too, therefore does not classify work according to the operating state of the equipment.
- The present invention has been made in view of the above circumstances, and its object is to provide a work classification system and a work classification program capable of distinguishing work performed while the equipment is in operation from work performed while the equipment is stopped.
- A work classification system comprising: an equipment operating state acquisition unit that acquires the operating state of equipment; a camera that captures an area including a worker; a worker position/orientation determination unit that determines the position and orientation of the worker from the video data captured by the camera; and a work classification unit that classifies setup change work based on the combination of the operating state of the equipment and the position and orientation of the worker.
- The setup change work is work associated with producing or processing one product. A predetermined range within the shooting range of the camera and in the vicinity of the equipment is set as a first area. The work classification unit classifies the setup change work as internal setup work when the worker is present in the first area and faces the equipment while the equipment is stopped; as external setup work when the worker is in the first area but not facing the equipment while the equipment is in operation; as waste when, while the equipment is stopped, the worker is in the first area but not facing the equipment or is outside the first area; and as absence when the worker is outside the first area while the equipment is in operation. The work classification system according to (1) above.
- The setup change work is work associated with producing or processing one product. A predetermined range within the shooting range of the camera and in the vicinity of the equipment is defined as a first area, and a range outside the first area is defined as a second area. The work classification unit classifies the setup change work as internal setup work when the worker is present in the first area and faces the equipment while the equipment is stopped; as external setup work when, while the equipment is in operation, the worker is in the first area but not facing the equipment or is present in the second area; as waste when, while the equipment is stopped, the worker is in the first area but not facing the equipment or is outside the first area; and as absence when the worker is outside both the first area and the second area while the equipment is in operation. The work classification system according to (1) above.
- The worker position/orientation determination unit extracts the coordinate values of predetermined joints or of the skeleton of the worker from the video data and determines the position and orientation of the worker from those coordinate values. The work classification system according to any one of (1) to (4) above.
- The work classification system according to any one of (1) to (5) above, further comprising a video recording unit that records the video data, and a linking unit that links the classification result classified by the work classification unit with the video data.
- The linking unit further links the operating state acquired by the equipment operating state acquisition unit and the position and orientation of the worker determined by the worker position/orientation determination unit with the classification result and the video data.
- The setup change work is work associated with producing or processing one product. Within the range captured in the video data, a predetermined range in the vicinity of the equipment is set as a first area. The setup change work is classified as internal setup work when the worker is present in the first area and faces the equipment while the equipment is stopped; as external setup work when the worker is in the first area but not facing the equipment while the equipment is in operation; as waste when, while the equipment is stopped, the worker is in the first area but not facing the equipment or is outside the first area; and as absence when the worker is outside the first area while the equipment is in operation. The work classification program according to (8) above.
- The setup change work is work associated with producing or processing one product. Within the range captured in the video data, a predetermined range in the vicinity of the equipment is set as a first area, and a range outside the first area as a second area. The setup change work is classified as internal setup work when the worker is present in the first area and faces the equipment while the equipment is stopped; as external setup work when, while the equipment is in operation, the worker is in the first area but not facing the equipment or is present in the second area; as waste when, while the equipment is stopped, the worker is in the first area but not facing the equipment or is outside the first area; and as absence when the worker is outside both the first area and the second area. The work classification program according to (8) above.
- In step (b), the coordinate values of predetermined joints or of the skeleton of the worker are extracted from the video data, and the position and orientation of the worker are determined from those coordinate values. The work classification program according to any one of (8) to (11) above.
- In step (g), the operating state and the position and orientation of the worker are further linked with the classification result and the video data.
- According to the present invention, the operating state of the equipment is acquired, the position and orientation of the worker are determined, and the setup change work is classified from the combination of the two. As a result, work performed while the equipment is in operation can be distinguished from work performed while the equipment is stopped.
- FIG. 1 is an explanatory diagram illustrating the functional configuration of the work classification system of the embodiment.
- The work classification system 1 has a server 10, a camera 20 connected to the server 10, and a terminal 18 connected to the server 10.
- The server 10 has the functions of a worker identification unit 11, an equipment operating state acquisition unit 12, a video recording unit 13, a worker position/orientation determination unit 14, a work classification unit 15, a product information acquisition unit 16, and a linking unit 17.
- The server 10 is a computer and has a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), an HDD (Hard Disk Drive), and an interface (IF) for external connection. The server 10 also has input devices such as a keyboard and a mouse, and an output device such as a display.
- Each function is realized by the CPU reading and executing a program stored in the ROM or HDD.
- The interface communicates with the equipment 50, the camera 20, and the terminal 18. It includes, for example, network interfaces conforming to a standard such as Ethernet (registered trademark); various local connection interfaces such as SATA, PCI Express, USB, and IEEE 1394; and wireless communication interfaces such as Bluetooth (registered trademark) and IEEE 802.11. The interface may also be a line for making a local analog connection with the equipment 50 and the camera 20, or a wireless LAN (WiFi) or mobile phone line for connecting to the terminal 18.
- The terminal 18 is a computer; for example, a personal computer (PC), a tablet, a smartphone, or the like is used. These are computers used by workers, managers, and other stakeholders, and they have input and output devices such as keyboards, mice, touch panels, and displays.
- The worker identification unit 11 individually recognizes each worker 30 who has entered the work area (described in detail later). The worker 30 is identified, for example, by using an RFID (Radio Frequency Identification) tag 31, or by biometric authentication such as face authentication, iris authentication, or fingerprint authentication.
- The RFID tag 31 is carried by the worker 30 or attached to work clothes, a hat, a helmet, or the like, and a reading device for the RFID tag 31 is installed in the work area. The worker identification unit 11 has the reading device read the RFID tag 31 to acquire the identification information of each worker. When biometric authentication such as face authentication, iris authentication, or fingerprint authentication is used, the worker identification unit 11 identifies the worker 30 using the corresponding authentication sensors and cameras.
- The equipment operating state acquisition unit 12 acquires the operating state of the equipment 50. The operating state of the equipment 50 is, for example, operation (normal operation for production), completion stop (machining has been completed and the equipment has stopped), abnormal stop (an abnormality has occurred), or an idle stop in which the equipment 50 is not being used.
- The equipment operating state acquisition unit 12 acquires the operating state by, for example, detecting the emission color of a stacked signal lamp 51 provided on the equipment 50, or by detecting its switching signal. A dedicated sensor (not shown) is used to detect the emission color or the switching signal, and the equipment operating state acquisition unit 12 (server 10) is connected to that sensor.
- The stacked signal lamp 51 is also called a three-color stack light, a signal tower, or the like; most light lamps of three colors, green, yellow, and red, according to the operating state of the equipment 50. The stacked signal lamp 51 is usually green during normal operation, yellow when stopped on completion of production, and red when an abnormality has occurred.
- When the equipment 50 outputs a signal indicating its operating state, the equipment operating state acquisition unit 12 may acquire that signal instead; in this case, the equipment operating state acquisition unit 12 (server 10) is connected to the equipment 50.
- The equipment operating state acquisition unit 12 may also acquire the operating state from the video data of the camera 20. In this case, the stacked signal lamp 51 is placed within the shooting range of the camera 20, and the equipment operating state acquisition unit 12 detects the emission color of the stacked signal lamp 51 from the video data.
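As a concrete illustration of this detection, the emission color can be found by thresholding a fixed region of interest around the stacked signal lamp 51 in HSV color space. The following is a minimal sketch assuming OpenCV; the function name, the region-of-interest convention, and the HSV thresholds are illustrative assumptions, not part of the patent, and would need tuning to the actual lamp and lighting (red hue also wraps around the top of the hue range, which is simplified here).

```python
import cv2
import numpy as np

# Illustrative HSV ranges; real thresholds depend on the lamp and the lighting.
HSV_RANGES = {
    "green":  ((40, 80, 80), (85, 255, 255)),   # normal operation
    "yellow": ((20, 80, 80), (35, 255, 255)),   # completion stop
    "red":    ((0, 80, 80), (10, 255, 255)),    # abnormal stop (wrap-around ignored)
}

def detect_lamp_state(frame, lamp_roi):
    """Infer the operating state from the stacked signal lamp in one frame.

    frame    -- BGR frame from the camera 20
    lamp_roi -- (x, y, w, h) region containing the lamp 51, set in advance
    """
    x, y, w, h = lamp_roi
    hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    best_color, best_count = None, 0
    for color, (lo, hi) in HSV_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        count = cv2.countNonZero(mask)
        if count > best_count:
            best_color, best_count = color, count
    if best_count < 50:   # no lamp lit: equipment stopped with lights off
        return "stopped"
    return "operating" if best_color == "green" else "stopped"
```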
- The video recording unit 13 records the video data from the camera 20 over time; it is, for example, the HDD in the server 10.
- The worker position/orientation determination unit 14 detects the position and orientation of the worker 30 from the video data. The camera 20 has a shooting range 25 that includes the work area in which the worker 30 works on the equipment 50; as described above, when the operating state of the equipment 50 is detected from the video data of the camera 20, the stacked signal lamp 51 is also included in the shooting range 25.
- Existing technology can be used to detect the position and orientation of the worker 30. For example, technologies that recognize posture by estimating the human skeleton from two-dimensional video data, such as OpenPose (see https://github.com/CMU-Perceptual-Computing-Lab/openpose) and DeepPose (see https://www.slideshare.net/mitmul/deeppose-human-pose-estimation-via-deep-neural-networks), can be used (hereinafter referred to as skeleton recognition technology).
- Instead of skeleton recognition technology, a technology using a depth camera (RGB-D camera) or TOF (Time of Flight) sensing, such as Kinect (registered trademark) of Microsoft Corporation, can also be used. The skeleton recognition technology, however, can recognize a person's posture from two-dimensional video data, so a general movie camera can be used as the camera 20 and the system configuration is simplified.
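As an example of consuming skeleton recognition output, OpenPose can write one JSON file per frame in which each detected person carries a flat `pose_keypoints_2d` list of (x, y, confidence) triples. The sketch below parses that format; the confidence threshold and the restriction to a few BODY_25 keypoints are illustrative assumptions.

```python
import json

# OpenPose BODY_25 indices for the joints used later in this description.
JOINT_INDEX = {"neck": 1, "r_shoulder": 2, "r_wrist": 4, "l_shoulder": 5, "l_wrist": 7}

def load_people(json_path, min_conf=0.1):
    """Return a list of {joint_name: (x, y)} dicts, one per detected person."""
    with open(json_path) as f:
        frame = json.load(f)
    people = []
    for person in frame.get("people", []):
        flat = person["pose_keypoints_2d"]  # [x0, y0, c0, x1, y1, c1, ...]
        joints = {}
        for name, idx in JOINT_INDEX.items():
            x, y, conf = flat[3 * idx:3 * idx + 3]
            if conf >= min_conf:   # keep only confidently detected joints
                joints[name] = (x, y)
        people.append(joints)
    return people
```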
- FIGS. 2a and 2b are explanatory views for explaining the position and orientation of the worker 30 with respect to the equipment 50. FIG. 2a shows an example in which the equipment 50 is installed independently, such as an automatic processing machine 50a; FIG. 2b shows an example in which the equipment 50 is a production line 50b for assembly-line work.
- In the present embodiment, a first region 41 and a second region 42 are set within the shooting range 25, as shown in FIGS. 2a and 2b.
- The first region 41 is a predetermined range in the vicinity of the equipment, for example a range in which the worker 30 can directly work on the equipment 50 or on a product set in it, such as the range the worker 30 can reach by extending an arm toward the automatic processing machine 50a, the production line 50b, or the products set in them.
- The second region 42 is a region outside the first region 41; the two regions are set exclusively of each other. The shooting range 25 includes at least the first region 41 and the second region 42; in FIGS. 2a and 2b, the area outside the second region 42 is also captured.
- The worker position/orientation determination unit 14 determines from the video data whether the worker 30 is present in the first region 41, present in the second region 42, or present in neither (absent). The person in the image is located by three-dimensional coordinate values; the three-dimensional coordinate system and its origin are set in advance, as are the coordinate values defining the ranges of the first region 41 and the second region 42.
- When the equipment 50 is an automatic processing machine 50a, the first region 41 is set as the range from which the worker 30 can access the outer surfaces of the automatic processing machine 50a, such as the operation panel, outer door, raw-material supply port, and access panel (range a in the figure). The second region 42 is set further outside, from the outer peripheral end of the first region 41 (range b in the figure).
- When the equipment 50 is a production line 50b, a range within a certain distance of and parallel to the production line 50b is set as the first region 41 (range a in the figure), the distance being one from which the worker 30 can access the work on the line. The second region 42 is again set further outside, from the outer peripheral end of the first region 41 (range b in the figure).
- The second region 42 is preferably a range from which the worker 30 can immediately move to work, for example a range extending 1 m from the outer peripheral end of the first region 41. These ranges of the first region 41 and the second region 42 are merely examples and can be set arbitrarily.
- To detect the position of the worker 30, the worker position/orientation determination unit 14 first extracts the coordinate values of predetermined joints from the skeleton of the worker 30 obtained with the skeleton recognition technology, and determines in which region the worker 30 is present, or that the worker is present in neither. The predetermined joints are, for example, the neck, shoulders, elbows, wrists, hips, knees, and ankles, and may be determined in advance. When the worker 30 straddles the boundary between the first region 41 and the second region 42, the worker is deemed to be present in the region containing the predetermined joints. In this way, it is detected whether the worker 30 is present in one of the regions or in neither. Coordinate values of the skeleton provided by the skeleton recognition technology may be used instead of the joint coordinate values.
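The region test itself can be kept very simple. The sketch below assumes the first region 41 and second region 42 have been set in advance as axis-aligned rectangles in the chosen coordinate system (real installations might use polygons); the specific coordinates and the watched joints are hypothetical.

```python
# Hypothetical region definitions as (xmin, ymin, xmax, ymax) rectangles.
FIRST_REGION = (0.0, 0.0, 3.0, 2.0)
SECOND_REGION = (-1.0, -1.0, 4.0, 3.0)   # surrounds the first region

def in_rect(point, rect):
    x, y = point
    xmin, ymin, xmax, ymax = rect
    return xmin <= x <= xmax and ymin <= y <= ymax

def worker_region(joints, watched=("neck",)):
    """Classify the worker 30 as in 'first', 'second', or 'absent'.

    When the worker straddles a boundary, the region containing the
    predetermined (watched) joints wins, as described above.
    """
    points = [joints[name] for name in watched if name in joints]
    if not points:
        return "absent"
    if any(in_rect(p, FIRST_REGION) for p in points):
        return "first"
    if any(in_rect(p, SECOND_REGION) for p in points):
        return "second"
    return "absent"
```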
- The worker position/orientation determination unit 14 then detects the orientation of the worker 30. Orientation detection is executed only for a worker 30 present in the first region 41.
- The worker position/orientation determination unit 14 determines that the direction in which an arm (particularly the wrist joint) protrudes relative to the joint positions of the neck or shoulders is the direction the person is facing. In FIG. 2a, the worker 30a shown by the solid line is determined to be facing the equipment 50 because both arms protrude in the direction of the equipment 50, while the worker 30b shown by the dotted line is determined not to be facing the equipment 50 because both arms point in another direction (diagonally backward in the figure).
- If at least one arm protrudes in the direction of the equipment 50 relative to the neck or shoulders, the worker 30 is determined to be facing the equipment 50: both arms may protrude toward the equipment 50, or only one hand may. This is because work on the operation panel, such as changing a setting of the machining program or restarting, can be performed with one hand.
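The orientation rule above (an arm protruding toward the equipment relative to the neck or shoulder) can be approximated by comparing the neck-to-wrist vector against the neck-to-equipment vector. The following sketch reuses the 2D joint coordinates from the earlier parsing example; the angular tolerance and the representative equipment point are assumptions.

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 2D vectors; 180 for degenerate vectors."""
    nu, nv = math.hypot(u[0], u[1]), math.hypot(v[0], v[1])
    if nu == 0 or nv == 0:
        return 180.0
    cos = (u[0] * v[0] + u[1] * v[1]) / (nu * nv)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def facing_equipment(joints, equipment_xy, tol_deg=60.0):
    """True if at least one arm protrudes toward the equipment 50."""
    neck = joints.get("neck")
    if neck is None:
        return False
    to_equipment = (equipment_xy[0] - neck[0], equipment_xy[1] - neck[1])
    for wrist in ("r_wrist", "l_wrist"):
        if wrist in joints:
            arm = (joints[wrist][0] - neck[0], joints[wrist][1] - neck[1])
            if angle_between(to_equipment, arm) <= tol_deg:
                return True   # one arm toward the equipment is sufficient
    return False
```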
- When there are a plurality of workers 30, the worker position/orientation determination unit 14 detects the position of each worker 30 and, for each worker 30 present in the first region 41, further determines that worker's orientation.
- A two-dimensional coordinate system may be used instead to determine the position and orientation of the worker. In this case, the camera 20 is installed so as to shoot from above the worker 30 (or from diagonally above), a two-dimensional coordinate system is set on the video data, and the first region 41 and the second region 42 are set within it. The worker position/orientation determination unit 14 then determines whether the worker 30 is present in each region in the same manner as described above.
- The work classification unit 15 classifies the work being performed on the equipment 50 from the information on the operating state of the equipment 50 provided by the equipment operating state acquisition unit 12 and the information on the state (position and orientation) of the worker 30 provided by the worker position/orientation determination unit 14.
- FIG. 3 is a classification judgment table for classifying work. The work classification unit 15 first classifies the state of the worker 30, from the worker's position and orientation, into three categories: equipment-facing work, non-equipment work, and absence.
- Equipment-facing work is the case where a worker 30 present in the first region 41 faces the equipment 50.
- Non-equipment work is the case where a worker 30 present in the first region 41 does not face the equipment 50, and the case where the worker 30 is present in the second region 42. In non-equipment work, the worker 30 is near the equipment 50 and not currently working on it, but could begin working on it immediately; this is why the state is classified separately from absence.
- Absence is the case where the worker 30 is present in neither the first region 41 nor the second region 42.
- The work classification unit 15 then classifies the work from the combination of the operating state of the equipment 50 from the equipment operating state acquisition unit 12 and the state of the worker 30.
- The work to be classified is setup change work: the work performed from the time production or processing of one product is completed, through changing the jigs and tools of the equipment 50 in order to produce or process the next product, until production or processing of that product is completed. Jig and tool changes include exchanging jigs and tools, changing the equipment programs required for production and machining, and replacing or installing the additional parts required for the product to be produced or machined.
- This setup change work is classified into the following four categories: internal setup work, external setup work, waste, and absence.
- Internal setup work is work performed on the equipment 50 while the equipment 50 is stopped. It includes, for example, exchanging jigs and tools, changing the equipment program, installing additional parts, and setting a product to be processed in the equipment 50. These operations can only be performed while the equipment 50 is stopped, so the combination of a stopped state (yellow, red, or lights off) and equipment-facing work is classified as internal setup work.
- External setup work is work performed while the equipment 50 is in operation. It includes, for example, preparing the jigs, tools, and additional parts for producing or processing the next product, carrying out a product whose production or processing has been completed, and cleaning and tidying up around the equipment 50. It is efficient to perform these operations while the equipment 50 is in operation, that is, while the previous product is being produced or processed. The combination of an operating state (green) and non-equipment work is therefore classified as external setup work.
- Waste is the state in which work is being performed on something other than the equipment 50 even though the equipment 50 is stopped, for example searching for the jigs and tools needed for the next product or going to fetch additional parts. Non-equipment work while the equipment is stopped (yellow, red, or off) is therefore classified as waste.
- Absence is the case where the worker 30 is present in neither the first region 41 nor the second region 42 while the equipment is in operation. While the equipment is operating, the worker 30 does not usually need to be near the equipment 50, so this is a normal state and is simply classified as absence.
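Putting the table together, the four categories follow mechanically from the two inputs. A minimal sketch of the judgment table of FIG. 3 follows; the combination of operation and equipment-facing work is not named in the description above, so it is left unclassified here.

```python
def classify_setup_work(equipment_state, worker_state):
    """Classify one moment of setup change work.

    equipment_state -- 'operating' (green) or 'stopped' (yellow, red, or off)
    worker_state    -- 'equipment_work' (in first region, facing equipment),
                       'non_equipment_work' (in first region not facing, or
                       in second region), or 'absent'
    """
    if equipment_state == "stopped":
        if worker_state == "equipment_work":
            return "internal_setup"   # working on the stopped equipment
        return "waste"                # equipment stopped but not being worked on
    if worker_state == "non_equipment_work":
        return "external_setup"       # preparation/cleanup while the line runs
    if worker_state == "absent":
        return "absent"               # normal state during operation
    return None                       # combination not covered by the table
```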
- The work classification unit 15 also records the start and end times of each classified work. Each classified work starts at the time it is classified and ends at the time the classification switches to the next one. Specifically, the times are recorded as follows: when the worker 30 changes orientation within the first region 41, the time of the change is recorded; when the worker 30 enters the first region 41 from the second region 42, or leaves the second region 42, that time is recorded; when the worker 30 returns from absence, the time of entering the second region 42 is recorded; and when there is no change in the position and orientation of the worker 30, the time at which the operating state of the equipment 50 changes is recorded.
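One way to realize this timing rule is to keep the ongoing classification and close a segment whenever the label switches, whatever caused the switch (a change of orientation, a region transition, or a change in the equipment's operating state). A hedged sketch:

```python
from datetime import datetime

class ClassificationLog:
    """Record each classified work with its start and end times."""

    def __init__(self):
        self.segments = []    # finished (label, start, end) tuples
        self._current = None  # (label, start) of the ongoing segment

    def update(self, label, now=None):
        """Feed the current classification; closes a segment on a switch."""
        now = now or datetime.now()
        if self._current is None:
            self._current = (label, now)
        elif self._current[0] != label:
            prev_label, start = self._current
            self.segments.append((prev_label, start, now))
            self._current = (label, now)
```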
- The work classification unit 15 records the classification results in association with each worker 30 identified by the worker identification unit 11. The classification results associated with each worker 30 are recorded, for example, in the HDD of the server 10.
- The product information acquisition unit 16 acquires product information, that is, information on the products produced and processed by the equipment 50. The product information is acquired, for example, when an unprocessed product arrives at the equipment 50 and is recorded together with the time of arrival; the time is recorded again when processing is completed. The product information may be read, for example, from a barcode or an RFID tag attached to the product, may be input by the worker 30 from the equipment 50 or a terminal installed near the equipment 50, or may be retrieved in order from a product list stored in the server 10 or another computer.
- The linking unit 17 links (associates) the worker information from the worker identification unit 11, the operating state of the equipment 50 from the equipment operating state acquisition unit 12 (equipment operation information), the product information from the product information acquisition unit 16, the work classification results from the work classification unit 15, and the video data recorded in the video recording unit 13. The linking unit 17 links each item of information, the classification results, and the video data on the basis of time.
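A straightforward realization of this linking is a time-keyed record that carries all the items side by side; sorted by timestamp, such records can be rendered directly as the time charts and tables described below. The field names are illustrative, not defined by the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LinkedRecord:
    """One time-keyed row tying together the items handled by the linking unit 17."""
    timestamp: datetime
    worker_id: str          # worker information
    equipment_state: str    # equipment operation information
    product_id: str         # product information
    classification: str     # work classification result
    video_file: str         # recorded video data
    video_offset_s: float   # seek position inside the video file

def link_by_time(records):
    """Order the rows by time so charts and tables can be generated directly."""
    return sorted(records, key=lambda r: r.timestamp)
```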
- FIG. 4 is a time chart displaying each item of information and the classification results. In the figure, "operation" means that the equipment 50 is in operation, "equipment" means equipment-facing work, "non-equipment" means non-equipment work, "inside" means internal setup work, and "outside" means external setup work; the same applies to the other figures. Showing the product information, the equipment operating state, the working state, and the classification results as a time chart makes it easy to see how they unfolded over time.
- FIG. 5 is a pie chart showing the work classification results as time ratios.
- FIG. 6 is a table showing the work classification results together with the times and work durations. This table also displays playback icons for playing back the video data; each icon is linked, on the basis of time, with the video data recorded in the video recording unit 13.
- FIG. 7 is a diagram showing an example of the reproduced video. When, for example, the playback icon indicated by the dotted circle 300 in FIG. 6 is clicked, the video shown in FIG. 7 is played back, showing the worker 30 working on the automatic processing machine 50a.
- Playback of the video data is not limited to the table with playback icons shown in FIG. 6; the video data may also be reproduced by clicking an item of information or a classification result. The video data to be played covers, for example, the span from the start to the end of the specified work classification, may additionally include a margin before and after that span, or may be reproduced from an arbitrary time regardless of the work classification.
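Computing the playback span for a selected classification is then just arithmetic on the linked timestamps; the optional margin parameter below corresponds to the "including before and after" case and is an assumed interface.

```python
from datetime import datetime, timedelta

def playback_range(start: datetime, end: datetime, margin_s: float = 0.0):
    """Return (seek_start, seek_end) for one classified work segment,
    optionally widened by a margin before and after."""
    pad = timedelta(seconds=margin_s)
    return start - pad, end + pad
```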
- FIG. 8 is a table showing the work contents, the acquired information, and the classification results in detail. Here too, each item of information or classification result may be associated with video data so that clicking it in the table reproduces the corresponding video.
- The information linked by the linking unit 17 is output on a time basis or as time ratios. These outputs are produced, for example, in response to a request from the terminal 18 and are displayed on the display of the terminal 18.
- FIG. 9 is a flowchart showing the processing procedure of work classification. The server 10 executes the functions of the respective units already described.
- The server 10 identifies the worker 30 (S1).
- The server 10 acquires the operating state of the equipment 50 (S2).
- The server 10 acquires video data from the camera 20 (S3).
- The server 10 acquires the product information (S4).
- The server 10 detects the position and orientation of the worker 30 from the video data (S5).
- The server 10 classifies the work from the obtained information (S6).
- The server 10 links the obtained information, the classification result, and the video data (S7).
- If there is an input to end processing, the server 10 ends this procedure (S8: YES); otherwise (S8: NO), it returns to S1 and continues.
- Steps S1 to S4 may be performed in any order.
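The flowchart maps naturally onto a polling loop. The sketch below is illustrative only: the `server` object and its method names stand in for the functional units described above and are not defined by the patent.

```python
def run_classification_loop(server, stop_requested):
    """Sketch of the procedure of FIG. 9 (S1-S8); S1-S4 may run in any order."""
    while True:
        worker = server.identify_worker()                        # S1
        state = server.acquire_equipment_state()                 # S2
        frames = server.acquire_video()                          # S3
        product = server.acquire_product_info()                  # S4
        position, orientation = server.detect_worker(frames)     # S5
        result = server.classify(state, position, orientation)   # S6
        server.link(worker, state, product, result, frames)      # S7
        if stop_requested():                                     # S8
            break
```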
- As described above, in the present embodiment, the operating state of the equipment 50 and the working state of the worker 30 are acquired, and the setup change work is classified into four categories, internal setup work, external setup work, waste, and absence, according to their combination.
- As a result, work performed while the equipment 50 is in operation, work performed while the equipment 50 is stopped, and work unrelated to the equipment 50 can be distinguished. What was conventionally classified manually by a person can thus be classified automatically, greatly streamlining the classification work.
- In the present embodiment, each item of acquired information, the classification results, and the video data are also synchronized and linked by time. The linked information and classification results can be displayed as time charts, graphs, tables, and the like, and because the related video data is linked as well, the video corresponding to each item of information or classification result can easily be reproduced. The work situation can thus be visualized and improvement analysis made more efficient; setup change work can be shortened, downtime of the equipment 50 reduced, and productivity improved.
- Each function described in the embodiment may also be executed within the camera 20, for example by incorporating a board computer into the camera 20 or integrating one with it. In that case, this embodiment can be implemented simply by installing the camera 20.
- A cloud server on the Internet may also be used. In that case, the equipment 50 and the camera 20 are provided with interfaces capable of connecting to the Internet, and the operating state of the equipment 50 and the video data of the camera 20 are transmitted to the cloud server; the connections may be wireless or wired. An existing Internet-connectable webcam can then serve as the camera 20, so the initial investment can be kept low simply by attaching an Internet interface to the equipment 50. If the operating state of the equipment 50 is acquired from the emission color of the stacked signal lamp 51 in the video data, only the webcam needs to be installed, reducing the initial investment even further.
- Sensors other than the skeleton recognition technology may also be used to detect the position and orientation of the worker 30. For example, LIDAR (Light Detection and Ranging) can be used, as can a technique that determines the direction a person is facing from the temperature of the face using an infrared sensor (thermal camera).
- In the embodiment, the worker identification unit 11 recognizes each worker 30 individually, but if the work classification results need not be associated with individual workers 30, individual identification is unnecessary and the worker identification unit 11 may be omitted. Without the worker identification unit 11, the RFID tags 31, their reading device, the biometric authentication devices, and the like are naturally also unnecessary, so the capital investment cost can be reduced.
- The operating state of the equipment 50 may also be acquired by sound when the equipment 50 has an acoustic alarm that signals operation or abnormal stop by sound. The stacked signal lamp 51, likewise, is not limited to three colors; the scheme can also be applied to two-color and one-color stacked signal lamps 51. A two-color stacked signal lamp 51, for example, shows green for normal operation and red for abnormal stop; green then means operating while red and off mean stopped, and processing may be adapted accordingly.
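Handling lamp variants only changes the color-to-state mapping, not the rest of the pipeline. A hypothetical mapping table consistent with the description above:

```python
# 'off' covers the case where no lamp is lit (equipment stopped).
LAMP_STATE_MAPS = {
    3: {"green": "operating", "yellow": "stopped", "red": "stopped", "off": "stopped"},
    2: {"green": "operating", "red": "stopped", "off": "stopped"},
}

def lamp_to_state(n_colors, color):
    """Map a detected lamp color to an operating state for an n-color lamp."""
    return LAMP_STATE_MAPS[n_colors].get(color, "stopped")
```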
- In the embodiment described above, the second region 42 is provided, but work classification can also be carried out using only the first region 41. In that case, the work is classified as external setup work when the worker 30 is present in the first region 41 but does not face the equipment 50 while the equipment is in operation, and as absence when the equipment is in operation and the worker 30 is not present in the first region 41. The other classifications are the same as those shown in FIG. 3.
- The work classification program according to the present invention can also be realized by a dedicated hardware circuit. The work classification program may be provided on a computer-readable recording medium such as a USB (Universal Serial Bus) memory or a DVD (Digital Versatile Disc)-ROM, or may be provided online via a network such as the Internet without any recording medium. It is usually stored in a magnetic disk device or the like constituting a storage unit, and can be provided as stand-alone application software or incorporated into other software as one of its functions.
Abstract
Provided is a work classification system capable of classifying work into work performed while equipment is in operation and work performed while equipment is stopped. The work classification system (1) comprises an equipment operating state acquisition unit (12) that acquires the operating state of equipment (50), a camera (20) that captures a range including a worker (30), a worker position/orientation determination unit (14) that determines the position and orientation of the worker (30) from the video data captured by the camera (20), and a work classification unit (15) that classifies setup change work based on a combination of the operating state of the equipment (50) and the position and orientation of the worker (30).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021530663A JP7347509B2 (ja) | 2019-07-10 | 2020-07-02 | 作業分類システムおよび作業分類プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-128731 | 2019-07-10 | ||
JP2019128731 | 2019-07-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021006183A1 (fr) | 2021-01-14 |
Family
ID=74115291
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/026072 WO2021006183A1 (fr) | 2019-07-10 | 2020-07-02 | Système de classification de tâches et programme de classification de tâches |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7347509B2 (fr) |
WO (1) | WO2021006183A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024121991A1 (fr) * | 2022-12-07 | 2024-06-13 | ファナック株式会社 | Dispositif de gestion |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009157517A (ja) * | 2007-12-25 | 2009-07-16 | Shibuya Kogyo Co Ltd | 生産管理システム |
JP2012203770A (ja) * | 2011-03-28 | 2012-10-22 | Hitachi Chem Co Ltd | 作業分析システム |
JP2013073279A (ja) * | 2011-09-26 | 2013-04-22 | Omron Corp | データ処理装置、データ処理システム、およびデータ処理方法 |
WO2018138925A1 (fr) * | 2017-01-30 | 2018-08-02 | 三菱電機株式会社 | Dispositif de traitement de données et procédé de traitement de données |
JP2019023803A (ja) * | 2017-07-24 | 2019-02-14 | 株式会社日立製作所 | 作業改善支援システムおよび方法 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020021451A (ja) * | 2018-07-18 | 2020-02-06 | コニカミノルタ株式会社 | 設備稼働率算出システムおよび設備稼働率算出プログラム |
2020
- 2020-07-02 WO PCT/JP2020/026072 patent/WO2021006183A1/fr active Application Filing
- 2020-07-02 JP JP2021530663A patent/JP7347509B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
JPWO2021006183A1 (fr) | 2021-01-14 |
JP7347509B2 (ja) | 2023-09-20 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20836740; Country of ref document: EP; Kind code of ref document: A1
 | ENP | Entry into the national phase | Ref document number: 2021530663; Country of ref document: JP; Kind code of ref document: A
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 20836740; Country of ref document: EP; Kind code of ref document: A1