WO2021006183A1 - Task classification system and task classification program - Google Patents

Task classification system and task classification program

Info

Publication number
WO2021006183A1
WO2021006183A1 (PCT/JP2020/026072)
Authority
WO
WIPO (PCT)
Prior art keywords
work
worker
equipment
area
classification
Prior art date
Application number
PCT/JP2020/026072
Other languages
French (fr)
Japanese (ja)
Inventor
巨樹 松山
Original Assignee
コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority to JP2021530663A (JP7347509B2)
Publication of WO2021006183A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/04: Manufacturing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Definitions

  • The present invention relates to a work classification system and a work classification program.
  • In Patent Document 1, the posture or motion of a worker is determined, and work motions directly related to the work and non-work motions not directly related to the work are extracted from the determined posture or motion. A burden value is then calculated for the extracted motions, and the result is used to present load evaluation information for the non-work motions; this information is used for purposes such as equipment layout, line design, line improvement, and safety improvement.
  • In Patent Document 2, alarm information indicating that a worker is performing non-routine work and the worker's work record information are acquired and associated with each other, and work is allocated to each worker according to the number of workers available for routine work. In this way, work is managed by distinguishing non-routine work from routine work.
  • In Patent Document 3, a first worker image showing the worker at the start of work and a second worker image showing the worker at the end of work are acquired, and the work time required for one work item is measured. The work item is recognized from the second worker image, and information on the implementation status for each worker and work item is displayed based on the measured work time and the recognition result. The worker is thus identified from images, the working time is acquired, and the work item is recognized, so the worker does not need to input information by operating a terminal and a decrease in work efficiency can be avoided.
  • In Patent Document 3, however, the start and end of work are determined from images taken when the worker enters or leaves the work area, so work can be classified only at the granularity of entering and leaving the work area. Even in this technique, therefore, classification according to the operating state of the equipment is not performed.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a work classification system and a work classification program capable of classifying work performed while the equipment is in operation and work performed while the equipment is stopped.
  • (1) A work classification system having: an equipment operating state acquisition unit that acquires the operating state of equipment; a camera that captures a range including a worker; a worker position/orientation determination unit that determines the position and orientation of the worker from video data captured by the camera; and a work classification unit that classifies setup change work from the combination of the operating state of the equipment and the position and orientation of the worker.
  • (2) The work classification system according to (1) above, wherein the setup change work is work performed when one product is produced or processed; a predetermined range within the shooting range of the camera and in the vicinity of the equipment is set as a first area; and the work classification unit classifies the setup change work as internal setup work when the worker is in the first area and facing the equipment while the equipment is stopped, as external setup work when the worker is in the first area but not facing the equipment while the equipment is in operation, as waste when the worker is in the first area but not facing the equipment or is outside the first area while the equipment is stopped, and as absence when the worker is outside the first area while the equipment is in operation.
  • (3) The work classification system according to (1) above, wherein the setup change work is work performed when one product is produced or processed; a predetermined range within the shooting range of the camera and in the vicinity of the equipment is set as a first area, and a range outside the first area is set as a second area; and the work classification unit classifies the setup change work as internal setup work when the worker is in the first area and facing the equipment while the equipment is stopped, as external setup work when, while the equipment is in operation, the worker is in the first area but not facing the equipment or is in the second area, as waste when the worker is in the first area but not facing the equipment or is outside the first area while the equipment is stopped, and as absence when the worker is outside both the first area and the second area while the equipment is in operation.
  • (5) The work classification system according to any one of (1) to (4) above, wherein the worker position/orientation determination unit extracts coordinate values of predetermined joints or of the skeleton of the worker from the video data and determines the position and orientation of the worker from the coordinate values.
  • (6) The work classification system according to any one of (1) to (5) above, further having a video recording unit that records the video data and a linking unit that links the classification results produced by the work classification unit with the video data.
  • (7) The work classification system according to (6) above, wherein the linking unit further links the operating state acquired by the equipment operating state acquisition unit and the position and orientation of the worker determined by the worker position/orientation determination unit to the classification results and the video data.
  • (9) The work classification program according to (8) above, wherein the setup change work is work performed when one product is produced or processed, and in the step (c), a predetermined range in the vicinity of the equipment within the captured range of the video data is set as a first area, and the setup change work is classified as internal setup work when the worker is in the first area and facing the equipment while the equipment is stopped, as external setup work when the worker is in the first area but not facing the equipment while the equipment is in operation, as waste when the worker is in the first area but not facing the equipment or is outside the first area while the equipment is stopped, and as absence when the worker is outside the first area while the equipment is in operation.
  • (10) The work classification program according to (8) above, wherein the setup change work is work performed when one product is produced or processed, and in the step (c), a predetermined range in the vicinity of the equipment within the captured range of the video data is set as a first area and a range outside the first area is set as a second area, and the setup change work is classified as internal setup work when the worker is in the first area and facing the equipment while the equipment is stopped, as external setup work when, while the equipment is in operation, the worker is in the first area but not facing the equipment or is in the second area, as waste when the worker is in the first area but not facing the equipment or is outside the first area while the equipment is stopped, and as absence when the worker is outside both the first area and the second area while the equipment is in operation.
  • (12) The work classification program according to any one of (8) to (11) above, wherein in the step (b), coordinate values of predetermined joints or of the skeleton of the worker are extracted from the video data, and the position and orientation of the worker are determined from the coordinate values.
  • (14) The work classification program according to (13) above, wherein the step (g) further links the operating state and the position and orientation of the worker to the classification results and the video data.
  • According to the present invention, the operating state of the equipment is acquired, the position and orientation of the worker are determined, and setup change work is classified from the combination. Work performed while the equipment is in operation and work performed while the equipment is stopped can thus be distinguished.
  • FIG. 1 is an explanatory diagram illustrating the functional configuration of the work classification system of the embodiment.
  • The work classification system 1 has a server 10, a camera 20 connected to the server 10, and a terminal 18 connected to the server 10.
  • The server 10 has, as its functions, a worker identification unit 11, an equipment operating state acquisition unit 12, a video recording unit 13, a worker position/orientation determination unit 14, a work classification unit 15, a product information acquisition unit 16, and a linking unit 17.
  • The server 10 is a computer and has a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), HDD (Hard Disk Drive), and interfaces (IF) for external connection. The server 10 also has input devices such as a keyboard and mouse and output devices such as a display.
  • In the server 10, each function is realized by the CPU reading and executing programs stored in the ROM or HDD.
  • The interfaces of the server 10 communicate with the equipment 50, the camera 20, and the terminal 18. They include, for example, network interfaces according to standards such as Ethernet (registered trademark), SATA, PCI Express, USB, and IEEE 1394, and various local connection interfaces, including wireless communication interfaces such as Bluetooth (registered trademark) and IEEE 802.11.
  • An interface may also be a line for a local analog connection with the equipment 50 and the camera 20, and the connection to the terminal 18 may use a wireless LAN (Wi-Fi) or a mobile phone line.
  • The terminal 18 is a computer; for example, a personal computer (PC), a tablet, or a smartphone is used. These are computers used by workers, managers, and other stakeholders, and they have input and output devices such as keyboards, mice, touch panels, and displays.
  • The worker identification unit 11 individually recognizes each worker 30 who enters the work area (described in detail later). The worker 30 is identified, for example, using an RFID (Radio Frequency Identification) tag 31 or using biometric authentication such as face authentication, iris authentication, or fingerprint authentication.
  • When the RFID tag 31 is used, the worker 30 carries it or it is attached to work clothes, a hat, a helmet, or the like. A reading device for reading the RFID tag 31 is installed in the work area, and the worker identification unit 11 has the reading device read the RFID tag 31 to acquire identification information for each worker. Similarly, when biometric authentication such as face or iris recognition is used, the worker identification unit 11 identifies the worker 30 using the corresponding authentication sensors and cameras.
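As a concrete illustration, a minimal sketch in Python of the kind of tag-to-worker lookup the worker identification unit 11 could perform; the tag IDs and worker IDs below are hypothetical, since the publication does not specify an implementation.

```python
# Hypothetical tag-to-worker registry; real IDs would come from the reader.
WORKER_REGISTRY = {
    "04:A1:B2:C3": "worker_030",
    "04:D4:E5:F6": "worker_031",
}

def identify_worker(tag_id):
    """Return the worker ID for an RFID tag, or None for unknown tags."""
    return WORKER_REGISTRY.get(tag_id)

# Example: a reader event arrives and is resolved to a worker.
print(identify_worker("04:A1:B2:C3"))  # -> worker_030
```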
  • The equipment operating state acquisition unit 12 acquires the operating state of the equipment 50. The operating states of the equipment 50 include, for example, operation (normal running for production), completion stop (machining has finished and the machine has stopped), abnormal stop (an abnormality has occurred), and idle stop (the equipment 50 is not being used).
  • The equipment operating state acquisition unit 12 acquires the operating state, for example, by detecting the emission color of a stacked signal lamp 51 provided on the equipment 50 or by detecting a switching signal. When the emission color or the switching signal is detected, a dedicated sensor (not shown) is used, and the equipment operating state acquisition unit 12 (server 10) is connected to that sensor.
  • The stacked signal lamp 51, also called a three-color stack light or signal tower, typically lights green, yellow, and red lamps according to the operating state of the equipment 50: usually green during normal operation, yellow when stopped because production is complete, and red when an abnormality occurs. When the equipment 50 outputs its operating state as a signal, the equipment operating state acquisition unit 12 may acquire that signal instead; in this case, the equipment operating state acquisition unit 12 (server 10) is connected to the equipment 50.
  • Alternatively, the equipment operating state acquisition unit 12 may acquire the operating state from the video data of the camera 20. In this case, the stacked signal lamp 51 is placed within the shooting range of the camera 20, and the equipment operating state acquisition unit 12 detects the emission color of the stacked signal lamp 51 from the video data, as sketched below.
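A minimal sketch of such color-based detection, assuming OpenCV, a fixed lamp position in the frame, and per-site calibrated HSV thresholds; the ROI coordinates and thresholds are hypothetical.

```python
import cv2
import numpy as np

LAMP_ROI = (600, 40, 30, 90)  # hypothetical (x, y, w, h) of the stack light in the frame

# Approximate OpenCV hue bands; red also wraps around hue 180 and would need a
# second band in practice, omitted here for brevity.
HUE_RANGES = {"green": (45, 90), "yellow": (20, 40), "red": (0, 10)}

def detect_lamp_state(frame):
    """Return 'green', 'yellow', 'red', or 'off' for one BGR video frame."""
    x, y, w, h = LAMP_ROI
    hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    best, best_count = "off", 50  # require a minimum number of lit pixels
    for color, (lo, hi) in HUE_RANGES.items():
        # Count bright, saturated pixels whose hue falls in this color's band.
        mask = cv2.inRange(hsv, (lo, 80, 80), (hi, 255, 255))
        count = int(np.count_nonzero(mask))
        if count > best_count:
            best, best_count = color, count
    return best
```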
  • The video recording unit 13 records the video data from the camera 20. The video recording unit 13 is, for example, the HDD in the server 10, and the video data is recorded continuously over time.
  • The worker position/orientation determination unit 14 detects the position and orientation of the worker 30 from the video data. The camera 20 has a shooting range 25 that includes the work area in which the worker 30 works on the equipment 50; as described above, when the operating state of the equipment 50 is detected from the video data, the stacked signal lamp 51 is also included in the shooting range 25.
  • Existing technology can be used to detect the position and orientation of the worker 30. In the present embodiment, a technology that recognizes posture by estimating the human skeleton from two-dimensional video data is used (hereinafter, skeleton recognition technology), such as OpenPose (see https://github.com/CMU-Perceptual-Computing-Lab/openpose) or DeepPose (https://www.slideshare.net/mitmul/deeppose-human-pose-estimation-via-deep-neural-networks).
  • As skeleton recognition technology, a technology using a depth camera (RGB-D camera) or TOF (Time of Flight) sensor, such as Microsoft's Kinect (registered trademark), can also be used.
  • Skeleton recognition technology can recognize a person's posture from two-dimensional video data, so a general video camera can be used as the camera 20 and the system configuration is simplified.
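For orientation, a sketch of how skeleton output might be consumed. It assumes keypoints in an OpenPose/COCO-style layout, one (x, y, confidence) triple per joint; the indices and confidence threshold are illustrative assumptions, not a guaranteed library API.

```python
# COCO-style joint indices used by several pose estimators (illustrative).
NOSE, NECK, R_WRIST, L_WRIST = 0, 1, 4, 7

def get_joint(keypoints, index, min_conf=0.3):
    """Return (x, y) for one joint, or None if the detector was not confident."""
    x, y, conf = keypoints[index]
    return (x, y) if conf >= min_conf else None

# Example with one detected person: keypoints is a list of (x, y, conf) triples.
person = [(310, 120, 0.9), (312, 160, 0.8), (0, 0, 0.0), (0, 0, 0.0),
          (350, 200, 0.7), (0, 0, 0.0), (0, 0, 0.0), (280, 205, 0.6)]
neck = get_joint(person, NECK)            # (312, 160)
right_wrist = get_joint(person, R_WRIST)  # (350, 200)
```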
  • FIGS. 2a and 2b are explanatory diagrams illustrating the position and orientation of the worker 30 with respect to the equipment 50.
  • FIG. 2a is an example in which the equipment 50 is installed independently, such as an automatic processing machine 50a.
  • FIG. 2b shows an example in which the equipment 50 is a production line 50b for assembly line work.
  • As shown in FIGS. 2a and 2b, a first region 41 and a second region 42 are set within the shooting range 25.
  • The first area 41 is a predetermined range in the vicinity of the equipment, for example, a range in which the worker 30 can directly work on the equipment 50 or on the product set in the equipment 50, such as a range within which the worker 30 can reach the automatic processing machine 50a, the production line 50b, or the products set in them by stretching out a hand.
  • The second region 42 is a region outside the first region 41; the two regions are set so as not to overlap.
  • The shooting range 25 includes at least the first area 41 and the second area 42; in FIGS. 2a and 2b, the area outside the second region 42 is also captured.
  • The worker position/orientation determination unit 14 determines from the video data whether the worker 30 is in the first area 41, is in the second area 42, or is in neither (absent). The position of a person in the video is specified by three-dimensional coordinate values; the three-dimensional coordinate system and its origin are set in advance, and coordinate values indicating the ranges of the first region 41 and the second region 42 are likewise set in advance in that coordinate system.
  • When the equipment 50 is an automatic processing machine 50a, the first area 41 is set as the range in which the worker 30 can access the working surfaces on the outside of the automatic processing machine 50a, such as the operation panel, outer door, raw material supply port, and access panel (range a in the figure). The second region 42 is set further outward from the outer peripheral edge of the first region 41 (range b in the figure).
  • When the equipment 50 is a production line 50b, a range at a certain distance from and parallel to the production line 50b is set as the first region 41 (range a in the figure); the certain distance is a range within which the worker 30 can access the work on the production line. The second region 42 is again set further outward from the outer peripheral edge of the first region 41 (range b in the figure).
  • The second area 42 is preferably a range from which the worker 30 can immediately move to work, for example a range of about 1 m from the outer peripheral edge of the first region 41. These ranges for the first region 41 and the second region 42 are merely examples and can be set arbitrarily.
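A minimal sketch of the region test, assuming rectangular regions on a floor-plane coordinate system; the coordinates are hypothetical and, as noted above, would be set per installation.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Hypothetical ranges in meters: "a" within reach of the line, "b" about 1 m outside.
FIRST_AREA = Rect(0.0, 0.0, 4.0, 1.5)
SECOND_AREA = Rect(-1.0, -1.0, 5.0, 2.5)

def locate_worker(x: float, y: float) -> str:
    """Map a worker's floor position to 'first', 'second', or 'absent'."""
    if FIRST_AREA.contains(x, y):
        return "first"
    if SECOND_AREA.contains(x, y):  # outside range a but inside range b
        return "second"
    return "absent"
```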
  • The worker position/orientation determination unit 14 first detects the position of the worker 30. It extracts the coordinate values of predetermined joints from the skeleton of the worker 30 obtained with the skeleton recognition technology and determines in which region the worker 30 is present, or that the worker 30 is present in neither. The predetermined joints are, for example, the neck, shoulders, elbows, wrists, hips, knees, and ankles, and may be determined in advance.
  • When the worker 30 straddles the boundary between the first region 41 and the second region 42, the worker 30 is treated, for example, as being present in the region that contains the predetermined joints. In this way, it is detected whether the worker 30 is present in one of the regions or in neither. Coordinate values of the skeleton provided by the skeleton recognition technology may be used instead of joint coordinate values.
  • The worker position/orientation determination unit 14 then detects the orientation of the worker 30. Orientation detection is executed only for workers 30 present in the first region 41.
  • The worker position/orientation determination unit 14 determines that the direction in which an arm (particularly the wrist joint) protrudes relative to the joint position of the neck or shoulder is the direction the person is facing. The worker 30a shown by the solid line is determined to be facing the equipment 50 because both arms protrude in the direction of the equipment 50, whereas the worker 30b shown by the dotted line is determined not to be facing the equipment 50 because both arms point in a direction other than the equipment 50 (diagonally backward in the figure).
  • If at least one arm protrudes in the direction of the equipment 50 relative to the neck or shoulder, the worker 30 is determined to be facing the equipment 50; both arms may protrude toward the equipment 50, or only one. This is because work on the operation panel, such as changing a machining program setting or restarting, can be performed with one hand. Therefore, in the present embodiment, when at least one arm faces the equipment 50, the worker 30 is determined to be facing the equipment 50.
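A sketch of this arm-direction rule, assuming joint positions from the skeleton recognition step and a known direction from the worker toward the equipment 50; the 60-degree tolerance is an assumed parameter, not a value from the publication.

```python
import math

def arm_toward(neck, wrist, equipment_dir, max_angle_deg=60.0):
    """True if the arm vector (neck -> wrist) points roughly along equipment_dir."""
    ax, ay = wrist[0] - neck[0], wrist[1] - neck[1]
    ex, ey = equipment_dir
    norm = math.hypot(ax, ay) * math.hypot(ex, ey)
    if norm == 0.0:
        return False
    cos_angle = max(-1.0, min(1.0, (ax * ex + ay * ey) / norm))
    return math.degrees(math.acos(cos_angle)) <= max_angle_deg

def facing_equipment(neck, wrists, equipment_dir):
    """One arm toward the equipment is enough, since one-handed work counts."""
    return any(w is not None and arm_toward(neck, w, equipment_dir)
               for w in wrists)
```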
  • When there are a plurality of workers 30, the worker position/orientation determination unit 14 detects the position of each worker 30 and, for those workers 30 present in the first area 41, further determines the orientation of each.
  • A two-dimensional coordinate system may also be used to determine the position and orientation of the worker. In this case, the camera 20 is installed at a position from which it photographs the worker 30 from above (possibly diagonally above). A two-dimensional coordinate system is set in the video data, the first region 41 and the second region 42 are set within it, and the worker position/orientation determination unit 14 determines whether the worker 30 is present in each area in the same manner as described above.
  • The work classification unit 15 classifies the work being performed on the equipment 50 from the operating-state information of the equipment 50 provided by the equipment operating state acquisition unit 12 and the state (position and orientation) information of the worker 30 provided by the worker position/orientation determination unit 14.
  • FIG. 3 is a classification judgment table for classifying work.
  • The work classification unit 15 first classifies the state of the worker 30, from the position and orientation of the worker 30, into three categories: equipment-facing work, non-equipment work, and absence.
  • Equipment-facing work is the case in which a worker 30 present in the first area 41 is facing the equipment 50.
  • Non-equipment work covers the case in which a worker 30 present in the first area 41 is not facing the equipment 50 and the case in which a worker 30 is present in the second area 42. In non-equipment work, a worker 30 is near the equipment 50 and, although not working on the equipment 50, can start working on it immediately; because such work is possible, this state is classified separately from absence.
  • Absence is the case in which the worker 30 is present in neither the first area 41 nor the second area 42.
  • The work classification unit 15 then classifies the work from the combination of the operating state of the equipment 50 provided by the equipment operating state acquisition unit 12 and the working state of the worker 30.
  • The work to be classified is the setup change work: the work performed after production or processing of one product is completed, through changing the jigs and tools of the equipment 50 in order to produce or process the next product, until production or processing of that product is completed. Jig and tool changes include changing jigs and tools, changing the equipment programs required for production or machining, and replacing or installing the additional parts required for the product to be produced or machined.
  • The setup change work is classified into the following four categories: internal setup work, external setup work, waste, and absence.
  • Internal setup work is work performed on the equipment 50 while the equipment 50 is stopped. It includes, for example, replacing jigs and tools, changing the equipment program, installing additional parts, and setting a pre-processed product in the equipment 50. These operations are performed while the equipment 50 is stopped, so the combination of a stopped state (yellow, red, or lights off) and equipment-facing work is classified as internal setup work.
  • External setup work is work performed while the equipment 50 is in operation. It includes preparatory work such as readying the jigs, tools, and additional parts for producing or processing the next product, as well as carrying out finished products and cleaning and tidying around the equipment 50. It is efficient to perform these operations while the equipment 50 is in operation, that is, while the previous product is being produced or processed. Therefore, the combination of an operating state (green) and non-equipment work is classified as external setup work.
  • Waste is the state in which work is being performed on something other than the equipment 50 even though the equipment 50 is stopped, for example searching for the jigs and tools needed for the next product or going to fetch additional parts. Therefore, non-equipment work while the equipment is stopped (yellow, red, or off) is classified as waste.
  • Absence is the case in which the worker 30 is present in neither the first area 41 nor the second area 42 while the equipment is in operation. While the equipment is operating, the worker 30 normally does not need to be near the equipment 50, so this is a normal state and is simply classified as absence.
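This four-way decision can be written directly as a small function. The sketch below follows the combinations described above and in the table of FIG. 3; combinations the text does not spell out (for example, equipment-facing work while the equipment is running) are returned as 'unclassified' rather than guessed.

```python
def classify_setup_work(equipment_running: bool, worker_state: str) -> str:
    """worker_state is 'equipment_work', 'non_equipment_work', or 'absent'."""
    if not equipment_running and worker_state == "equipment_work":
        return "internal_setup"   # stopped + working on the equipment
    if equipment_running and worker_state == "non_equipment_work":
        return "external_setup"   # running + preparatory work nearby
    if not equipment_running and worker_state in ("non_equipment_work", "absent"):
        return "waste"            # stopped, but not working on the equipment
    if equipment_running and worker_state == "absent":
        return "absence"          # running, worker not needed nearby
    return "unclassified"
```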
  • The work classification unit 15 also records the start and end times of each classified work. Each classified work starts at the time the classification is made and ends at the time the classification switches to the next one. Specifically, the times are recorded at the following moments: when the orientation of the worker 30 changes, the time of the change is recorded; when the worker 30 enters the first area 41 from the second area 42, or moves out of the second area 42, that time is recorded; when an absent worker 30 enters the second area 42, that time is recorded; and when there is no change in the position or orientation of the worker 30, the time at which the operating state of the equipment 50 changed is recorded.
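A sketch of turning per-sample classifications into timed work records, under the assumption that the classification is sampled at regular intervals (per frame or per poll); an interval starts when a label first appears and ends when it switches.

```python
def segment_intervals(samples):
    """samples: time-ordered (timestamp, label) pairs.
    Returns (label, start, end) tuples, one per uninterrupted stretch."""
    intervals = []
    current = start = previous_ts = None
    for ts, label in samples:
        if label != current:
            if current is not None:
                intervals.append((current, start, ts))
            current, start = label, ts
        previous_ts = ts
    if current is not None:
        intervals.append((current, start, previous_ts))
    return intervals

# Example: two seconds of internal setup, then external setup.
print(segment_intervals([(0, "internal_setup"), (1, "internal_setup"),
                         (2, "external_setup"), (3, "external_setup")]))
# [('internal_setup', 0, 2), ('external_setup', 2, 3)]
```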
  • The work classification unit 15 records the classification results in association with each worker 30 identified by the worker identification unit 11. The classification results associated with each worker 30 are recorded, for example, in the HDD of the server 10.
  • The product information acquisition unit 16 acquires product information, that is, information on the products produced and processed by the equipment 50. The product information is acquired, for example, when a product arrives at the equipment 50 before processing, and is recorded together with the time; the time is also recorded when processing of the product is completed. The product information may be read from, for example, a barcode or RFID tag attached to the product, may be input by the worker 30 from the equipment 50 or from a terminal installed near the equipment 50, or may be retrieved in order from a product list stored in the server 10 or another computer.
  • The linking unit 17 links (associates) the worker information from the worker identification unit 11, the operating state of the equipment 50 from the equipment operating state acquisition unit 12 (referred to as equipment operation information), the product information from the product information acquisition unit 16, the work classification results from the work classification unit 15, and the video data recorded in the video recording unit 13. The linking unit 17 links each piece of information, the classification results, and the video data on the basis of time.
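A sketch of time-based linking, assuming the video recorder keeps a sorted index of (timestamp, frame offset) pairs; looking up a classified interval then yields the frames to play back. The index structure is an assumption for illustration.

```python
import bisect

def frames_for_interval(interval, video_index):
    """interval: (label, start, end); video_index: sorted (timestamp, offset) pairs.
    Returns the frame offsets recorded between start and end."""
    _label, start, end = interval
    times = [t for t, _ in video_index]
    lo = bisect.bisect_left(times, start)
    hi = bisect.bisect_right(times, end)
    return [offset for _, offset in video_index[lo:hi]]

# Example: an internal-setup interval from t=2 to t=3 maps to two frames.
index = [(0, 0), (1, 1024), (2, 2048), (3, 3072), (4, 4096)]
print(frames_for_interval(("internal_setup", 2, 3), index))  # [2048, 3072]
```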
  • FIG. 4 is a time chart displaying each piece of information and the classification results. In the figure, "operating" indicates that the equipment 50 is in operation, "equipment work" indicates equipment-facing work, "non-equipment" indicates non-equipment work, "internal" indicates internal setup work, and "external" indicates external setup work; the same notation applies to the other figures. Displaying the product information, equipment operating state, working state, and classification results as a time chart makes it easy to see how they unfolded over time.
  • FIG. 5 is a pie chart showing the work classification results as time ratios.
  • FIG. 6 is a table showing the work classification results together with the times and work durations.
  • This table also displays playback icons for playing back the video data; each playback icon is linked, on the basis of time, with the video data recorded in the video recording unit 13.
  • FIG. 7 is a diagram showing an example of a reproduced video. For example, when the playback icon indicated by the dotted circle 300 in the figure is clicked, the video shown in FIG. 7 is played back, showing the worker 30 working on the automatic processing machine 50a.
  • Playback of the video data is not limited to playback from a table with playback icons as shown in FIG. 6; the video data may also be played back by clicking any piece of information or any classification result.
  • The video data played back covers, for example, the period from the start to the end of the designated work classification; it may also include some time before and after that period, or it may be played back from an arbitrary time regardless of the work classification.
  • FIG. 8 is a table showing the work contents, acquired information, and classification results in detail.
  • In FIG. 8 as well, each piece of information and each classification result may be associated with the video data, and the video data may be played back by clicking the corresponding entry in the table.
  • In this way, each piece of information linked by the linking unit 17 is output on a time basis or as time ratios. These outputs are produced, for example, in response to a request from the terminal 18 and are displayed on the display of the terminal 18.
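A sketch of the aggregation behind such outputs: summing the classified intervals per category gives the totals and ratios shown in the pie chart of FIG. 5. Timestamps are assumed to be numeric seconds for simplicity.

```python
from collections import defaultdict

def time_ratios(intervals):
    """intervals: (label, start, end) tuples with numeric timestamps in seconds.
    Returns {label: (total_seconds, share_of_total)}."""
    totals = defaultdict(float)
    for label, start, end in intervals:
        totals[label] += end - start
    grand_total = sum(totals.values()) or 1.0  # avoid division by zero
    return {label: (secs, secs / grand_total) for label, secs in totals.items()}

print(time_ratios([("internal_setup", 0, 120), ("external_setup", 120, 180)]))
# {'internal_setup': (120.0, 0.666...), 'external_setup': (60.0, 0.333...)}
```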
  • FIG. 9 is a flowchart showing the processing procedure of work classification. In this procedure, the server 10 executes the functions of the respective units already described.
  • The server 10 identifies the worker 30 (S1).
  • The server 10 acquires the operating state of the equipment 50 (S2).
  • The server 10 acquires the video data from the camera 20 (S3).
  • The server 10 acquires the product information (S4). Steps S1 to S4 may be performed in any order.
  • The server 10 detects the position and orientation of the worker 30 from the video data (S5).
  • The server 10 classifies the work from the obtained information (S6).
  • The server 10 links the obtained information, the classification results, and the video data (S7).
  • If there is an input to end the processing, the server 10 ends this procedure (S8: YES). Otherwise (S8: NO), the server 10 returns to S1 and continues the processing.
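Putting the steps together, a sketch of the loop in FIG. 9; the `server` object and its methods are hypothetical stand-ins for the units described above, not an API from the publication.

```python
import time

def run_classification_loop(server, poll_interval=1.0):
    """One pass per iteration: S1-S4 gather inputs (in any order), S5 determines
    position/orientation, S6 classifies, S7 links, S8 checks for termination."""
    while not server.stop_requested():                      # S8
        worker = server.identify_worker()                   # S1
        equipment_state = server.acquire_equipment_state()  # S2
        frame = server.acquire_frame()                      # S3
        product = server.acquire_product_info()             # S4
        position, orientation = server.determine_position_orientation(frame)  # S5
        label = server.classify(equipment_state, position, orientation)       # S6
        server.link(worker, equipment_state, product, label, frame)           # S7
        time.sleep(poll_interval)
```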
  • As described above, in the present embodiment, the operating state of the equipment 50 and the working state of the worker 30 are acquired, and the setup change work is classified into four categories according to their combination: internal setup work, external setup work, waste, and absence.
  • This makes it possible to distinguish work performed while the equipment 50 is in operation, work performed while the equipment 50 is stopped, and work unrelated to the equipment 50. What was conventionally classified manually can therefore be classified automatically, greatly streamlining the classification work.
  • In the present embodiment, each piece of acquired information, the classification results, and the video data are synchronized and linked by time, so the linked information and classification results can be displayed as time charts, graphs, tables, and the like, and because the related video data is also linked, the video corresponding to each piece of information or classification result can easily be played back.
  • As a result, the work situation can be visualized and improvement analysis made more efficient, so the setup change work can be shortened, the downtime of the equipment 50 reduced, and productivity improved.
  • The functions described in the embodiment may also be executed inside the camera 20, for example by incorporating a board PC into the camera 20 or integrating one with it; the embodiment can then be deployed simply by installing the camera 20.
  • A cloud server on the Internet may also be used. In this case, the equipment 50 and the camera 20 are provided with interfaces that can connect to the Internet, and the operating state of the equipment 50 and the video data of the camera 20 are transmitted to the cloud server; the connections may be wireless or wired.
  • With this configuration, the initial investment can be kept low, because an existing Internet-capable webcam can be used as the camera 20 and only an Internet interface needs to be added to the equipment 50. Moreover, when the operating state of the equipment 50 is acquired from the emission color of the stacked signal lamp 51 in the video data, only the webcam needs to be installed, reducing the initial investment further.
  • Sensors other than skeleton recognition technology may also be used to detect the position and orientation of the worker 30, for example LIDAR (Light Detection and Ranging), or a technique that determines the direction a person is facing from the temperature of the person's face using an infrared sensor (thermal camera).
  • In the embodiment, the worker identification unit 11 recognizes each worker 30 individually, but if the work classification results do not need to be associated with individual workers 30, individual identification is unnecessary and the worker identification unit 11 may be omitted. Without the worker identification unit 11, the RFID tag 31, its reading device, the biometric authentication devices, and the like are naturally unnecessary, so the capital investment cost can be reduced.
  • When the equipment 50 is fitted with an acoustic alarm that signals operation or an abnormal stop by sound, the operating state of the equipment 50 may be acquired from the sound.
  • The stacked signal lamp 51 is not limited to three colors; two-color and one-color stacked signal lamps 51 and the like can also be handled. A two-color stacked signal lamp 51, for example, shows green for normal operation and red for an abnormal stop; in that case, green is treated as operating and red or off as stopped, and the processing is performed accordingly.
  • In the embodiment described above, the second region 42 is provided, but work classification can also be carried out using only the first region 41. In that case, the work is classified as external setup work when the equipment is in operation and the worker 30 is present in the first area 41 but not facing the equipment 50, and as absence when the equipment is in operation and the worker 30 is not present in the first area 41. The other classifications are the same as those shown in FIG. 3.
  • The work classification program according to the present invention can also be realized by a dedicated hardware circuit. The work classification program may be provided on a computer-readable recording medium such as a USB (Universal Serial Bus) memory or a DVD (Digital Versatile Disc)-ROM, or may be provided online via a network such as the Internet without a recording medium; when provided online, the program is usually stored in a magnetic disk device or the like constituting a storage unit. The work classification program can be provided as standalone application software or can be incorporated into other software as one of its functions.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Economics (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • General Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Automation & Control Theory (AREA)
  • General Factory Administration (AREA)

Abstract

Provided is a task classification system capable of classifying tasks into those performed while a facility is in operation and those performed while the facility is stopped. A task classification system (1) comprises: a facility operating status acquisition unit (12) that acquires the operating status of a facility (50); a camera (20) that captures a range including a worker (30); a worker position/orientation determination unit (14) that determines the position and orientation of the worker (30) from video data captured by the camera (20); and a task classification unit (15) that classifies changeover tasks according to the combination of the operating status of the facility (50) and the position and orientation of the worker (30).

Description

Work classification system and work classification program
The present invention relates to a work classification system and a work classification program.
At production sites, the need for high-mix, low-volume production toward mass customization is increasing. High-mix, low-volume production involves frequent setup change work, and setup change work causes work losses; how to suppress work loss during setup changes is therefore an issue in high-mix, low-volume production.
In order to reduce work loss, it is first necessary to classify what kind of work is being performed at the production site.
Conventional technologies include, for example, the following.
In Patent Document 1, the posture or motion of a worker is determined, and work motions directly related to the work and non-work motions not directly related to the work are extracted from the determined posture or motion. A burden value is then calculated for the extracted motions, and the result is used to present load evaluation information for the non-work motions; this information is used for purposes such as equipment layout, line design, line improvement, and safety improvement.
In Patent Document 2, alarm information indicating that a worker is performing non-routine work and the worker's work record information are acquired and associated with each other, and work is allocated to each worker according to the number of workers available for routine work. In this way, work is managed by distinguishing non-routine work from routine work.
In Patent Document 3, a first worker image showing the worker at the start of work and a second worker image showing the worker at the end of work are acquired, and the work time required for one work item is measured. The work item is recognized from the second worker image, and information on the implementation status for each worker and work item is displayed based on the measured work time and the recognition result. The worker is thus identified from images, the working time is acquired, and the work item is recognized, so the worker does not need to input information by operating a terminal and a decrease in work efficiency can be avoided.
Patent Document 1: JP-A-2017-68431; Patent Document 2: JP-A-2011-216014; Patent Document 3: JP-A-2015-225630
However, the technique of Patent Document 1 only classifies work into work motions directly related to the work and non-work motions not related to it. It cannot classify work according to the operating state of the equipment, such as work that requires the equipment to be stopped versus work that can be performed without stopping it, as in setup change work.
In the technique of Patent Document 2, the occurrence of non-routine work is recognized from alarm information issued by the worker, so classification into non-routine and routine work is not automated. Here too, no classification according to the operating state of the equipment is performed.
In the technique of Patent Document 3, the start and end of work are determined from images taken when the worker enters or leaves the work area, so work can be classified only at the granularity of entering and leaving the work area. Even in this technique, classification according to the operating state of the equipment is not performed.
The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a work classification system and a work classification program capable of classifying work performed while the equipment is in operation and work performed while the equipment is stopped.
The above object of the present invention is achieved by the following means.
(1) A work classification system having: an equipment operating state acquisition unit that acquires the operating state of equipment; a camera that captures a range including a worker; a worker position/orientation determination unit that determines the position and orientation of the worker from video data captured by the camera; and a work classification unit that classifies setup change work from the combination of the operating state of the equipment and the position and orientation of the worker.
(2) The work classification system according to (1) above, wherein the setup change work is work performed when one product is produced or processed; a predetermined range within the shooting range of the camera and in the vicinity of the equipment is set as a first area; and the work classification unit classifies the setup change work as internal setup work when the worker is in the first area and facing the equipment while the equipment is stopped, as external setup work when the worker is in the first area but not facing the equipment while the equipment is in operation, as waste when the worker is in the first area but not facing the equipment or is outside the first area while the equipment is stopped, and as absence when the worker is outside the first area while the equipment is in operation.
(3) The work classification system according to (1) above, wherein the setup change work is work performed when one product is produced or processed; a predetermined range within the shooting range of the camera and in the vicinity of the equipment is set as a first area, and a range outside the first area is set as a second area; and the work classification unit classifies the setup change work as internal setup work when the worker is in the first area and facing the equipment while the equipment is stopped, as external setup work when, while the equipment is in operation, the worker is in the first area but not facing the equipment or is in the second area, as waste when the worker is in the first area but not facing the equipment or is outside the first area while the equipment is stopped, and as absence when the worker is outside both the first area and the second area while the equipment is in operation.
(4) The work classification system according to any one of (1) to (3) above, further having a worker identification unit that identifies the worker, wherein the work classification unit associates the classification results with each worker identified by the worker identification unit.
(5) The work classification system according to any one of (1) to (4) above, wherein the worker position/orientation determination unit extracts coordinate values of predetermined joints or of the skeleton of the worker from the video data and determines the position and orientation of the worker from the coordinate values.
(6) The work classification system according to any one of (1) to (5) above, further having a video recording unit that records the video data and a linking unit that links the classification results produced by the work classification unit with the video data.
(7) The work classification system according to (6) above, wherein the linking unit further links the operating state acquired by the equipment operating state acquisition unit and the position and orientation of the worker determined by the worker position/orientation determination unit to the classification results and the video data.
(8) A work classification program for causing a computer to execute: a step (a) of acquiring the operating state of equipment; a step (b) of determining the position and orientation of a worker from video data capturing a range including the worker; and a step (c) of classifying setup change work from the combination of the operating state of the equipment and the position and orientation of the worker.
(9) The work classification program according to (8) above, wherein the setup change work is work performed when one product is produced or processed, and in the step (c), a predetermined range in the vicinity of the equipment within the captured range of the video data is set as a first area, and the setup change work is classified as internal setup work when the worker is in the first area and facing the equipment while the equipment is stopped, as external setup work when the worker is in the first area but not facing the equipment while the equipment is in operation, as waste when the worker is in the first area but not facing the equipment or is outside the first area while the equipment is stopped, and as absence when the worker is outside the first area while the equipment is in operation.
(10) The work classification program according to (8) above, wherein the setup change work is work performed when one product is produced or processed, and in the step (c), a predetermined range in the vicinity of the equipment within the captured range of the video data is set as a first area and a range outside the first area is set as a second area, and the setup change work is classified as internal setup work when the worker is in the first area and facing the equipment while the equipment is stopped, as external setup work when, while the equipment is in operation, the worker is in the first area but not facing the equipment or is in the second area, as waste when the worker is in the first area but not facing the equipment or is outside the first area while the equipment is stopped, and as absence when the worker is outside both the first area and the second area while the equipment is in operation.
(11) The work classification program according to any one of (8) to (10) above, further having a step (d) of identifying the worker and a step (e) of associating the classification results of the step (c) with each identified worker.
(12) The work classification program according to any one of (8) to (11) above, wherein in the step (b), coordinate values of predetermined joints or of the skeleton of the worker are extracted from the video data, and the position and orientation of the worker are determined from the coordinate values.
(13) The work classification program according to any one of (8) to (12) above, further having a step (f) of recording the video data and a step (g) of linking the classification results of the step (c) with the video data.
(14) The work classification program according to (13) above, wherein the step (g) further links the operating state and the position and orientation of the worker to the classification results and the video data.
According to the present invention, the operating state of the equipment is acquired, the position and orientation of the worker are determined, and setup change work is classified from the combination. Work performed while the equipment is in operation and work performed while the equipment is stopped can thus be distinguished.
FIG. 1 is an explanatory diagram illustrating the functional configuration of the work classification system of the embodiment.
FIGS. 2a and 2b are explanatory diagrams illustrating the position and orientation of a worker with respect to the equipment.
FIG. 3 is a classification judgment table for classifying work.
FIG. 4 is a time chart displaying each piece of information and the classification results.
FIG. 5 is a pie chart showing the work classification results as time ratios.
FIG. 6 is a table showing the work classification results together with the times and work durations.
FIG. 7 is a diagram showing an example of a reproduced video.
FIG. 8 is a table showing the work contents, the acquired information, and the classification results in detail.
FIG. 9 is a flowchart showing the processing procedure of work classification.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the description of the drawings, the same elements are given the same reference numerals and duplicate descriptions are omitted. The dimensional ratios in the drawings are exaggerated for convenience of explanation and may differ from the actual ratios.
FIG. 1 is an explanatory diagram of the functional configuration of the work classification system of the embodiment.

The work classification system 1 has a server 10, a camera 20 connected to the server 10, and a terminal 18 connected to the server 10.

The server 10 has, as its functions, a worker identification unit 11, an equipment operating state acquisition unit 12, a video recording unit 13, a worker position/orientation determination unit 14, a work classification unit 15, a product information acquisition unit 16, and a linking unit 17.

The server 10 is a computer and has a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), an HDD (Hard Disk Drive), and interfaces (IF) for external connections. The server 10 also has input devices such as a keyboard and a mouse, and an output device such as a display.

In the server 10, each function is realized by the CPU reading and executing a program stored in the ROM or the HDD.

In the server 10, the interfaces communicate with the equipment 50, the camera 20, and the terminal 18. The interfaces are, for example, network interfaces conforming to standards such as Ethernet (registered trademark), SATA, PCI Express, USB, and IEEE 1394, and various local connection interfaces such as wireless communication interfaces conforming to Bluetooth (registered trademark) and IEEE 802.11. An interface may also be a line for a local analog connection with the equipment 50 and the camera 20, or a wireless LAN (WiFi) or mobile phone line for connecting to the terminal 18.

The terminal 18 is a computer, for example a personal computer (PC), a tablet, or a smartphone, used by workers, managers, and other interested parties. These computers have input and output devices such as keyboards, mice, touch panels, and displays.

Each function of the server 10 will now be described.
The worker identification unit 11 individually recognizes a worker 30 who has entered the work area (described in detail later). The worker 30 is identified, for example, using an RFID (Radio Frequency Identifier) tag 31 or biometric authentication such as face authentication, iris authentication, or fingerprint authentication. When the RFID tag 31 is used, for example, the worker 30 carries the tag or it is attached to work clothes, a hat, a helmet, or the like, and a reading device that reads the RFID tag 31 is installed in the work area. The worker identification unit 11 then has the reading device read the RFID tag 31 to acquire the identification information of each worker. Likewise, when biometric authentication such as iris authentication is used, the worker identification unit 11 identifies the worker 30 using the corresponding authentication sensors or cameras.

The equipment operating state acquisition unit 12 acquires the operating state of the equipment 50. The operating states of the equipment 50 are, for example, operation under normal production, a completion stop after machining has finished, an abnormal stop when an abnormality has occurred, and a stop while the equipment 50 is not in use.

The equipment operating state acquisition unit 12 acquires the operating state of the equipment by, for example, detecting the emission color of a stacked signal light 51 mounted on the equipment 50 or detecting a switching signal. When the emission color or the switching signal is detected, a dedicated sensor (not shown) is used, and the equipment operating state acquisition unit 12 (server 10) is connected to that sensor.

The stacked signal light 51, also called a three-color tower light or a signal tower, typically has green, yellow, and red lamps that light according to the operating state of the equipment 50: usually green during normal operation, yellow when stopped upon completion of production, and red when an abnormality has occurred.

When the equipment 50 outputs a signal indicating its operating state, the equipment operating state acquisition unit 12 may acquire that signal instead. In this case, the equipment operating state acquisition unit 12 (server 10) is connected to the equipment 50.

The equipment operating state acquisition unit 12 may also acquire the operating state from the video data of the camera 20. In this case, the stacked signal light 51 is placed within the shooting range of the camera 20, and the equipment operating state acquisition unit 12 detects the emission color of the stacked signal light 51 from the video data.
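By way of illustration only, the color detection from the video data can be sketched in Python with OpenCV as follows. The region of interest covering the stacked signal light 51 and the HSV thresholds are assumed values that would have to be calibrated per installation; they are not specified by the embodiment.

```python
import cv2
import numpy as np

# Hypothetical region of interest (x, y, w, h) covering the stacked
# signal light 51 in the frame; calibrated per installation.
LAMP_ROI = (600, 40, 30, 90)

# Assumed HSV ranges for the three lamp colors (OpenCV hue is 0-179).
HSV_RANGES = {
    "green":  ((45, 80, 80), (85, 255, 255)),
    "yellow": ((20, 80, 80), (35, 255, 255)),
    "red":    ((0, 80, 80), (10, 255, 255)),
}

def lamp_state(frame_bgr: np.ndarray) -> str:
    """Return 'green', 'yellow', 'red', or 'off' for one video frame."""
    x, y, w, h = LAMP_ROI
    roi = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    best, best_count = "off", 0
    for color, (lo, hi) in HSV_RANGES.items():
        mask = cv2.inRange(roi, np.array(lo, np.uint8), np.array(hi, np.uint8))
        count = cv2.countNonZero(mask)
        if count > best_count:
            best, best_count = color, count
    # Require a minimum fraction of lit pixels before accepting a color.
    return best if best_count > 0.1 * w * h else "off"
```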
The video recording unit 13 records the video data from the camera 20 together with time. The video recording unit 13 is, for example, the HDD in the server 10.

The worker position/orientation determination unit 14 detects the position and orientation of the worker 30 from the video data.

The camera 20 has a shooting range 25 that includes the work area in which the worker 30 works on the equipment 50. As already described, when the operating state of the equipment 50 is detected from the video data of the camera 20, the stacked signal light 51 is included in the shooting range 25.

Existing technology can be used to detect the position and orientation of the worker 30. For example, technologies that estimate a human skeleton from two-dimensional video data (moving images) and recognize the posture (hereinafter, skeleton recognition technology) can be used, such as OpenPose (see https://github.com/CMU-Perceptual-Computing-Lab/openpose) and DeepPose (see https://www.slideshare.net/mitmul/deeppose-human-pose-estimation-via-deep-neural-networks). Other skeleton recognition technologies may also be used, such as those based on a depth camera (RGB-D camera) or TOF (Time of Flight), for example Microsoft Kinect (registered trademark).

Because skeleton recognition technology can recognize a person's posture from two-dimensional video data, a general movie camera can be used as the camera 20, which simplifies the system configuration.
FIGS. 2a and 2b are explanatory diagrams of the position and orientation of the worker 30 with respect to the equipment 50. FIG. 2a shows an example in which the equipment 50 is installed standalone, such as an automatic processing machine 50a; FIG. 2b shows an example in which the equipment 50 is a production line 50b for assembly-line work.

When the position of the worker 30 is detected using skeleton recognition technology, a first area 41 and a second area 42 are set within the shooting range 25, as shown in FIGS. 2a and 2b. The first area 41 is a predetermined range in the vicinity of the equipment, for example the range within which the worker 30 can directly work on the equipment 50 or on a product set in it; concretely, the range the worker 30 can reach by hand to the automatic processing machine 50a or the production line 50b and the products set in them. The second area 42 is the area outside the first area 41. The first area 41 and the second area 42 are set so as to be mutually exclusive.

The shooting range 25 is set to include at least the first area 41 and the second area 42. In FIGS. 2a and 2b, the area beyond the second area 42 is also captured.

The worker position/orientation determination unit 14 determines from the video data whether the worker 30 is in the first area 41, in the second area 42, or in neither area (absent).

In skeleton recognition technology, a person in the video is specified by three-dimensional coordinate values. The three-dimensional coordinate system and its origin are set in advance, and the coordinate values defining the ranges of the first area 41 and the second area 42 are also set in advance in that coordinate system.

For example, as shown in FIG. 2a, when the equipment 50 is the automatic processing machine 50a, the first area 41 is set to the range in which the worker 30 can access the outer surfaces of the automatic processing machine 50a, such as the operation panel, outer doors, raw material supply port, and access panels (range a in the figure). The second area 42 is set further outward from the outer edge of the first area 41 (range b in the figure).

As shown in FIG. 2b, when the equipment 50 is the production line 50b, the first area 41 is set as a band at a fixed distance parallel to the production line 50b (range a in the figure), the fixed distance being, for example, the range within which the worker 30 can access workpieces on the production line. The second area 42 is set further outward from the outer edge of the first area 41 (range b in the figure).

In either case, the second area 42 is preferably a range from which the worker 30 can immediately move to work, for example the range within 1 m of the outer edge of the first area 41.

Of course, these ranges of the first area 41 and the second area 42 are merely examples and can be set arbitrarily.
The worker position/orientation determination unit 14 first detects the position of the worker 30. It extracts the coordinate values of predetermined joints from the skeleton of the worker 30 obtained by the skeleton recognition technology, and determines in which area the worker 30 is present, or whether the worker is in neither. The predetermined joints are, for example, the neck, shoulders, elbows, wrists, hips, knees, and ankles, and may be decided in advance. When the worker 30 straddles the boundary between the first area 41 and the second area 42, the worker 30 is regarded as being in the area that contains the predetermined joints, for example. In this way it is detected which area, if either, the worker 30 is in. Instead of joint coordinate values, the skeleton coordinate values provided by the skeleton recognition technology may be used.
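As an illustrative sketch of this area determination, the following Python fragment assumes that skeleton recognition already yields joint coordinates projected onto a 2D floor plane and that both areas are axis-aligned rectangles; the area bounds and the choice of the hip as the reference joint are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class Position(Enum):
    FIRST_AREA = "first area"
    SECOND_AREA = "second area"
    ABSENT = "absent"

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Hypothetical area bounds in floor-plane coordinates (meters).
FIRST_AREA = Rect(0.0, 0.0, 4.0, 1.0)    # within reach of the equipment
SECOND_AREA = Rect(-1.0, 0.0, 5.0, 2.0)  # 1 m band around the first area

def worker_position(joints: dict[str, tuple[float, float]]) -> Position:
    """Classify the worker's position from joint coordinates.

    `joints` maps joint names (e.g. "hip", "neck") to (x, y) floor
    coordinates; the hip is used as the reference point here.
    """
    x, y = joints["hip"]
    # Checking the first area first keeps the two areas exclusive even
    # though the rectangles overlap geometrically.
    if FIRST_AREA.contains(x, y):
        return Position.FIRST_AREA
    if SECOND_AREA.contains(x, y):
        return Position.SECOND_AREA
    return Position.ABSENT
```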
Next, the worker position/orientation determination unit 14 detects the orientation of the worker 30. Orientation detection is executed only for workers 30 present in the first area 41.

The worker position/orientation determination unit 14 determines that the direction in which an arm (in particular the wrist joint) protrudes relative to the joint position of the neck or shoulder is the direction the person is facing. In FIGS. 2a and 2b, the worker 30a shown by the solid line is determined to be facing the equipment 50 because both arms protrude toward the equipment 50, whereas the worker 30b shown by the dotted line is determined not to be facing the equipment 50 because both arms point in a direction other than the equipment 50 (diagonally backward in the figures).

In the present embodiment, the worker 30 is determined to be facing the equipment 50 if at least one arm protrudes toward the equipment 50 relative to the neck or shoulder. While working on the equipment 50, the worker 30 may have both arms extended toward the equipment 50, or only one hand: operations on the operation panel, such as changing machining program settings or restarting, can be performed with one hand. For this reason, the present embodiment treats the worker 30 as facing the equipment 50 when at least one arm points toward it.
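The orientation test can be sketched in the same vein, reusing the joint dictionary above. The direction vector toward the equipment and the angular tolerance are assumed values, not figures taken from the embodiment.

```python
import math

EQUIPMENT_DIR = (0.0, -1.0)  # assumed unit vector toward the equipment
MAX_ANGLE_DEG = 60.0         # assumed tolerance for "facing"

def facing_equipment(joints: dict[str, tuple[float, float]]) -> bool:
    """True if at least one arm protrudes toward the equipment.

    The arm direction is taken as the shoulder-to-wrist vector on the
    same side, per the rule that the direction an arm protrudes from
    the neck or shoulder is the facing direction.
    """
    for side in ("left", "right"):
        sx, sy = joints[f"{side}_shoulder"]
        wx, wy = joints[f"{side}_wrist"]
        ax, ay = wx - sx, wy - sy
        norm = math.hypot(ax, ay)
        if norm == 0.0:
            continue
        cos_a = (ax * EQUIPMENT_DIR[0] + ay * EQUIPMENT_DIR[1]) / norm
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle <= MAX_ANGLE_DEG:
            return True  # one arm suffices, as described above
    return False
```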
When a plurality of workers 30 are present in the shooting range 25, the worker position/orientation determination unit 14 detects the position of each worker 30 and, if a plurality of workers 30 are in the first area 41, determines the orientation of each of them.

The worker position and orientation may also be determined using a two-dimensional coordinate system. In that case, for example, the camera 20 is installed at a position where it shoots the worker 30 from above (or from diagonally above), a two-dimensional coordinate system is set in the video data, and the first area 41 and the second area 42 are set in it. The worker position/orientation determination unit 14 then determines whether the worker 30 is present in each area in the same manner as above.
The work classification unit 15 classifies the work being performed on the equipment 50 from the operating state information of the equipment 50 provided by the equipment operating state acquisition unit 12 and the state (position and orientation) information of the worker 30 provided by the worker position/orientation determination unit 14.

FIG. 3 is the classification decision table for classifying work.

First, the work classification unit 15 classifies the state of the worker 30, based on position and orientation, into three categories: equipment work, non-equipment work, and absence.

Equipment work is when a worker 30 in the first area 41 is facing the equipment 50.

Non-equipment work is when a worker 30 in the first area 41 is not facing the equipment 50, or when the worker is in the second area 42. In non-equipment work, the worker 30 is near the equipment 50 and, while not working on it, can start working on it immediately; because work on the equipment 50 can be started at once, this state is classified differently from absence.

Absence is when the worker 30 is present in neither the first area 41 nor the second area 42.
The work classification unit 15 then classifies the work from the combination of the operating state of the equipment 50 from the equipment operating state acquisition unit 12 and the work state of the worker 30.

The work being classified is setup change work. Setup change work is the work performed after the production or processing of one product is finished, in order to change the jigs and tools of the equipment 50 for producing or processing the next product, up until the production or processing of that product is completed. Changing the jigs and tools includes exchanging jigs and tools, changing the equipment programs required for production or processing, and exchanging or attaching additional parts required for the product to be produced or processed.

The setup change work is classified into the following four categories: internal setup work, external setup work, waste, and absence.
Internal setup work is work performed on the equipment 50 while the equipment 50 is stopped, for example exchanging jigs and tools, changing the equipment program, attaching additional parts, and setting an unprocessed product in the equipment 50. These operations are performed while the equipment 50 is stopped. Therefore, the combination of a stopped equipment state (yellow, red, or lights off) and equipment work is classified as internal setup work.

External setup work is work performed while the equipment 50 is in operation, for example preparation such as readying the jigs, tools, and additional parts for the production or processing of the next product, carrying out products whose production or processing has been completed, and tidying-up work such as clearing and cleaning around the equipment 50. It is efficient to perform these operations while the equipment 50 is operating, that is, while the previous product is being produced or processed. Therefore, the combination of an operating equipment state (green) and non-equipment work is classified as external setup work.

Waste is a state in which work is being performed on something other than the equipment 50 even though the equipment 50 is stopped, for example searching for the jigs and tools, or fetching the additional parts, needed for the next product to be produced or processed. Therefore, non-equipment work while the equipment is stopped (yellow, red, or lights off) is classified as waste.

Absence is when the worker 30 is present in neither the first area 41 nor the second area 42 while the equipment is operating. While the equipment is operating, the worker 30 does not normally need to be near the equipment 50, so this is a normal state and is simply classified as absence.
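The decision logic of FIG. 3 then reduces to a small lookup. The sketch below combines the lamp state and the worker state from the earlier fragments; because the table as described does not assign a setup category to the combination of operating equipment and equipment work, the function returns None for that case, which is an assumption of this sketch.

```python
from enum import Enum

class WorkerState(Enum):
    EQUIPMENT_WORK = "equipment work"       # in first area, facing equipment
    NON_EQUIPMENT_WORK = "non-equipment work"
    ABSENT = "absent"

class Category(Enum):
    INTERNAL_SETUP = "internal setup work"
    EXTERNAL_SETUP = "external setup work"
    WASTE = "waste"
    ABSENT = "absent"

def classify(lamp: str, worker: WorkerState) -> Category | None:
    """Apply the decision table: green means operating; yellow, red,
    or lights off mean stopped."""
    operating = lamp == "green"
    if not operating:
        # Stopped equipment: facing work is internal setup; anything
        # else (not facing, or away from the first area) is waste.
        if worker is WorkerState.EQUIPMENT_WORK:
            return Category.INTERNAL_SETUP
        return Category.WASTE
    if worker is WorkerState.NON_EQUIPMENT_WORK:
        return Category.EXTERNAL_SETUP
    if worker is WorkerState.ABSENT:
        return Category.ABSENT
    return None  # operating + equipment work: not covered by the table
```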
The work classification unit 15 also records the start and end times of each classified work. Each classified work starts at the time it is classified and ends at the time the classification switches to the next one. Concretely, times are recorded at the following moments: when the worker 30 is in the first area 41, the time at which the orientation changes; when the worker 30 is in the second area 42, the time at which the worker enters the first area 41 from the second area 42 or leaves the second area 42; when the worker 30 is absent, the time at which the worker enters the second area 42; and when there is no change in the position and orientation of the worker 30, the time at which the operating state of the equipment 50 changes.
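These transition rules amount to opening a new time segment whenever the classification result changes. A minimal sketch, reusing Category from the previous fragment and assuming the classification is evaluated once per analyzed frame with its timestamp:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    category: Category  # from the classification sketch above
    start: float        # epoch seconds
    end: float | None = None

class SegmentRecorder:
    """Open a new segment whenever the classification result changes."""

    def __init__(self) -> None:
        self.segments: list[Segment] = []

    def update(self, t: float, category: Category) -> None:
        if self.segments and self.segments[-1].category is category:
            return  # no transition: the current segment continues
        if self.segments:
            self.segments[-1].end = t  # close the previous segment
        self.segments.append(Segment(category, start=t))
```

Fed once per frame, update() closes a segment exactly when the classification switches, which reproduces the start and end times described above.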
The work classification unit 15 also records the classification results in association with each worker 30 identified by the worker identification unit 11. The classification results associated with each worker 30 are recorded, for example, on the HDD of the server 10.

The product information acquisition unit 16 acquires product information, that is, information on the products produced or processed by the equipment 50. The product information is acquired, for example, when an unprocessed product arrives at the equipment 50, and is recorded together with the time at that point; the time of completion of processing is also recorded. The product information may be read, for example, from a barcode or RFID tag 31 attached to the product, may be input by the worker 30 from the equipment 50 or a terminal installed near the equipment 50, or may be called up in order from a product list stored on the server 10 or another computer.

The linking unit 17 links the worker information from the worker identification unit 11, the operating state of the equipment 50 from the equipment operating state acquisition unit 12 (referred to as equipment operation information), the product information from the product information acquisition unit 16, the work classification results from the work classification unit 15, and the video data recorded in the video recording unit 13. The linking unit 17 links each item of information, the classification results, and the video data on the basis of time.
FIG. 4 is a time chart displaying each item of information and the classification results. In the figures, "稼働" indicates that the equipment 50 is in operation, "対" indicates equipment work, "非" indicates non-equipment work, "内" indicates internal setup work, and "外" indicates external setup work; the same applies to the other figures.

As shown in FIG. 4, presenting the product information, the equipment operating state, the work state, and the classification results as a time chart makes it easy to understand how these proceeded over time.

FIG. 5 is a pie chart showing the work classification results as time ratios.

As shown in FIG. 5, presenting the classification results as a pie chart of time ratios makes it easy to see what proportion of time each classified work occupied.
FIG. 6 is a table showing the work classification results together with times and working hours.

As shown in FIG. 6, presenting the work classification results together with the classified times and the working hours makes the start and end times of each work, and the working time it required, easy to understand. This table also displays playback icons for playing back the video data; each playback icon is linked, by time, to the video data recorded in the video recording unit 13.

FIG. 7 is a diagram showing an example of reproduced video. For example, when the playback icon indicated by the dotted circle 300 in FIG. 6 is clicked, the video shown in FIG. 7 is played back. This video shows the worker 30 working on the automatic processing machine 50a.

Playback of the video data is not limited to a table with playback icons as in FIG. 6. For example, the video data may also be played back by clicking the individual items of information or the classification result portions in the time chart shown in FIG. 4.

The video data played back covers, for example, the period from the start to the end of the designated work classification, and may also include some time before and after that period.

Furthermore, the video data may be played back from an arbitrary time regardless of the work classification. In that case, it is preferable to display the classification result corresponding to the video data being played, switching the displayed classification result when the playback crosses a classification boundary.
FIG. 8 is a table showing the work contents, the acquired information, and the classification results in detail.

As shown in FIG. 8, presenting the work contents, the acquired information, and the classification results in detail makes it possible to see how each piece of collected information was classified. Because the work contents are linked to each item of information and each classification result, the table also shows concretely what work was performed. Furthermore, since the workers 30 performing the work are displayed individually, it is possible to see how much time each worker 30 required for which work.

Such detailed information is useful when drawing up improvement plans and production plans for shortening takt time. In the table shown in FIG. 8 as well, each item of information and each classification result may be linked to the video data so that clicking them plays back the video.

The display examples shown in FIGS. 4 to 8 output the information linked by the linking unit 17 on a time basis or as time ratios. These outputs are produced, for example, in response to a request from the terminal 18 and are displayed on the display of the terminal 18.
Next, the processing procedure for work classification will be described. FIG. 9 is a flowchart showing the processing procedure of work classification. By executing a work classification program created based on this processing procedure, the server 10 performs the functions of the units described above.
First, the server 10 identifies the worker 30 (S1).

Next, the server 10 acquires the operating state of the equipment 50 (S2).

Next, the server 10 acquires video data from the camera 20 (S3).

Next, the server 10 acquires the product information (S4).

Next, the server 10 detects the position and orientation of the worker 30 from the video data (S5).

Next, the server 10 classifies the work from the acquired information (S6).

Next, the server 10 links the acquired information, the classification results, and the video data (S7).

Finally, if an instruction to end processing has been input (S8: YES), the server 10 ends this processing; otherwise (S8: NO), it returns to S1 and continues the processing.
In the above procedure, steps S1 to S4 may be executed in any order, provided that S3 is processed before S5 and that S1 to S5 are processed before S6.
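As an overall illustration, steps S2, S3, and S5 to S7 can be condensed into a per-frame loop that chains the fragments above. Here extract_joints is a hypothetical stand-in for the skeleton recognition step (for example OpenPose), and worker identification (S1) and product information acquisition (S4) are omitted for brevity.

```python
import time
import cv2

def extract_joints(frame) -> dict[str, tuple[float, float]]:
    """Placeholder for skeleton recognition (e.g. OpenPose) followed by
    projection to floor coordinates; not implemented in this sketch."""
    raise NotImplementedError

def run(video_source: int = 0) -> None:
    cap = cv2.VideoCapture(video_source)  # camera 20
    recorder = SegmentRecorder()
    try:
        while True:
            ok, frame = cap.read()        # S3: acquire video data
            if not ok:
                break
            t = time.time()
            lamp = lamp_state(frame)      # S2: equipment operating state
            joints = extract_joints(frame)
            position = worker_position(joints)  # S5: position and orientation
            if position is Position.FIRST_AREA and facing_equipment(joints):
                worker = WorkerState.EQUIPMENT_WORK
            elif position is Position.ABSENT:
                worker = WorkerState.ABSENT
            else:
                worker = WorkerState.NON_EQUIPMENT_WORK
            category = classify(lamp, worker)   # S6: classification
            if category is not None:
                recorder.update(t, category)    # S7: time-based linking
    finally:
        cap.release()
```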
The present embodiment described above provides the following effects.

The present embodiment acquires the operating state of the equipment 50 and the work state of the worker 30 and, according to their combination, classifies setup change work into four categories: internal setup work, external setup work, waste, and absence. The embodiment can thereby distinguish work performed while the equipment 50 is operating, work performed while the equipment 50 is stopped, and activity unrelated to the equipment 50. Work that previously had to be classified manually can thus be classified automatically, making the classification far more efficient.

In the present embodiment, the acquired information, the classification results, and the video data are also synchronized and linked by time. The linked information and classification results can therefore be displayed as time charts, graphs, tables, and the like, and because the related video data is linked as well, the video corresponding to each item of information and each classification result can easily be played back. In particular, linking the video data makes the work situation visible and improvement analysis more efficient. From the analysis results, setup change work can be shortened, reducing the downtime of the equipment 50 and improving productivity.
Although an embodiment of the present invention has been described above, various modifications are possible. The embodiment has been described in terms of the functions of the server 10, but these functions can also be performed by a small computer. For example, a board PC or the like may be incorporated into, or integrated with, the camera 20 so that each function described in the embodiment is executed within the camera 20. In this way, the present embodiment can be implemented merely by installing the camera 20.

A cloud server on the Internet may also be used as the server 10. In that case, the equipment 50 and the camera 20 are provided with interfaces capable of connecting to the Internet, and the operating state of the equipment 50 and the video data of the camera 20 are transmitted to the cloud server; the connections to the Internet may be wireless or wired. With a cloud server, the initial investment can be kept low, because an existing Internet-capable webcam can be used as the camera 20 and only an Internet connection interface needs to be added to the equipment 50. Moreover, if the operating state of the equipment 50 is obtained from the emission color of the stacked signal light 51 in the video data, only the webcam needs to be installed, reducing the initial investment further.

The position and orientation of the worker 30 may also be detected with sensors other than skeleton recognition technology. For example, a technology that estimates a person's posture from the distance values to each part of the body obtained by LIDAR (Light Detection and Ranging) can be used, as can a technology that determines a person's orientation from the temperature of the face using an infrared sensor (thermal camera).

In the embodiment, the worker identification unit 11 recognizes each worker 30 individually, but if the classification results do not need to be associated with each worker 30, individual workers 30 need not be identified and the worker identification unit 11 may be omitted. Without the worker identification unit 11, the RFID tags 31 and their reading device, the devices for biometric authentication, and so on naturally become unnecessary, reducing the capital investment.

The operating state of the equipment 50 may also be acquired from sound, when the equipment 50 is fitted with an acoustic annunciator that signals operation, abnormal stops, and the like.

The stacked signal light 51 is not limited to three colors; two-color and one-color stacked signal lights 51 can also be used. For example, a two-color stacked signal light 51 may indicate normal operation in green and abnormal stop in red; in that case, green means operating and red or lights off means stopped, and processing is performed accordingly. The same applies to one-color lights and to stacked signal lights 51 with four or five colors.

The embodiment also provides the second area 42. Providing the second area 42 makes it possible to tell that the worker 30, while not working on the equipment 50, is in a position to start work immediately. However, work classification can also be carried out with the first area 41 alone. When only the first area 41 is set, a worker 30 in the first area 41 who is not facing the equipment 50 is treated as non-equipment work, and a worker 30 not in the first area 41 is treated as absent. The classification in this case is therefore external setup work when the equipment is operating and the worker 30 is in the first area 41 but not facing the equipment 50, and absence when the equipment is operating and the worker 30 is not in the first area 41. The other classifications are the same as those shown in FIG. 3.

The conditions and numerical values used in the description of the embodiment are for explanation only, and the present invention is not limited to them.

The work classification program according to the present invention can also be realized by a dedicated hardware circuit. The work classification program may be provided on a computer-readable recording medium such as a USB (Universal Serial Bus) memory or a DVD (Digital Versatile Disc)-ROM (Read Only Memory), or provided online over a network such as the Internet without a recording medium. In that case, the work classification program is usually stored in a magnetic disk device or the like constituting a storage unit. The work classification program can be provided as standalone application software or incorporated into other software as one of its functions.
The present invention can be modified in various ways based on the configurations described in the claims, and such modifications also fall within the scope of the present invention.
This application is based on Japanese Patent Application No. 2019-128731 filed on July 10, 2019, the disclosure of which is incorporated herein by reference in its entirety.
1 work classification system,
10 server,
11 worker identification unit,
12 equipment operating state acquisition unit,
13 video recording unit,
14 worker position/orientation determination unit,
15 work classification unit,
16 product information acquisition unit,
17 linking unit,
18 terminal,
20 camera,
25 shooting range,
30, 30a, 30b worker,
41 first area,
42 second area,
50 equipment,
50a automatic processing machine,
50b production line,
51 stacked signal light.

Claims (14)

  1.  A work classification system comprising:
     an equipment operating state acquisition unit that acquires an operating state of equipment;
     a camera that shoots a range including a worker;
     a worker position/orientation determination unit that determines a position and an orientation of the worker from video data shot by the camera; and
     a work classification unit that classifies setup change work from a combination of the operating state of the equipment and the position and orientation of the worker.
  2.  The work classification system according to claim 1, wherein
     the setup change work is work performed when one product is produced or processed,
     a predetermined range within the shooting range of the camera and in the vicinity of the equipment is set as a first area, and
     the work classification unit classifies the setup change work as internal setup work when the equipment is stopped and the worker is in the first area and facing the equipment, as external setup work when the equipment is in operation and the worker is in the first area but not facing the equipment, as waste when the equipment is stopped and the worker is in the first area but not facing the equipment or is outside the first area, and as absence when the equipment is in operation and the worker is outside the first area.
  3.  The work classification system according to claim 1, wherein
     the setup change work is work performed when one product is produced or processed,
     a predetermined range within the shooting range of the camera and in the vicinity of the equipment is set as a first area and a range outside the first area is set as a second area, and
     the work classification unit classifies the setup change work as internal setup work when the equipment is stopped and the worker is in the first area and facing the equipment, as external setup work when the equipment is in operation and the worker is in the first area but not facing the equipment or is in the second area, as waste when the equipment is stopped and the worker is in the first area but not facing the equipment or is outside the first area, and as absence when the equipment is in operation and the worker is outside both the first area and the second area.
  4.  The work classification system according to any one of claims 1 to 3, further comprising a worker identification unit that identifies the worker, wherein
     the work classification unit associates the classification results with each worker identified by the worker identification unit.
  5.  The work classification system according to any one of claims 1 to 4, wherein the worker position/orientation determination unit extracts coordinate values of predetermined joints or a skeleton of the worker from the video data and determines the position and orientation of the worker from the coordinate values.
  6.  The work classification system according to any one of claims 1 to 5, further comprising:
     a video recording unit that records the video data; and
     a linking unit that links the classification results classified by the work classification unit with the video data.
  7.  The work classification system according to claim 6, wherein the linking unit further links the operating state acquired by the equipment operating state acquisition unit and the position and orientation of the worker determined by the worker position/orientation determination unit to the classification results and the video data.
  8.  A work classification program for causing a computer to execute:
     a step (a) of acquiring an operating state of equipment;
     a step (b) of determining a position and an orientation of a worker from video data in which a range including the worker is shot; and
     a step (c) of classifying setup change work from a combination of the operating state of the equipment and the position and orientation of the worker.
  9.  The work classification program according to claim 8, wherein
     the setup change work is work performed when one product is produced or processed, and
     in the step (c), a predetermined range within the range in which the video data is shot and in the vicinity of the equipment is set as a first area, and the setup change work is classified as internal setup work when the equipment is stopped and the worker is in the first area and facing the equipment, as external setup work when the equipment is in operation and the worker is in the first area but not facing the equipment, as waste when the equipment is stopped and the worker is in the first area but not facing the equipment or is outside the first area, and as absence when the equipment is in operation and the worker is outside the first area.
  10.  The work classification program according to claim 8, wherein
     the setup change work is work performed when one product is produced or processed, and
     in the step (c), a predetermined range within the range in which the video data is shot and in the vicinity of the equipment is set as a first area and a range outside the first area is set as a second area, and the setup change work is classified as internal setup work when the equipment is stopped and the worker is in the first area and facing the equipment, as external setup work when the equipment is in operation and the worker is in the first area but not facing the equipment or is in the second area, as waste when the equipment is stopped and the worker is in the first area but not facing the equipment or is outside the first area, and as absence when the equipment is in operation and the worker is outside both the first area and the second area.
  11.  The work classification program according to any one of claims 8 to 10, further causing the computer to execute:
     a step (d) of identifying the worker; and
     a step (e) of associating the classification results classified in the step (c) with each identified worker.
  12.  The work classification program according to any one of claims 8 to 11, wherein the step (b) extracts coordinate values of predetermined joints or a skeleton of the worker from the video data and determines the position and orientation of the worker from the coordinate values.
  13.  The work classification program according to any one of claims 8 to 12, further causing the computer to execute:
     a step (f) of recording the video data; and
     a step (g) of linking the classification results classified in the step (c) with the video data.
  14.  The work classification program according to claim 13, wherein the step (g) further links the operating state and the position and orientation of the worker to the classification results and the video data.
PCT/JP2020/026072 2019-07-10 2020-07-02 Task classification system and task classification program WO2021006183A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021530663A JP7347509B2 (en) 2019-07-10 2020-07-02 Work classification systems and work classification programs

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019128731 2019-07-10
JP2019-128731 2019-07-10

Publications (1)

Publication Number Publication Date
WO2021006183A1 true WO2021006183A1 (en) 2021-01-14

Family

ID=74115291

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/026072 WO2021006183A1 (en) 2019-07-10 2020-07-02 Task classification system and task classification program

Country Status (2)

Country Link
JP (1) JP7347509B2 (en)
WO (1) WO2021006183A1 (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020021451A (en) * 2018-07-18 2020-02-06 コニカミノルタ株式会社 Facility utilization rate calculation system and facility utilization rate calculation program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009157517A (en) * 2007-12-25 2009-07-16 Shibuya Kogyo Co Ltd Production management system
JP2012203770A (en) * 2011-03-28 2012-10-22 Hitachi Chem Co Ltd Work analysis system
JP2013073279A (en) * 2011-09-26 2013-04-22 Omron Corp Data processor, data processing system, and data processing method
WO2018138925A1 (en) * 2017-01-30 2018-08-02 三菱電機株式会社 Data processing device and data processing method
JP2019023803A (en) * 2017-07-24 2019-02-14 株式会社日立製作所 Work Improvement Support System and Method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024121991A1 (en) * 2022-12-07 2024-06-13 ファナック株式会社 Management device

Also Published As

Publication number Publication date
JPWO2021006183A1 (en) 2021-01-14
JP7347509B2 (en) 2023-09-20

Similar Documents

Publication Publication Date Title
JP7074965B2 (en) Manufacturing control based on internal personal location identification in the metal processing industry
JP4752721B2 (en) Movement pattern identification device, movement pattern identification method, movement pattern identification program, and recording medium recording the same
JP6551531B2 (en) Manufacturing status display system, manufacturing status display method, and manufacturing status display program
JP7119532B2 (en) Productivity Improvement Support System and Productivity Improvement Support Program
WO2014136941A1 (en) Control system, control device, image processing device, and control method
JP6855801B2 (en) Anomaly detection system, anomaly detection device, anomaly detection method and program
CN109407544B (en) System module of simulation machine operation picture of non-invasive data extraction system
JP7414797B2 (en) Manufacturing control method
WO2020250498A1 (en) Information processing device, information processing method, information processing program, and recording medium
WO2021006183A1 (en) Task classification system and task classification program
JP2015225630A (en) Work management device, work management system, and work management method
CN105279591B (en) Man-machine interaction system supporting single-person flow operation instruction and verification
JP7139987B2 (en) Process information acquisition system, process information acquisition method, and process information acquisition program
WO2020246082A1 (en) Work monitoring device and work monitoring method
JP2021163293A (en) Work analyzer and work analysis program
JP7018858B2 (en) Work management system, work management server, and work management method
JP2014174653A (en) Operation result acquisition system and operation result acquisition method
JP6806148B2 (en) Working time analysis system, working time analysis program, and working time analysis method
JP7424825B2 (en) I/O signal information display system
JP2021022232A (en) Production result recording system and production result recording program
CN114066799A (en) Detection system and detection method
WO2021171702A1 (en) Person determining system, and person determining program
JP2020068009A (en) Operation detection device, operation detection method, and operation detection system
JP2021511594A (en) Automatic inspection and parts registration
WO2023095329A1 (en) Movement evaluation system, movement evaluation method, and non-transitory computer-readable medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20836740

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021530663

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 20836740

Country of ref document: EP

Kind code of ref document: A1