WO2020246082A1 - Work monitoring device and work monitoring method - Google Patents

Work monitoring device and work monitoring method

Info

Publication number
WO2020246082A1
Authority
WO
WIPO (PCT)
Prior art keywords
work
image data
worker
determination result
result table
Application number
PCT/JP2020/007755
Other languages
French (fr)
Japanese (ja)
Inventor
卓馬 寺田
洋登 永吉
直一 根本
宏樹 内山
日野 一彦
雅史 佐藤
Original Assignee
Hitachi, Ltd. (株式会社日立製作所)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Hitachi, Ltd. (株式会社日立製作所)
Publication of WO2020246082A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion

Definitions

  • The present invention relates to a work monitoring device and a work monitoring method.
  • Patent Document 1 describes a motion recognition method for a moving body by which a robot recognizes the posture, motion, or behavior of a person.
  • In this method, eigenspace data A, in which the frame image data A for each basic motion of a moving body A is represented by points, is created in advance and stored in a database, and eigenspace data B, in which the frame image data B of a target moving body B is represented by points, is compared with the eigenspace data A for each basic motion; the eigenspace data A closest to the eigenspace data B is selected, and the motion of the moving body B is thereby recognized.
  • Each frame image data A for each basic motion is obtained by having the moving body A perform the basic motion, photographing the moving body A from multiple directions using a plurality of image input means, and, for each image input means, weighting the consecutively acquired frame images and superimposing them into a compressed image.
  • Patent Document 2 describes a motion recognition system that recognizes the shape and motion of a hand in a user interface based on human gestures.
  • The motion recognition system recognizes the shape and motion of a specific target by processing time-series image data containing images of the target: it extracts moving regions from the time-series image data, extracts regions containing a color that characterizes the target, and extracts as the target region a region that is both a moving region and a region containing the characteristic color.
  • Non-Patent Document 1 describes various behavior recognition technologies that utilize sensors and cameras.
  • Non-Patent Document 2 describes that motion recognition is performed using "Zero Shot Learning" that predicts labels of data that do not appear in the training data.
  • For example, the technique described in Patent Document 1 generates eigenspace data in advance from frame images for each basic motion and performs motion recognition by comparing distances between the motion features in the eigenspace. A large amount of data must be prepared in advance for the basic motions, which places a heavy burden on workers at the site when the system is introduced.
  • The technique described in Patent Document 2, on the other hand, performs motion recognition by using time-series image data to extract a target region based on moving regions and regions containing a color that characterizes the recognition target, and then analyzing its shape; it requires no data in advance. However, because this technique recognizes motion from shape, it cannot accurately recognize the position of the working hand or information about its surroundings.
  • The technique described in Non-Patent Document 2 predicts labels for data that does not appear in the training data, but the data needed to predict the recognition target must still be prepared at the site.
  • The present invention has been made against this background, and its purpose is to provide a work monitoring device and a work monitoring method capable of accurately monitoring the work performed by a worker without preparing a large amount of data in advance.
  • One aspect of the present invention for achieving the above object is a work monitoring device configured using an information processing device having a processor and a memory, comprising: an image data acquisition unit that acquires image data showing a worker performing work; an object position detection unit that acquires information indicating the position, within the image data, of each of a plurality of objects reflected in the image data; a determination result table generation unit that generates a determination result table containing information indicating the result of determining whether each of one or more determination target areas set in the image data contains the position of each of the plurality of objects; a storage unit that stores determination result pattern/work content correspondence information, which associates the determination result table generated from the image data with work content information indicating the work shown in the image data; and a monitoring processing unit that generates the determination result table for newly acquired image data and identifies the work performed by the worker by comparing the generated determination result table with the determination result pattern/work content correspondence information.
  • FIG. 1 shows the schematic configuration of the work monitoring system; FIG. 2 shows a hardware configuration example of the work monitoring device; FIG. 3 shows the main functions of the work monitoring device.
  • In FIG. 4, (a) is an example of image data, (b) is a diagram drawn by extracting the positions of the markers reflected in the image data and the determination target areas, and (c) is a determination result table generated based on the image data.
  • In FIG. 5, (a) to (c) are examples of image data and the determination result patterns corresponding to that image data. FIG. 6 is a flowchart explaining the main process; FIG. 7, the marker color setting process; FIG. 8, the determination target area setting process; and FIG. 9, the correspondence information registration process.
  • FIG. 1 shows a schematic configuration of the work monitoring system 1 described as the first embodiment.
  • The work monitoring system 1 includes an image acquisition device 3, various sensors 4, and a work monitoring device 100, which is an information processing device.
  • The image acquisition device 3, the various sensors 4, and the work monitoring device 100 are communicably connected via wired or wireless communication means 5.
  • The configuration of the communication means 5 is not particularly limited; for example, it may be a communication means compliant with various communication standards such as USB (Universal Serial Bus) or RS-232C, or a LAN (Local Area Network), a WAN (Wide Area Network), the Internet, a dedicated line, or the like.
  • The image acquisition device 3 is a device that acquires image data of the worker 2 and the worker 2's surroundings, for example a camera (digital camera) that acquires (captures) moving image or still image data. Other examples include an infrared camera (thermography camera), a TOF (Time Of Flight) camera, and a stereo camera.
  • Various sensors 4 are provided in the work environment in which the worker 2 works, and output physical information about the worker 2 and the work environment.
  • The various sensors 4 include, for example, a motion detection sensor, a human presence sensor, a temperature sensor, a humidity sensor, an acceleration sensor, a speed sensor, an acoustic sensor (microphone), an ultrasonic sensor, a vibration sensor, a millimeter wave radar, a laser radar (LIDAR: Laser Imaging Detection and Ranging), and an infrared depth sensor.
  • The work monitoring device 100 performs processing related to monitoring the work of the worker 2 based on the image data acquired by the image acquisition device 3.
  • FIG. 2 shows an example of the hardware configuration of the work monitoring device 100.
  • The work monitoring device 100 is an information processing device (computer) and includes a processor 11, a main storage device 12, an auxiliary storage device 13, an input device 14, an output device 15, and a communication device 16.
  • The processor 11 is a device that performs arithmetic processing, for example a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), or an AI (Artificial Intelligence) chip.
  • The main storage device 12 is a device that stores programs and data, for example a ROM (Read Only Memory) (SRAM (Static Random Access Memory), NVRAM (Non-Volatile RAM), mask ROM (Mask Read Only Memory), PROM (Programmable ROM), etc.) or a RAM (Random Access Memory) (DRAM (Dynamic Random Access Memory), etc.).
  • The auxiliary storage device 13 is, for example, a hard disk drive, a flash memory, an SSD (Solid State Drive), or an optical storage device (CD (Compact Disc), DVD (Digital Versatile Disc), etc.).
  • The programs and data stored in the auxiliary storage device 13 are read into the main storage device 12 as needed.
  • The input device 14 is a user interface that receives information from the user, for example a keyboard, a mouse, a card reader, or a touch panel.
  • The output device 15 is a user interface that outputs various information (display output, audio output, print output, etc.), for example a display device that visualizes information (LCD (Liquid Crystal Display), graphics card, etc.), an audio output device (speaker), or a printing device.
  • The communication device 16 is a communication interface that communicates with other devices via the communication means 5, for example a NIC (Network Interface Card), a wireless communication module, a USB (Universal Serial Bus) module, or a serial communication module.
  • The communication device 16 can also function as an input device that receives information from other communicably connected devices.
  • The communication device 16 can also function as an output device that transmits information to other communicably connected devices.
  • The work monitoring device 100 communicates with the image acquisition device 3 and the various sensors 4 via the communication means 5 using the communication device 16.
  • The various functions of the work monitoring device 100 are realized by the processor 11 reading and executing programs stored in the main storage device 12, or by hardware constituting the work monitoring device 100 (FPGA, ASIC, AI chip, etc.).
  • FIG. 3 shows the main functions of the work monitoring device 100.
  • The work monitoring device 100 includes a storage unit 110, a data acquisition unit 120, a preparation setting processing unit 130, a determination result table generation unit 140, and a monitoring processing unit 150.
  • The work monitoring device 100 may further include functions such as an operating system, device drivers, a file system, and a DBMS (DataBase Management System).
  • The storage unit 110 stores image data 111, sensor data 112, marker color data 113, determination target area data 114, a determination result table 115, and determination result pattern/work content correspondence information 116.
  • The storage unit 110 stores this information (data) as, for example, database tables provided by the DBMS or files provided by the file system.
  • The image data 111 is data acquired from the image acquisition device 3, for example still image data or frame data constituting moving image data sent from the image acquisition device 3.
  • The sensor data 112 is data acquired from the various sensors 4. Details of the marker color data 113, the determination target area data 114, the determination result table 115, and the determination result pattern/work content correspondence information 116 will be described later.
  • The data acquisition unit 120 acquires (receives) data sent from other devices. As shown in the figure, the data acquisition unit 120 includes an image data acquisition unit 121 and a sensor data acquisition unit 122. Of these, the image data acquisition unit 121 acquires the image data 111 from the image acquisition device 3, and the sensor data acquisition unit 122 acquires the sensor data 112 from the various sensors 4.
  • The preparation setting processing unit 130 performs processing related to setting the various information used when the work monitoring device 100 monitors the work of the worker 2. As shown in the figure, the preparation setting processing unit 130 includes a marker color setting processing unit 131, a determination target area setting processing unit 132, and a correspondence information setting processing unit 133.
  • The marker color setting processing unit 131 sets the colors of markers, which are provided on the body of the worker 2, on work objects, on tools used for the work, and the like (hereinafter referred to as "markers") so that the positions of the worker 2 and the objects can be recognized from the image data 111.
  • Specifically, the marker color setting processing unit 131 acquires the color data (for example, data expressed as RGB values) of the area where a marker is reflected in the image data 111, and generates marker color data 113, which is data associating the identifier of the marker (hereinafter, "marker ID") with its color data.
  • The determination target area setting processing unit 132 performs processing related to setting areas (hereinafter, "determination target areas") in the image data 111.
  • The determination target area setting processing unit 132 receives, for example, the setting of determination target areas from the user. The determination target area setting processing unit 132 may also set determination target areas in the image data automatically.
  • The determination target area setting processing unit 132 generates determination target area data 114, which is data containing information specifying the set determination target areas.
  • The determination target area data 114 contains, for example, information associating an identifier of each determination target area (hereinafter, "determination target area ID") with information indicating the area in the coordinate system of the image data.
  • The correspondence information setting processing unit 133 associates a pattern of the determination result table 115 (hereinafter, "determination result pattern"), which is data containing information indicating the determination results, with information specifying the work content corresponding to that determination result pattern (hereinafter, "work content information"), and registers them as the determination result pattern/work content correspondence information 116.
  • The work content information includes information indicating whether the work of the determination result pattern is normal work or abnormal work (hereinafter, "normal/abnormal classification").
  • Details of the determination result table 115 will be described later.
  • The determination result table generation unit 140 generates the determination result table 115 based on the image data 111. As shown in the figure, the determination result table generation unit 140 includes a marker position detection processing unit 141 and a determination result table generation processing unit 142.
  • The marker position detection processing unit 141 detects the position where each marker is reflected in the image data 111 (in this example, the position of the center of gravity of the region where the marker is reflected) using the marker color data 113.
  • The determination result table generation processing unit 142 generates the determination result table 115 by comparing the detected marker positions with the determination target area data 114.
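  • As a rough illustration of this comparison, the following Python sketch builds a determination result table from detected marker positions and rectangular determination target areas. All names and data representations here are assumptions made for illustration; the patent does not disclose an implementation.

```python
from typing import Dict, Optional, Tuple

# Assumed representations: a marker position is an (x, y) centroid in image
# coordinates (None when the marker is not visible in the frame); a
# determination target area is an axis-aligned rectangle (x0, y0, x1, y1).
Point = Tuple[float, float]
Rect = Tuple[float, float, float, float]

def in_area(pos: Point, area: Rect) -> bool:
    x, y = pos
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1

def build_determination_result_table(
    marker_positions: Dict[str, Optional[Point]],  # marker ID -> centroid or None
    areas: Dict[str, Rect],                        # determination target area ID -> rectangle
) -> Dict[str, Dict[str, bool]]:
    """Return {marker ID: {"Visible": bool, <area ID>: bool, ...}}."""
    table = {}
    for marker_id, pos in marker_positions.items():
        row = {"Visible": pos is not None}
        for area_id, area in areas.items():
            row[area_id] = pos is not None and in_area(pos, area)
        table[marker_id] = row
    return table
```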
  • FIG. 4A shows an example of image data 111.
  • In the image data 111, a plurality of markers 42(1) to 42(4), provided on objects 41(1) to 41(4) respectively, are shown. In addition, determination target areas 43(1) to 43(3) are set.
  • FIG. 4B is a diagram drawn by extracting the positions of the markers 42(1) to (4) and the determination target areas 43(1) to (3) shown in the image data 111 of FIG. 4A.
  • In this example, the determination target areas 43(1) to (3) are rectangular frames, but other shapes may be used.
  • FIG. 4C is a determination result table 115 generated based on the image data 111 shown in FIG. 4A.
  • The determination result table 115 is a two-dimensional table in which marker IDs are arranged in the row direction and determination target area IDs in the column direction. As shown in the figure, an item indicating whether each marker is visible (whether the marker is reflected in the image data 111), hereinafter "Visible", is also provided in the column direction.
  • The determination result pattern described above has the same structure as the determination result table 115.
  • For example, the object whose marker ID is "Object 1" is reflected in the image data 111, so its "Visible" value is set to "True"; its position is inside determination target area "Area 1", so that item is set to "True", and it is outside "Area 2" and "Area 3", so those items are set to "False".
  • Similarly, "Object 2" is visible and inside "Area 1" and "Area 2" but outside "Area 3"; "Object 3" is visible, outside "Area 1" and "Area 2", and inside "Area 3"; and "Object 4" is visible but outside all three areas. In tabular form, the determination result table 115 of FIG. 4C is:

              Visible   Area 1   Area 2   Area 3
    Object 1  True      True     False    False
    Object 2  True      True     True     False
    Object 3  True      False    False    True
    Object 4  True      False    False    False

  • The contents of the determination result table 115 shown above represent the characteristics of an individual piece of work performed by the worker 2. Therefore, by preparing in advance determination result tables 115 based on image data 111 acquired for each type of work (normal or abnormal) performed by the worker 2, the work performed by the worker 2 can be easily and accurately identified from newly acquired image data 111 showing that work.
  • FIG. 5 shows examples of determination result patterns (determination result tables 115) generated for the individual works performed by the worker 2 (three cases, shown as (a) to (c) in the figure). Because the determination result table 115 has the "Visible" item and thus includes information on whether each marker is reflected in the image data 111, the work of the worker 2 can be identified with high accuracy. Furthermore, since the positions of the markers appearing in the image data 111 can be identified automatically by the information processing device, the user can easily set the determination target areas in the image data 111 and easily produce the determination result table 115. In this way, the work monitoring device 100 of the present embodiment makes it easy to realize a mechanism for monitoring the work of the worker 2 at each site where different work is performed.
  • The monitoring processing unit 150 monitors the work of the worker 2 based on the image data 111 sent from the image acquisition device 3. As shown in the figure, the monitoring processing unit 150 includes a determination result table acquisition unit 151 and an abnormality presence/absence determination processing unit 152.
  • The determination result table acquisition unit 151 acquires the determination result table 115 generated based on the image data 111 from the determination result table generation unit 140.
  • The abnormality presence/absence determination processing unit 152 determines whether the work of the worker 2 is normal or abnormal based on the determination result table 115, and outputs the determination result.
  • For example, if the determination result table 115 generated based on the image data 111 matches any of the determination result patterns registered in the determination result pattern/work content correspondence information 116 whose normal/abnormal classification is "normal", the abnormality presence/absence determination processing unit 152 determines that the work of the worker 2 is normal; if it matches none of them, the work of the worker 2 is determined to be abnormal.
  • Conversely, if the determination result table 115 generated based on the image data 111 matches any of the determination result patterns registered in the determination result pattern/work content correspondence information 116 whose normal/abnormal classification is "abnormal", the work of the worker 2 is determined to be abnormal; if it matches none of them, the work is determined to be normal. Note that a match between the determination result table 115 and a determination result pattern need not be an exact match; for example, the two may be regarded as matching if the degree of mismatch is within a preset allowable range.
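  • A minimal sketch of such a tolerant comparison follows. The representation of tables and patterns matches the sketch shown earlier, and expressing the allowable range as a maximum number of mismatching cells is an illustrative assumption.

```python
def tables_match(table, pattern, max_mismatches=0):
    """A determination result table matches a determination result pattern
    when at most max_mismatches cells differ (an assumed form of the
    "preset allowable range" mentioned in the text)."""
    mismatches = 0
    for marker_id, row in pattern.items():
        for column, expected in row.items():
            if table.get(marker_id, {}).get(column) != expected:
                mismatches += 1
    return mismatches <= max_mismatches

def work_is_normal(table, correspondence_info, max_mismatches=0):
    """correspondence_info: iterable of (pattern, work_content, classification)
    tuples, where classification is "normal" or "abnormal". The work is
    judged normal when the table matches any "normal" pattern."""
    return any(
        tables_match(table, pattern, max_mismatches)
        for pattern, _work, classification in correspondence_info
        if classification == "normal"
    )
```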
  • FIG. 6 is a flowchart illustrating a process (hereinafter, referred to as “main process S600”) performed by the work monitoring device 100 when monitoring the work of the worker 2 at the site.
  • First, the preparation setting processing unit 130 sets the information necessary for monitoring the work of the worker 2 (marker color setting process S611, determination target area setting process S612, and correspondence information registration process S613). Details of each of these processes will be described later.
  • Subsequently, the monitoring processing unit 150 performs the process of monitoring the work of the worker 2 (work monitoring process S614).
  • FIG. 7 is a flowchart illustrating the details of the marker color setting process S611 of FIG.
  • First, the image data acquisition unit 121 acquires the image data 111 from the image acquisition device 3 (S711).
  • Subsequently, the marker color setting processing unit 131 acquires the color data (RGB data) of the marker area reflected in the image data 111 (S712).
  • When a plurality of markers are reflected, the marker color setting processing unit 131 acquires the color data of each marker's area.
  • Subsequently, the marker color setting processing unit 131 converts the acquired color data into data in a predetermined color space (HSV (Hue Saturation Value), HSB (Hue Saturation Brightness), etc.) (S713).
  • The conversion to a color space is not mandatory; whether to perform it, and which color space to select, may be decided according to, for example, the required monitoring accuracy (the accuracy of determining whether the work of the worker 2 is normal or abnormal).
  • Subsequently, the marker color setting processing unit 131 associates the converted data with the marker ID and stores it as the marker color data 113 (S714).
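  • The following sketch illustrates S712 to S714 using OpenCV. The rectangle around the marker, the function name, and the storage format are assumptions made for illustration.

```python
import cv2
import numpy as np

def register_marker_color(image_bgr, marker_region, marker_id, marker_color_data,
                          convert_to_hsv=True):
    """Average the color of the region where a marker is reflected (S712),
    optionally convert it to the HSV color space (S713), and store it under
    the marker ID (S714). marker_region is an assumed (x0, y0, x1, y1)
    rectangle drawn around the marker in the image."""
    x0, y0, x1, y1 = marker_region
    patch = image_bgr[y0:y1, x0:x1]
    mean_bgr = patch.reshape(-1, 3).mean(axis=0).astype(np.uint8)
    if convert_to_hsv:  # the conversion is optional, as noted above
        color = cv2.cvtColor(mean_bgr.reshape(1, 1, 3), cv2.COLOR_BGR2HSV).reshape(3)
    else:
        color = mean_bgr
    marker_color_data[marker_id] = color  # marker ID -> color data (marker color data 113)
```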
  • FIG. 8 is a flowchart illustrating the details of the determination target area setting process S612 of FIG.
  • First, the image data acquisition unit 121 acquires the image data 111 from the image acquisition device 3 (S811).
  • Subsequently, the determination target area setting processing unit 132 sets one or more determination target areas in the acquired image data 111 (S812).
  • The determination target area setting processing unit 132 sets the determination target areas by, for example, interactive processing with the user.
  • In this case, the user sets the determination target areas appropriately so as to secure the required monitoring accuracy, for example based on the relationship between past determination target area settings and the resulting determination accuracy.
  • Alternatively, the determination target area setting processing unit 132 may set the determination target areas automatically so that the required monitoring accuracy is secured, for example based on the relationship between past determination target area settings and the resulting determination accuracy.
  • Subsequently, the determination target area setting processing unit 132 stores the set determination target areas as the determination target area data 114 (S813).
  • FIG. 9 is a flowchart illustrating the details of the corresponding information registration process S613 of FIG.
  • First, the image data acquisition unit 121 acquires the image data 111 from the image acquisition device 3 (S911).
  • Subsequently, the correspondence information setting processing unit 133 sends the acquired image data 111 to the determination result table generation unit 140.
  • The determination result table generation unit 140 performs the process of generating the determination result table 115 based on the sent image data 111 (hereinafter, "determination result table generation process S912") and returns the generated determination result table 115 to the correspondence information setting processing unit 133.
  • The details of the determination result table generation process S912 will be described later.
  • Subsequently, the correspondence information setting processing unit 133 accepts the input (setting) of the work content information for the image data 111 from the user (S913).
  • Subsequently, the correspondence information setting processing unit 133 associates the determination result table 115 sent from the determination result table generation unit 140 with the received work content information and registers them in the determination result pattern/work content correspondence information 116 (S914).
  • FIG. 10 is a flowchart illustrating the details of the determination result table generation process S912 of FIG.
  • First, the marker position detection processing unit 141 performs the process of detecting the position of each marker included in the image data 111 (hereinafter, "marker position detection process S1011"). The details of the marker position detection process S1011 will be described later.
  • Subsequently, the determination result table generation processing unit 142 acquires the determination target area data 114 (S1012).
  • Subsequently, the determination result table generation processing unit 142 generates the determination result table 115 by comparing the detected marker positions with the acquired determination target area data 114 (S1013).
  • FIG. 11 is a flowchart illustrating the details of the marker position detection process S1011 of FIG. 10.
  • First, the marker position detection processing unit 141 acquires the marker color data 113 (S1111).
  • When marker color data 113 exists for a plurality of markers, the marker position detection processing unit 141 acquires all of the marker color data 113.
  • Subsequently, the marker position detection processing unit 141 detects, from the image data 111, the region corresponding to the marker color data 113 (a region whose color matches the marker color data or whose difference from it is within a preset threshold; hereinafter, "marker region") (S1112). When the marker color data 113 of a plurality of markers has been acquired, a marker region is detected for each marker color data.
  • Subsequently, the marker position detection processing unit 141 removes noise information from the detected marker regions using, for example, a known noise removal technique (image processing, morphological operations, etc.) (S1113).
  • Subsequently, the marker position detection processing unit 141 identifies the position of the center of gravity (centroid coordinates) of each marker region (S1114).
  • Subsequently, the marker position detection processing unit 141 checks whether there are a plurality of marker regions corresponding to the same marker color data, and if so, selects one of them (S1115). For example, the marker position detection processing unit 141 selects the marker region whose color is closest to the marker color data. As another example, the marker position detection processing unit 141 selects the marker region based on whether its position in the image data 111 is within a predetermined range.
  • Subsequently, the marker position detection processing unit 141 sets the center of gravity of the selected marker region as the marker position (S1116).
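  • A sketch of S1112 to S1116 with OpenCV follows. The color tolerance, minimum region size, and the rule of picking the largest region are illustrative assumptions; the text instead suggests selecting by closest color or by expected position.

```python
import cv2
import numpy as np

def detect_marker_position(image_bgr, marker_hsv, tolerance=(10, 60, 60), min_area=50):
    """Detect the centroid of the marker region for one entry of the marker
    color data; returns None when the marker is not visible. Hue wraparound
    near red is ignored in this sketch."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    lower = np.clip(np.array(marker_hsv, int) - tolerance, 0, 255).astype(np.uint8)
    upper = np.clip(np.array(marker_hsv, int) + tolerance, 0, 255).astype(np.uint8)
    mask = cv2.inRange(hsv, lower, upper)                   # S1112: marker region(s)
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # S1113: noise removal
    n, _labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    candidates = [(stats[i, cv2.CC_STAT_AREA], tuple(centroids[i]))  # S1114: centroids
                  for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_area]
    if not candidates:
        return None
    return max(candidates)[1]  # S1115/S1116: pick one region, use its centroid
```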
  • FIG. 12 is a flowchart illustrating the details of the work monitoring process S614 of FIG.
  • First, the image data acquisition unit 121 acquires one or more items of image data 111 from the image acquisition device 3 (S1211).
  • The image data 111 is, for example, a one-frame image of moving image data showing the state of the worker 2.
  • Subsequently, the determination result table acquisition unit 151 passes the acquired image data 111 to the determination result table generation unit 140 and acquires the determination result table 115 from the determination result table generation unit 140 (S1212). Since the process of S1212 is the same as the determination result table generation process S912 described above, its description is omitted.
  • Subsequently, the abnormality presence/absence determination processing unit 152 determines whether the work of the worker 2 is normal based on the acquired determination result table 115 (S1213) and outputs the determination result to the output device 15 (S1214).
  • Subsequently, the monitoring processing unit 150 determines whether the work of the worker 2 has been completed (S1215). If the work is completed (S1215: YES), the work monitoring process S614 ends; if not (S1215: NO), the process returns to S1211. The monitoring processing unit 150 determines whether the work has been completed based on, for example, information input by the user.
  • As described above, with the work monitoring device 100 of the present embodiment, by preparing determination result patterns in advance, it is possible to accurately determine from image data 111 newly showing the work of the worker 2 whether that work is normal or abnormal. A mechanism for monitoring the work of the worker 2 can therefore be realized easily at each site.
  • The determination result table 115 described above contains information about the markers (whether each marker's position is within the determination target areas, and whether the marker is reflected in the image data 111).
  • The determination result table 115 and the determination result patterns may additionally include sensor data acquired from the various sensors 4 at the time the image data 111 was acquired. In that case, the monitoring processing unit 150 compares the determination result table 115 with the determination result pattern/work content correspondence information 116 while also taking the sensor data values into account, in order to identify the work and determine whether there is an abnormality in the work.
  • This allows the characteristics of each work to be expressed in more detail and improves the accuracy of the work determination.
  • FIG. 13 shows an example of the judgment result table 115 in that case.
  • In addition to the items of the determination result table 115 described above, the illustrated determination result table 115 includes items indicating the sensors ("Sensor 1", "Sensor 2") and items in which the sensor data values are set ("Value 1", "Value 2") in the column direction.
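  • A rough sketch of this extension follows; the exact table layout is not fully specified in the text, so the row and column names below are assumptions.

```python
def extend_table_with_sensors(table, sensor_values):
    """Append sensor rows ("Sensor 1", "Sensor 2", ...) whose value columns
    ("Value 1", "Value 2", ...) hold the readings taken from the various
    sensors 4 when the image data was acquired (cf. FIG. 13)."""
    extended = {row_id: dict(row) for row_id, row in table.items()}
    for sensor_id, values in sensor_values.items():
        extended[sensor_id] = {f"Value {i + 1}": v for i, v in enumerate(values)}
    return extended

# Hypothetical usage: readings captured alongside one frame of image data.
# extended = extend_table_with_sensors(table, {"Sensor 1": [22.5], "Sensor 2": [0.3]})
```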
  • The work monitoring device 100 of the second embodiment uses the work monitoring mechanism of the first embodiment to acquire the time the worker 2 spends on a work (hereinafter, "work time"), and monitors the work performed by the worker 2 (determines whether the work is normal or abnormal) by comparing the acquired work time with a preset standard work time.
  • In the following, the parts that differ from the first embodiment will be mainly described.
  • FIG. 14 shows the main functions of the work monitoring device 100 of the second embodiment.
  • The storage unit 110 of the work monitoring device 100 of the second embodiment stores, in addition to the information stored by the storage unit 110 of the work monitoring device 100 of the first embodiment, a per-work standard work time 117, which is data containing information indicating the standard work time of each work performed by the worker 2.
  • The abnormality presence/absence determination processing unit 152 of the work monitoring device 100 of the second embodiment has a function of determining whether the work performed by the worker 2 is normal or abnormal by comparing the standard work time with the work time.
  • FIG. 15 is a flowchart illustrating the work monitoring process S614 in the second embodiment. Since the other processes in the second embodiment (the other processes described in the first embodiment performed by the work monitoring device 100) are the same as those in the first embodiment, the description thereof will be omitted.
  • First, the monitoring processing unit 150 initializes the working variables "work time" and "previous work" used in the following processing (S1511).
  • In "work time", the cumulative time during which the worker 2 has been performing a certain work is set.
  • In "previous work", information indicating the work identified in the previous iteration of the loop from S1512 (for example, the above-mentioned work content information) is set.
  • Subsequently, the image data acquisition unit 121 acquires one or more items of image data 111 from the image acquisition device 3 (S1512).
  • The image data 111 is a one-frame image of moving image data showing the state of the worker 2.
  • One-frame images of the moving image data are selected sequentially in chronological order.
  • Subsequently, the determination result table acquisition unit 151 passes the acquired image data 111 to the determination result table generation unit 140 and acquires the determination result table 115 from the determination result table generation unit 140 (S1513). Since the process of S1513 is the same as the determination result table generation process S912 described in the first embodiment, its description is omitted.
  • Subsequently, the abnormality presence/absence determination processing unit 152 identifies the work performed by the worker 2 by comparing the acquired determination result table 115 with the determination result pattern/work content correspondence information 116 (S1514).
  • Subsequently, the abnormality presence/absence determination processing unit 152 determines whether the identified work is the same as the "previous work" (S1515). If the identified work is the same as the "previous work" (S1515: YES), the process proceeds to S1516. In the first loop, the process of S1515 is skipped and the process proceeds to S1516.
  • In S1516, the abnormality presence/absence determination processing unit 152 updates the "work time" (for example, by adding the time elapsed since the previous update).
  • Subsequently, the abnormality presence/absence determination processing unit 152 sets the work identified in S1514 (for example, its work content information) as the "previous work". After that, the process returns to S1512.
  • If the identified work is not the same as the "previous work" (S1515: NO), the abnormality presence/absence determination processing unit 152 acquires the standard work time of the work identified in S1514, compares the "work time" with the standard work time, and determines whether the work performed by the worker 2 is abnormal. For example, if the difference between the "work time" and the standard work time is within a predetermined time, the abnormality presence/absence determination processing unit 152 determines that the work performed by the worker 2 is normal, and if the difference exceeds the predetermined time, the work is determined to be abnormal. By setting the predetermined time appropriately, the determination can take into account time the worker 2 spends on things other than the work itself, such as break time.
  • Subsequently, the abnormality presence/absence determination processing unit 152 outputs the determination result to the output device 15.
  • As described above, by using the work monitoring mechanism of the worker 2 in the first embodiment, it is possible to realize a mechanism that monitors whether the work performed by the worker 2 is normal or abnormal based on the work time of that work.
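  • The loop of FIG. 15 might be sketched as follows. The frame format, the identify_work helper, and the allowance are all assumptions made for illustration.

```python
def monitor_work_time(frames, identify_work, standard_work_times, allowance=60.0):
    """frames: iterable of (timestamp_seconds, image_data); identify_work maps
    a frame to a work name via the determination result table;
    standard_work_times maps each work name to its standard work time in seconds."""
    previous_work, work_time, last_t = None, 0.0, None     # S1511: initialize
    for t, image in frames:                                # S1512: acquire image data
        work = identify_work(image)                        # S1513-S1514: identify work
        if previous_work is None or work == previous_work:
            if last_t is not None:
                work_time += t - last_t                    # S1516: update "work time"
        else:
            # The work changed: compare the accumulated work time of the
            # finished work with its standard work time.
            diff = abs(work_time - standard_work_times[previous_work])
            print(previous_work, "normal" if diff <= allowance else "abnormal")
            work_time = 0.0
        previous_work, last_t = work, t                    # set "previous work"
```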
  • The work monitoring device 100 of the third embodiment uses the work monitoring mechanism of the first embodiment to monitor whether the worker 2 is performing the work according to the correct work order.
  • In the following, the parts that differ from the first embodiment will be mainly described.
  • FIG. 16 shows the main functions of the work monitoring device 100 of the third embodiment.
  • The storage unit 110 of the work monitoring device 100 of the third embodiment stores, in addition to the information stored by the storage unit 110 of the work monitoring device 100 of the first embodiment, work order information 118, which is data containing information indicating the correct work order.
  • The abnormality presence/absence determination processing unit 152 of the work monitoring device 100 of the third embodiment has a function of determining whether the worker 2 is performing the work according to the preset correct work order.
  • FIG. 17 is a flowchart illustrating the work monitoring process S614 according to the third embodiment. Since the other processes in the third embodiment (the other processes described in the first embodiment performed by the work monitoring device 100) are the same as those in the first embodiment, the description thereof will be omitted.
  • First, the monitoring processing unit 150 initializes the working variable "previous work" used in the following processing (S1711).
  • In "previous work", information indicating the work identified in the previous iteration of the loop from S1712 (for example, the above-mentioned work content information) is set.
  • Subsequently, the image data acquisition unit 121 acquires one or more items of image data 111 from the image acquisition device 3 (S1712).
  • The image data 111 is a one-frame image of moving image data showing the state of the worker 2.
  • One-frame images of the moving image data are selected sequentially in chronological order.
  • Subsequently, the determination result table acquisition unit 151 passes the acquired image data 111 to the determination result table generation unit 140 and acquires the determination result table 115 from the determination result table generation unit 140 (S1713). Since the process of S1713 is the same as the determination result table generation process S912 described in the first embodiment, its description is omitted.
  • Subsequently, the abnormality presence/absence determination processing unit 152 identifies the work performed by the worker 2 by comparing the acquired determination result table 115 with the determination result pattern/work content correspondence information 116 (S1714).
  • Subsequently, the abnormality presence/absence determination processing unit 152 compares the identified work, the "previous work", and the work order information 118 to determine whether the identified work follows the work order in the work order information 118 (S1715). In the first loop, the process of S1715 is skipped and the process proceeds to S1716. If the identified work follows the work order in the work order information 118 (S1715: YES), the process proceeds to S1716. If the identified work does not follow the work order in the work order information 118 (S1715: NO), the process proceeds to S1717.
  • In S1716, the abnormality presence/absence determination processing unit 152 sets the work identified in S1714 (for example, its work content information) as the "previous work". After that, the process returns to S1712.
  • In S1717, the abnormality presence/absence determination processing unit 152 outputs the determination result (information indicating that the work does not follow the correct work order) to the output device 15.
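  • The S1715 check might look as follows; representing the work order information 118 as an ordered list of work names is an assumption, as is treating continuation of the same work as valid.

```python
def follows_work_order(previous_work, current_work, work_order):
    """Return True when current_work is a valid next step after previous_work
    given work_order, a list of work names in the correct order."""
    if previous_work is None or current_work == previous_work:
        return True
    try:
        return work_order.index(current_work) == work_order.index(previous_work) + 1
    except ValueError:  # a work that is not registered in the work order
        return False

# Hypothetical usage:
# follows_work_order("tighten screws", "attach cover",
#                    ["tighten screws", "attach cover", "inspect"])  # -> True
```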
  • The work monitoring device 100 of the fourth embodiment uses the work monitoring mechanism of the first embodiment to identify, among the periods during which the worker 2 is working, the period during which the work to be analyzed is performed.
  • In the following, the parts that differ from the first embodiment will be mainly described.
  • FIG. 18 shows the main functions of the work monitoring device 100 of the fourth embodiment.
  • The storage unit 110 of the work monitoring device 100 of the fourth embodiment stores, in addition to the information stored by the storage unit 110 of the work monitoring device 100 of the first embodiment, analysis target work information 119, which is data containing information indicating the work to be analyzed (for example, the above-mentioned work content information).
  • The abnormality presence/absence determination processing unit 152 of the work monitoring device 100 of the fourth embodiment has a function of identifying, among the periods during which the worker 2 is working, the period during which the work to be analyzed is performed.
  • FIG. 19 is a flowchart illustrating the work monitoring process S614 according to the fourth embodiment. Since the other processes in the fourth embodiment (the other processes described in the first embodiment performed by the work monitoring device 100) are the same as those in the first embodiment, the description thereof will be omitted.
  • First, the image data acquisition unit 121 acquires one or more items of image data 111 from the image acquisition device 3 (S1911).
  • The image data 111 is a one-frame image of moving image data showing the state of the worker 2.
  • One-frame images of the moving image data are selected sequentially in chronological order.
  • Subsequently, the determination result table acquisition unit 151 passes the acquired image data 111 to the determination result table generation unit 140 and acquires the determination result table 115 from the determination result table generation unit 140 (S1912). Since the process of S1912 is the same as the determination result table generation process S912 described in the first embodiment, its description is omitted.
  • Subsequently, the abnormality presence/absence determination processing unit 152 identifies the work performed by the worker 2 by comparing the acquired determination result table 115 with the determination result pattern/work content correspondence information 116 (S1913).
  • Subsequently, the abnormality presence/absence determination processing unit 152 compares the identified work with the analysis target work information 119 and determines whether the identified work is the analysis target (S1914). If the identified work is the analysis target (S1914: YES), the process proceeds to S1915. If the identified work is not the analysis target (S1914: NO), the process returns to S1911.
  • In S1915, the abnormality presence/absence determination processing unit 152 outputs, to the output device 15, information indicating that the worker 2 is performing the work to be analyzed at the timing of the image data.
  • In this way, when the worker 2 performs a plurality of works in succession, information indicating the period during which each work was performed can be output, and the work performed by the worker 2 can be analyzed.
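  • The FIG. 19 loop might be sketched as follows to collect such periods. The frame format, the identify_work helper, and the period representation are assumptions made for illustration.

```python
def find_analysis_periods(frames, identify_work, analysis_targets):
    """Return [(start, end, work), ...] periods during which an analysis-target
    work was performed. frames: iterable of (timestamp, image_data);
    analysis_targets: set of work names (the analysis target work information 119)."""
    periods, current = [], None          # current = [start, end, work]
    for t, image in frames:              # S1911-S1913: acquire frame, identify work
        work = identify_work(image)
        if work in analysis_targets:     # S1914: is it an analysis target?
            if current and current[2] == work:
                current[1] = t           # extend the ongoing period
            else:
                if current:
                    periods.append(tuple(current))
                current = [t, t, work]   # S1915: the target work starts here
        elif current:
            periods.append(tuple(current))
            current = None
    if current:
        periods.append(tuple(current))
    return periods
```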
  • A person who intends to perform analysis (hereinafter, "analyst") can thereby easily specify the period to be analyzed.
  • FIG. 20 shows examples of the periods that the analyst analyzes.
  • For example, the analyst can use the above mechanism to specify the start point and the end point of the work to be analyzed.
  • The analyst can also use the above mechanism to specify only the start point to be analyzed and take a predetermined time from that start point as the period to be analyzed.
  • Likewise, the analyst can use the above mechanism to specify only the end point to be analyzed and take the period of a predetermined time leading up to that end point as the period to be analyzed.
  • The work monitoring process S614 described in each of the above embodiments may be executed on moving image data acquired in real time, or executed after the fact on recorded moving image data.
  • Each of the above configurations, functional units, processing units, processing means, and the like may be realized in hardware by designing some or all of them as, for example, integrated circuits.
  • Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing programs that realize the respective functions.
  • Information such as the programs, tables, and files that realize each function can be placed in a memory, a recording device such as a hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.
  • Control lines and information lines are shown where they are considered necessary for the explanation, and not all control lines and information lines in an implementation are necessarily shown. In practice, almost all configurations may be considered interconnected.
  • The arrangement of the various functional units, processing units, and databases of each information processing device described above is only an example.
  • The arrangement of the various functional units, processing units, and databases can be changed to the optimum arrangement from the viewpoint of the performance, processing efficiency, communication efficiency, and the like of the hardware and software included in these devices.
  • The configuration (schema, etc.) of the databases that store the various data described above can be changed flexibly from the viewpoints of efficient use of resources, improvement of processing efficiency, improvement of access efficiency, improvement of search efficiency, and the like.
  • In the above embodiments, the position of an object may also be detected using a sensor or an image recognition method that can automatically detect the body of the worker 2, or using a sensor or an image recognition method that can automatically detect the work object or the tool used for the work. In such cases, the same processing as in the above embodiments is performed using the automatically detected position information of the body, the work object, and the like.

Abstract

The present invention monitors work conducted by a person with high accuracy by an object recognition technique without preparing large-volume data in advance. A work monitoring device acquires image data of a state in which a person conducts work, acquires positions in the image data of each of a plurality of objects in the image data, generates a determination result table including information which indicates a result of determination of whether one or more determination subject areas as areas set for the image data respectively include the positions of the plurality of objects or not, stores determination result pattern/work content correspondence information as information acquired by associating the determination result table generated on the basis of the image data and work content information as information which indicates the work conducted by the person in the image data, generates a determination result table for image data newly acquired, and specifies work conducted by the person by comparing the generated determination result table and the determination result pattern/work content correspondence information, thereby monitoring the work conducted by the person.

Description

Work monitoring device and work monitoring method
 The present invention relates to a work monitoring device and a work monitoring method.
 This application claims priority based on Japanese Patent Application No. 2019-103551 filed on June 3, 2019, the entire disclosure of which is incorporated herein by reference.
 Patent Document 1 describes a motion recognition method for a moving body by which a robot recognizes the posture, motion, or behavior of a person. In this method, eigenspace data A, in which the frame image data A for each basic motion of a moving body A is represented by points, is created in advance and stored in a database, and eigenspace data B, in which the frame image data B of a target moving body B is represented by points, is compared with the eigenspace data A for each basic motion; the eigenspace data A closest to the eigenspace data B is selected, and the motion of the moving body B is thereby recognized. Each frame image data A for each basic motion is obtained by having the moving body A perform the basic motion, photographing the moving body A from multiple directions using a plurality of image input means, and, for each image input means, weighting the consecutively acquired frame images and superimposing them into a compressed image.
 Patent Document 2 describes a motion recognition system that recognizes the shape and motion of a hand in a user interface based on human gestures. The motion recognition system recognizes the shape and motion of a specific target by processing time-series image data containing images of the target: it extracts moving regions from the time-series image data, extracts regions containing a color that characterizes the target, and extracts as the target region a region that is both a moving region and a region containing the characteristic color.
 Non-Patent Document 1 describes various behavior recognition technologies that utilize sensors and cameras. Non-Patent Document 2 describes performing motion recognition using "Zero Shot Learning", which predicts labels of data that do not appear in the training data.
Patent Document 1: JP-A-2009-176059. Patent Document 2: JP-A-2001-16606.
 At product manufacturing sites, mechanisms are being introduced that monitor whether workers are working according to the correct work procedure by recognizing the workers' movements, for purposes such as quality control and ensuring worker safety. In recent years, technology using machine learning has attracted attention, and research and development of motion recognition technology using publicly available databases (such as the "UCF101 - Action Recognition Data Set") is being actively carried out.
 However, since the work performed by workers differs from site to site, when a work monitoring mechanism using motion recognition technology is introduced at a site, data on the various tasks performed by the workers must be collected for each site. Even when a database such as the above is used, data must still be collected at each site for unregistered work.
 例えば、特許文献1に記載の技術は、予め基本動作ごとのフレーム画像で固有空間データを生成し、固有空間が持つ動作の特徴から距離を比較して動作認識を行うものであり、認識対象となる基本動作について事前に大量のデータを用意しておく必要があり、導入時に現場の作業者に与える負荷が大きい。 For example, the technique described in Patent Document 1 generates eigenspace data in advance with a frame image for each basic operation, compares the distance from the characteristics of the operation of the eigenspace, and performs motion recognition. It is necessary to prepare a large amount of data in advance for the basic operation, which puts a heavy load on the workers at the site at the time of introduction.
 一方、特許文献2に記載の技術は、時系列画像データを用いて動き領域と動作認識対象を特徴づける色を含む領域とに基づき対象領域を抽出し、形状を解析することにより動作認識を行うものであり、事前にデータを必要としない方式である。しかし同技術では形状に基づき動作を認識するため、作業している手の位置や周辺の情報については正確に認識することができない。 On the other hand, in the technique described in Patent Document 2, motion recognition is performed by extracting a target region based on a motion region and a region containing a color that characterizes the motion recognition target using time-series image data and analyzing the shape. It is a method that does not require data in advance. However, since the same technology recognizes the movement based on the shape, it is not possible to accurately recognize the position of the working hand and the surrounding information.
 また非特許文献2に記載の技術では、学習データに出現していないデータのラベルを予測するが、認識対象を予測するのに必要なデータは現場で用意する必要がある。 In addition, the technique described in Non-Patent Document 2 predicts labels of data that do not appear in the learning data, but it is necessary to prepare the data necessary for predicting the recognition target at the site.
 本発明はこうした背景に基づきなされたものであり、事前に大量のデータを準備することなく、作業者が行う作業を精度よく監視することが可能な、作業監視装置、及び作業監視方法を提供することを目的とする。 The present invention has been made based on such a background, and provides a work monitoring device and a work monitoring method capable of accurately monitoring the work performed by an operator without preparing a large amount of data in advance. The purpose is.
 One aspect of the present invention for achieving the above object is a work monitoring device configured using an information processing device having a processor and a memory, comprising: an image data acquisition unit that acquires image data showing a worker performing work; an object position detection unit that acquires information indicating the position, in the image data, of each of a plurality of objects appearing in the image data; a determination result table generation unit that generates a determination result table containing information indicating the result of determining whether the position of each of the plurality of objects is included in each of one or more determination target areas set in the image data; a storage unit that stores determination result pattern / work content correspondence information, which associates the determination result table generated based on the image data with work content information indicating the work shown in that image data; and a monitoring processing unit that generates the determination result table for newly acquired image data and identifies the work being performed by the worker by comparing the generated determination result table against the determination result pattern / work content correspondence information.
 Other problems disclosed by the present application and solutions thereto will be clarified in the Description of Embodiments section and the drawings.
 According to the present invention, the work performed by a worker can be accurately monitored without preparing a large amount of data in advance.
Fig. 1 is a diagram showing a schematic configuration of the work monitoring system.
Fig. 2 is a diagram showing a hardware configuration example of the work monitoring device.
Fig. 3 is a diagram showing the main functions of the work monitoring device.
Fig. 4(a) is an example of image data, Fig. 4(b) is a diagram in which the positions of the markers appearing in the image data and the determination target areas are extracted and drawn, and Fig. 4(c) is a determination result table generated based on the image data.
Figs. 5(a) to 5(c) are each examples of image data and the determination result patterns corresponding to that image data.
Fig. 6 is a flowchart explaining the main process.
Fig. 7 is a flowchart explaining the marker color setting process.
Fig. 8 is a flowchart explaining the determination target area setting process.
Fig. 9 is a flowchart explaining the correspondence information registration process.
Fig. 10 is a flowchart explaining the determination result table generation process.
Fig. 11 is a flowchart explaining the marker position detection process.
Fig. 12 is a flowchart explaining the work monitoring process.
Fig. 13 is another example of the determination result table.
Fig. 14 is a diagram showing the main functions of the work monitoring device of the second embodiment.
Fig. 15 is a flowchart explaining the work monitoring process of the second embodiment.
Fig. 16 is a diagram showing the main functions of the work monitoring device of the third embodiment.
Fig. 17 is a flowchart explaining the work monitoring process of the third embodiment.
Fig. 18 is a diagram showing the main functions of the work monitoring device of the fourth embodiment.
Fig. 19 is a flowchart explaining the work monitoring process of the fourth embodiment.
Fig. 20 is a diagram showing an example of the period to be analyzed.
 Hereinafter, embodiments will be described with reference to the drawings. In the following description, identical or similar configurations may be given the same reference numerals and duplicate descriptions may be omitted. Also, when configurations of the same type need to be distinguished, an identifier (a number, a letter, etc.) may be written in parentheses after the reference numeral that collectively denotes those configurations.
[First Embodiment]
 Fig. 1 shows a schematic configuration of the work monitoring system 1 described as the first embodiment. As shown in the figure, the work monitoring system 1 includes an image acquisition device 3, various sensors 4, and a work monitoring device 100, which is an information processing device. The image acquisition device 3, the various sensors 4, and the work monitoring device 100 are communicably connected via wired or wireless communication means 5. The configuration of the communication means 5 is not particularly limited; examples include communication means compliant with various communication standards such as USB (Universal Serial Bus) and RS-232C, a LAN (Local Area Network), a WAN (Wide Area Network), the Internet, and a dedicated line.
 The image acquisition device 3 (imaging device) is a device that acquires image data showing the worker 2 and the worker 2's surroundings, and is, for example, a camera that acquires (captures) moving-image or still-image data, such as a digital camera (RGB camera), an infrared camera, a thermography camera, a time-of-flight (TOF) camera, or a stereo camera.
 The various sensors 4 are provided in the work environment in which the worker 2 works, and output physical information about the worker 2 and the work environment. The various sensors 4 are, for example, a motion detection sensor, a human presence sensor, a temperature sensor, a humidity sensor, an acceleration sensor, a speed sensor, an acoustic sensor (microphone), an ultrasonic sensor, a vibration sensor, a millimeter wave radar, a laser radar (LIDAR: Laser Imaging Detection and Ranging), and an infrared depth sensor.
 The work monitoring device 100 performs processing related to monitoring the work of the worker 2 based on the image data acquired by the image acquisition device 3.
 Fig. 2 shows a hardware configuration example of the work monitoring device 100. The work monitoring device 100 is an information processing device (computer) and includes a processor 11, a main storage device 12, an auxiliary storage device 13, an input device 14, an output device 15, and a communication device 16.
 The processor 11 is a device that performs arithmetic processing, such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), or an AI (Artificial Intelligence) chip. The main storage device 12 is a device that stores programs and data, such as a ROM (Read Only Memory) (SRAM (Static Random Access Memory), NVRAM (Non-Volatile RAM), mask ROM, PROM (Programmable ROM), etc.) or a RAM (Random Access Memory) (DRAM (Dynamic Random Access Memory), etc.). The auxiliary storage device 13 is, for example, a hard disk drive, a flash memory, an SSD (Solid State Drive), or an optical storage device (CD (Compact Disc), DVD (Digital Versatile Disc), etc.). Programs and data stored in the auxiliary storage device 13 are read into the main storage device 12 as needed.
 The input device 14 is a user interface that accepts information from the user, such as a keyboard, a mouse, a card reader, or a touch panel. The output device 15 is a user interface that outputs various types of information (display output, audio output, print output, etc.), such as a display device (LCD (Liquid Crystal Display), graphics card, etc.) that visualizes various information, an audio output device (speaker), or a printing device.
 The communication device 16 is a communication interface that communicates with other devices via the communication means 5, such as a NIC (Network Interface Card), a wireless communication module, a USB module, or a serial communication module. The communication device 16 can also function as an input device that receives information from other communicably connected devices, and as an output device that transmits information to other communicably connected devices. The work monitoring device 100 communicates with the image acquisition device 3 and the various sensors 4 via the communication means 5 using the communication device 16.
 The various functions of the work monitoring device 100 are realized by the processor 11 reading and executing programs stored in the main storage device 12, or by hardware constituting the work monitoring device 100 (FPGA, ASIC, AI chip, etc.).
 Fig. 3 shows the main functions of the work monitoring device 100. As shown in the figure, the work monitoring device 100 includes a storage unit 110, a data acquisition unit 120, a preparation setting processing unit 130, a determination result table generation unit 140, and a monitoring processing unit 150. In addition to the above functions, the work monitoring device 100 may further include functions such as an operating system, device drivers, a file system, and a DBMS (DataBase Management System).
 Among the above functions, the storage unit 110 stores image data 111, sensor data 112, marker color data 113, determination target area data 114, a determination result table 115, and determination result pattern / work content correspondence information 116. The storage unit 110 stores this information (data), for example, as database tables provided by the DBMS or as files provided by the file system.
 The image data 111 is data acquired from the image acquisition device 3, for example, still image data or the frame data constituting moving image data sent from the image acquisition device 3. The sensor data 112 is data acquired from the various sensors 4. Details of the marker color data 113, the determination target area data 114, the determination result table 115, and the determination result pattern / work content correspondence information 116 will be described later.
 The data acquisition unit 120 acquires (receives) data sent from other devices. As shown in the figure, the data acquisition unit 120 includes an image data acquisition unit 121 and a sensor data acquisition unit 122. The image data acquisition unit 121 acquires the image data 111 from the image acquisition device 3, and the sensor data acquisition unit 122 acquires the sensor data 112 from the various sensors 4.
 The preparation setting processing unit 130 performs processing related to setting the various information used when the work monitoring device 100 monitors the work of the worker 2. As shown in the figure, the preparation setting processing unit 130 includes a marker color setting processing unit 131, a determination target area setting processing unit 132, and a correspondence information setting processing unit 133.
 Among these, the marker color setting processing unit 131 performs settings, based on the image data 111, related to the colors of markings (hereinafter referred to as "markers") provided on the body of the worker 2, on work objects, on tools used for the work, and the like, in order to recognize the positions of the worker 2 and of objects. For example, the marker color setting processing unit 131 acquires the color data (for example, data expressed as RGB values) of the region of the image data 111 in which a marker appears, and generates marker color data 113, which is data that associates the identifier of that marker (hereinafter referred to as the "marker ID") with that color data.
 The determination target area setting processing unit 132 performs processing related to setting areas (hereinafter referred to as "determination target areas") in the image data 111. The determination target area setting processing unit 132 accepts, for example, the setting of determination target areas from the user; it may also set determination target areas in the image data automatically. The determination target area setting processing unit 132 generates determination target area data 114, which is data containing information specifying the set determination target areas. The determination target area data 114 contains, for example, information associating an identifier of each determination target area (hereinafter referred to as the "determination target area ID") with information indicating the area in the coordinate system of the image data.
 The correspondence information setting processing unit 133 performs processing related to setting the determination result pattern / work content correspondence information 116. This information associates a pattern of the determination result table 115 (hereinafter referred to as a "determination result pattern"), which is data containing information indicating whether each object provided with a marker (an object related to the work, such as a predetermined part of the worker 2's body, a work target, or a tool used by the worker 2) is present (appears) in each determination target area, with information specifying the work content corresponding to that determination result pattern (hereinafter referred to as "work content information"). The work content information includes information indicating whether the work of the determination result pattern is normal work or abnormal work (hereinafter referred to as the "normal/abnormal classification"). Details of the determination result table 115 will be described later.
 The determination result table generation unit 140 generates the determination result table 115 based on the image data 111. As shown in the figure, the determination result table generation unit 140 includes a marker position detection processing unit 141 and a determination result table generation processing unit 142.
 Among these, the marker position detection processing unit 141 uses the marker color data 113 to detect the position at which each marker appears in the image data 111 (in this example, the centroid position of the region in which the marker appears).
 The determination result table generation processing unit 142 generates the determination result table 115 by comparing the detected marker positions with the determination target area data 114.
 Fig. 4(a) shows an example of the image data 111. The illustrated image data 111 shows a plurality of markers 42(1) to 42(4) provided on objects 41(1) to 41(4), respectively. Determination target areas 43(1) to 43(3) are also set.
 Fig. 4(b) is a diagram in which the positions of the markers 42(1) to 42(4) appearing in the image data 111 shown in Fig. 4(a) and the determination target areas 43(1) to 43(3) are extracted and drawn. In this example, the determination target areas 43(1) to 43(3) are rectangular frames, but other shapes may be used.
 Fig. 4(c) is the determination result table 115 generated based on the image data 111 shown in Fig. 4(a). As shown in the figure, the determination result table 115 is a two-dimensional table in which marker IDs are set in the row direction and determination target area IDs in the column direction. As also shown in the figure, the column direction further includes an item indicating whether each marker is visible (whether it appears in the image data 111) (hereinafter referred to as "Visible"). The determination result pattern described above has the same structure as the determination result table 115.
 In this example, the object with marker ID "Object 1" appears in the image data 111, so its "Visible" value is set to "True"; its position is inside the area with determination target area ID "Area 1", so that item is set to "True"; it is outside the "Area 2" area, so that item is set to "False"; and it is outside the "Area 3" area, so that item is set to "False".
 For the object with marker ID "Object 2", it appears in the image data 111, so its "Visible" value is set to "True"; it is inside the "Area 1" determination target area, so that item is set to "True"; it is inside the "Area 2" determination target area, so that item is set to "True"; and it is outside the "Area 3" determination target area, so that item is set to "False".
 For the object with marker ID "Object 3", it appears in the image data 111, so its "Visible" value is set to "True"; it is outside the "Area 1" determination target area, so that item is set to "False"; it is outside the "Area 2" determination target area, so that item is set to "False"; and it is inside the "Area 3" determination target area, so that item is set to "True".
 For the object with marker ID "Object 4", it appears in the image data 111, so its "Visible" value is set to "True", and it is outside all of the "Area 1", "Area 2", and "Area 3" determination target areas, so those items are all set to "False".
 For the object with marker ID "Object 5", it does not appear in the image data 111, so its "Visible" value is set to "False", and the "Area 1", "Area 2", and "Area 3" items are all set to "False".
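 For illustration, the determination result table of Fig. 4(c) can be thought of as a simple nested mapping from marker IDs to their "Visible" flag and per-area determination results. The following Python sketch is only an assumed in-memory representation using the identifiers of the example above; the embodiment does not prescribe any particular data format.

```python
# One possible in-memory form of the determination result table of Fig. 4(c):
# one row per marker ID, with a "Visible" flag and one boolean per
# determination target area.
judgement_result_table = {
    "Object 1": {"Visible": True,  "Area 1": True,  "Area 2": False, "Area 3": False},
    "Object 2": {"Visible": True,  "Area 1": True,  "Area 2": True,  "Area 3": False},
    "Object 3": {"Visible": True,  "Area 1": False, "Area 2": False, "Area 3": True},
    "Object 4": {"Visible": True,  "Area 1": False, "Area 2": False, "Area 3": False},
    "Object 5": {"Visible": False, "Area 1": False, "Area 2": False, "Area 3": False},
}
```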
 The contents of the determination result table 115 shown above represent the characteristics of each individual task performed by the worker 2. Therefore, by preparing in advance, as determination result patterns, determination result tables 115 based on image data 111 acquired for each task (normal work or abnormal work) performed by the worker 2, the task the worker 2 is performing can be identified easily and accurately from newly acquired image data 111 showing the worker 2's work.
 Fig. 5 shows examples of determination result patterns (determination result tables 115) generated for individual tasks performed by the worker 2 (three examples, shown as (a) to (c) in the figure). Since the determination result table 115 has the "Visible" item and thus also contains information on whether each marker appears in the image data 111, the work of the worker 2 can be identified with high accuracy. Moreover, since the positions of the markers appearing in the image data 111 can be identified automatically using the information processing device, the user can easily generate the determination result table 115 simply by setting determination target areas in the image data 111. Thus, according to the work monitoring device 100 of the present embodiment, a mechanism for monitoring the work of the worker 2 can be easily realized at individual sites where different work is performed.
 Returning to Fig. 3, the monitoring processing unit 150 monitors the work of the worker 2 based on the image data 111 sent from the image acquisition device 3. As shown in the figure, the monitoring processing unit 150 includes a determination result table acquisition unit 151 and an abnormality presence/absence determination processing unit 152.
 Among these, the determination result table acquisition unit 151 acquires, from the determination result table generation unit 140, the determination result table 115 generated based on the image data 111.
 The abnormality presence/absence determination processing unit 152 determines, based on the determination result table 115, whether the work of the worker 2 is normal or abnormal, and outputs the determination result. For example, if the determination result table 115 generated based on the image data 111 matches any of the determination result patterns registered in the determination result pattern / work content correspondence information 116 whose normal/abnormal classification is set to "normal", the abnormality presence/absence determination processing unit 152 determines that the work of the worker 2 is normal; if it matches none of them, it determines that the work is abnormal. Alternatively, for example, if the determination result table 115 generated based on the image data 111 matches any of the determination result patterns registered in the determination result pattern / work content correspondence information 116 whose normal/abnormal classification is "abnormal", the abnormality presence/absence determination processing unit 152 determines that the work of the worker 2 is abnormal; if it matches none of them, it determines that the work is normal. Note that a match between the determination result table 115 and a determination result pattern does not necessarily mean an exact match; for example, a match may be determined if the degree of mismatch is within a preset allowable range.
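 As a concrete illustration of this matching step, the following Python sketch compares a newly generated table against registered patterns, using the nested-mapping representation assumed above. The helper names (`count_mismatches`, `judge_work`, `pattern_catalog`) and the `max_mismatches` tolerance are illustrative stand-ins for the "preset allowable range" mentioned in the text, not names defined by the embodiment.

```python
def count_mismatches(table_a, table_b):
    """Number of cells that differ between two determination result tables
    (both assumed to share the same marker IDs and column names)."""
    return sum(
        table_a[marker][col] != table_b[marker][col]
        for marker in table_a
        for col in table_a[marker]
    )

def judge_work(table, pattern_catalog, max_mismatches=0):
    """Find the best-matching registered pattern for a new table.

    pattern_catalog: list of (pattern_table, work_info, is_normal) tuples
    standing in for the determination result pattern / work content
    correspondence information 116. Returns (work_info, is_normal), or
    (None, False) when nothing matches within the allowed mismatch count.
    """
    best = None
    for pattern, work_info, is_normal in pattern_catalog:
        mismatches = count_mismatches(table, pattern)
        if mismatches <= max_mismatches and (best is None or mismatches < best[0]):
            best = (mismatches, work_info, is_normal)
    if best is None:
        return None, False  # no registered pattern matched
    return best[1], best[2]
```

 With `max_mismatches=0` this implements the exact-match policy, and a positive value implements the relaxed matching within a preset allowable range.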
<Processing Description>
 Next, the processing performed by the work monitoring device 100 will be described. In the following description, it is assumed that the shooting range of the image acquisition device 3 is fixed while the work monitoring device 100 monitors the work of the worker 2, and that each marker has a different color.
 Fig. 6 is a flowchart explaining the processing performed by the work monitoring device 100 when monitoring the work of the on-site worker 2 (hereinafter referred to as the "main process S600").
 As shown in the figure, the preparation setting processing unit 130 first performs the processing for setting the information necessary for monitoring the work of the worker 2 (marker color setting process S611, determination target area setting process S612, and correspondence information registration process S613). Details of each of these processes will be described later.
 Subsequently, the monitoring processing unit 150 performs the processing for monitoring the work of the worker 2 (work monitoring process S614).
 Fig. 7 is a flowchart explaining the details of the marker color setting process S611 of Fig. 6.
 As shown in the figure, the image data acquisition unit 121 first acquires the image data 111 from the image acquisition device 3 (S711).
 Subsequently, the marker color setting processing unit 131 acquires the color data (RGB data) of the marker region appearing in the image data 111 (S712). If a plurality of markers appear in the image data 111, the marker color setting processing unit 131 acquires the color data of each marker's region.
 Subsequently, the marker color setting processing unit 131 converts the acquired color data into data in a predetermined color space (HSV (Hue Saturation Value), HSB (Hue Saturation Brightness), etc.) (S713). The conversion to a color space does not necessarily have to be performed; whether to perform it may be decided according to, for example, the required monitoring accuracy (the accuracy with which it is determined whether the work of the worker 2 is normal or abnormal). The same applies to the choice of color space.
 Subsequently, the marker color setting processing unit 131 associates the converted data with the marker ID and stores it as the marker color data 113 (S714).
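 The following Python sketch illustrates S712 to S714 under the assumption that OpenCV is used for the color space conversion; the region-of-interest argument and the dictionary standing in for the marker color data 113 are assumptions of this sketch, not part of the embodiment.

```python
import cv2

def register_marker_color(image_bgr, roi, marker_id, marker_color_db):
    """Sketch of S712-S714: store a marker's average color in HSV.

    roi: (x, y, w, h) region of the image where the marker appears.
    marker_color_db: plain dict standing in for the marker color data 113.
    """
    x, y, w, h = roi
    patch = image_bgr[y:y + h, x:x + w]
    # S713: convert to HSV, which is typically more robust to lighting
    # changes than raw RGB (the conversion is optional per the text).
    patch_hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
    # S714: store the mean HSV color keyed by the marker ID.
    marker_color_db[marker_id] = patch_hsv.reshape(-1, 3).mean(axis=0)
```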
 Fig. 8 is a flowchart explaining the details of the determination target area setting process S612 of Fig. 6.
 As shown in the figure, the image data acquisition unit 121 first acquires the image data 111 from the image acquisition device 3 (S811).
 Subsequently, the determination target area setting processing unit 132 sets one or more determination target areas in the acquired image data 111 (S812). The determination target area setting processing unit 132 sets the determination target areas through, for example, interactive processing with the user. The user sets the determination target areas appropriately so that the required monitoring accuracy is secured, based on, for example, the relationship between past determination target area settings and the resulting determination accuracy. Alternatively, the determination target area setting processing unit 132 may automatically set the determination target areas so that the required monitoring accuracy is secured, based on, for example, the relationship between past determination target area settings and the resulting determination accuracy.
 Subsequently, the determination target area setting processing unit 132 stores the set determination target areas as the determination target area data 114 (S813).
 Fig. 9 is a flowchart explaining the details of the correspondence information registration process S613 of Fig. 6.
 As shown in the figure, the image data acquisition unit 121 first acquires the image data 111 from the image acquisition device 3 (S911).
 Subsequently, the correspondence information setting processing unit 133 sends the acquired image data 111 to the determination result table generation unit 140. The determination result table generation unit 140 performs processing to generate the determination result table 115 based on the sent image data 111 (hereinafter referred to as the "determination result table generation process S912") and returns the generated determination result table 115 to the correspondence information setting processing unit 133. Details of the determination result table generation process S912 will be described later.
 Subsequently, the correspondence information setting processing unit 133 accepts the input (setting) of the work content information for that image data 111 from the user (S913).
 Subsequently, the correspondence information setting processing unit 133 associates the determination result table 115 sent from the determination result table generation unit 140 with the accepted work content information and registers them in the determination result pattern / work content correspondence information 116 (S914).
 Fig. 10 is a flowchart explaining the details of the determination result table generation process S912 of Fig. 9.
 As shown in the figure, the marker position detection processing unit 141 first performs processing to detect the position of each marker contained in the image data 111 (hereinafter referred to as the "marker position detection process S1011"). Details of the marker position detection process S1011 will be described later.
 Subsequently, the determination result table generation processing unit 142 acquires the determination target area data 114 (S1012).
 Subsequently, the determination result table generation processing unit 142 generates the determination result table 115 by comparing the detected marker positions with the acquired determination target area data 114 (S1013).
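 A minimal Python sketch of S1013 follows, assuming rectangular determination target areas given as (x, y, w, h) tuples and marker centroids given as (x, y) coordinates (or None when a marker was not detected); these representations and helper names are assumptions of the sketch.

```python
def point_in_area(point, area):
    """True when a marker centroid (x, y) lies inside a rectangle (x, y, w, h)."""
    (px, py), (ax, ay, aw, ah) = point, area
    return ax <= px <= ax + aw and ay <= py <= ay + ah

def build_result_table(marker_positions, areas):
    """Sketch of S1013: build a determination result table.

    marker_positions: marker ID -> (x, y) centroid, or None if not detected.
    areas: determination target area ID -> (x, y, w, h) rectangle
    (standing in for the determination target area data 114).
    """
    table = {}
    for marker_id, pos in marker_positions.items():
        row = {"Visible": pos is not None}
        for area_id, rect in areas.items():
            row[area_id] = pos is not None and point_in_area(pos, rect)
        table[marker_id] = row
    return table
```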
 Fig. 11 is a flowchart explaining the details of the marker position detection process S1011 of Fig. 10.
 As shown in the figure, the marker position detection processing unit 141 first acquires the marker color data 113 (S1111). If marker color data 113 exists for a plurality of markers, the marker position detection processing unit 141 acquires all of the marker color data 113.
 Subsequently, the marker position detection processing unit 141 detects, from the image data 111, the regions corresponding to the marker color data 113 (regions whose color matches the marker color data, or whose color difference from it is within a preset threshold; hereinafter referred to as "marker regions") (S1112). If marker color data 113 has been acquired for a plurality of markers, marker regions are detected for each set of marker color data.
 Subsequently, the marker position detection processing unit 141 removes noise from the detected marker regions using, for example, known noise removal techniques (image processing, morphological operations, etc.) (S1113).
 Subsequently, the marker position detection processing unit 141 identifies the centroid position (centroid coordinates) of each marker region (S1114).
 Subsequently, the marker position detection processing unit 141 checks whether a plurality of marker regions correspond to the same marker color data, and if so, selects one of them (S1115). For example, the marker position detection processing unit 141 selects the marker region whose color is closest to the marker color data. Alternatively, for example, the marker position detection processing unit 141 selects a marker region based on whether its position within the image data 111 falls within a predetermined range.
 Subsequently, the marker position detection processing unit 141 sets the centroid of the selected marker region as the marker position (S1116).
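 Putting S1112 to S1116 together, one possible OpenCV-based sketch is shown below. The per-channel color tolerance, the morphological kernel size, and the largest-region selection rule are assumptions chosen for illustration; as noted above, the selection could instead use color closeness or an expected position range.

```python
import cv2
import numpy as np

def detect_marker_position(image_bgr, marker_hsv, tol=(10, 60, 60)):
    """Sketch of S1112-S1116: detect one marker's centroid, or None.

    marker_hsv: stored HSV color from the marker color data 113.
    tol: assumed per-channel tolerance defining a color match
    (hue wrap-around near the ends of the range is ignored for brevity).
    """
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    lower = np.clip(np.array(marker_hsv) - tol, 0, 255).astype(np.uint8)
    upper = np.clip(np.array(marker_hsv) + tol, 0, 255).astype(np.uint8)
    mask = cv2.inRange(hsv, lower, upper)                  # S1112: color match
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # S1113: denoise
    # S1114-S1115: compute centroids of candidate regions and, if several
    # remain, keep the largest one.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    if n <= 1:
        return None                                        # marker not visible
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    cx, cy = centroids[largest]
    return float(cx), float(cy)                            # S1116: marker position
```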
 Fig. 12 is a flowchart explaining the details of the work monitoring process S614 of Fig. 6.
 As shown in the figure, the image data acquisition unit 121 first acquires one or more pieces of image data 111 from the image acquisition device 3 (S1211). This image data 111 is, for example, a one-frame image of moving image data showing the worker 2.
 Subsequently, the determination result table acquisition unit 151 passes the acquired image data 111 to the determination result table generation unit 140 and acquires the determination result table 115 from the determination result table generation unit 140 (S1212). Since the processing of S1212 is the same as the determination result table generation process S912 described above, its description is omitted.
 Subsequently, the abnormality presence/absence determination processing unit 152 determines, based on the acquired determination result table 115, whether the work of the worker 2 is normal (S1213), and outputs the determination result to the output device 15 (S1214).
 Subsequently, the monitoring processing unit 150 determines whether the work of the worker 2 has finished (S1215). If the work has finished (S1215: YES), the work monitoring process S614 ends; if not (S1215: NO), the process returns to S1211. The monitoring processing unit 150 determines whether the work has finished based on, for example, information input by the user.
 As described above in detail, according to the work monitoring device 100 of the present embodiment, by preparing determination result patterns in advance, it is possible to accurately determine, from image data 111 newly showing the work of the worker 2, whether the work being performed by the worker 2 is normal or abnormal. Therefore, a mechanism for monitoring the work of the worker 2 at individual sites can be easily realized.
 Incidentally, the determination result table 115 shown above contains information about the markers (whether each marker's position is within each determination target area and whether the marker appears in the image data 111). However, the determination result table 115 and the determination result patterns may additionally contain the sensor data acquired from the various sensors 4 at the time the image data 111 was acquired, and the monitoring processing unit 150 may compare the determination result table 115 against the determination result pattern / work content correspondence information 116 taking the sensor data values into account when identifying the work and determining whether the work is abnormal. By additionally including sensor data in the determination result table 115 and the determination result patterns, the characteristics of each task can be expressed in more detail, and the accuracy of work determination can be improved.
 Fig. 13 shows an example of the determination result table 115 in that case. In addition to the items of the determination result table 115 described above, the illustrated determination result table 115 includes items in the row direction in which sensor identifiers (hereinafter referred to as "sensor IDs") are set ("Sensor 1", "Sensor 2"), and items in the column direction in which the sensor data values are set ("Value 1", "Value 2").
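 For illustration, the extended table of Fig. 13 could be represented by simply adding sensor rows to the nested mapping assumed earlier; the sensor names and values below are fictitious.

```python
# Illustrative in-memory form of the extended table of Fig. 13: the marker
# rows are as before, and per-sensor rows carry the values sampled when
# the image frame was acquired.
extended_table = {
    "Object 1": {"Visible": True, "Area 1": True, "Area 2": False, "Area 3": False},
    # ... remaining marker rows as in Fig. 4(c) ...
    "Sensor 1": {"Value 1": 23.5},   # e.g. a temperature reading
    "Sensor 2": {"Value 2": 0.02},   # e.g. a vibration level
}
```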
[Second Embodiment]
 The work monitoring device 100 of the second embodiment uses the work monitoring mechanism for the worker 2 of the first embodiment to obtain the time the worker 2 requires for a task (hereinafter referred to as the "work time"), and monitors the work performed by the worker 2 (determines whether the work is normal or abnormal) by comparing the obtained work time with a preset standard work time. The following description focuses on the differences from the first embodiment.
 Fig. 14 shows the main functions of the work monitoring device 100 of the second embodiment. As shown in the figure, in addition to the information stored by the storage unit 110 of the work monitoring device 100 of the first embodiment, the storage unit 110 of the work monitoring device 100 of the second embodiment stores per-work standard work times 117, which is data containing information indicating the standard work time for each task performed by the worker 2. The abnormality presence/absence determination processing unit 152 of the work monitoring device 100 of the second embodiment also has a function of determining whether the work being performed by the worker 2 is normal or abnormal by comparing the standard work time with the work time.
 Fig. 15 is a flowchart explaining the work monitoring process S614 in the second embodiment. The other processing in the second embodiment (the other processing performed by the work monitoring device 100 described in the first embodiment) is the same as in the first embodiment, so its description is omitted.
 As shown in the figure, the monitoring processing unit 150 first initializes the working variables "work time" and "previous work" used in the following processing (S1511). The "work time" holds the cumulative work time while the worker 2 is performing a given task. The "previous work" holds information indicating the work identified in the previous iteration of the loop from S1512 (for example, the work content information described above).
 Subsequently, the image data acquisition unit 121 acquires one or more pieces of image data 111 from the image acquisition device 3 (S1512). This image data 111 is a one-frame image of moving image data showing the worker 2. In the loop processing of S1512 to S1517, one-frame images of the moving image data are selected sequentially in chronological order.
 Subsequently, the determination result table acquisition unit 151 passes the acquired image data 111 to the determination result table generation unit 140 and acquires the determination result table 115 from the determination result table generation unit 140 (S1513). Since the processing of S1513 is the same as the determination result table generation process S912 described in the first embodiment, its description is omitted.
 Subsequently, the abnormality presence/absence determination processing unit 152 identifies the work being performed by the worker 2 by comparing the acquired determination result table 115 against the determination result pattern / work content correspondence information 116 (S1514).
 Subsequently, the abnormality presence/absence determination processing unit 152 determines whether the identified work is the same as the "previous work" (S1515). If the identified work is the same as the "previous work" (S1515: YES), the process proceeds to S1516. In the first iteration of the loop, the processing of S1515 is skipped and the process proceeds to S1516.
 If the identified work differs from the "previous work" (S1515: NO), the process proceeds to S1518.
 In S1516, the abnormality presence/absence determination processing unit 152 updates the "work time" (for example, by adding the time elapsed since the previous update).
 In S1517, the abnormality presence/absence determination processing unit 152 sets the work identified in S1514 (for example, its work content information) as the "previous work". The process then returns to S1512.
 In S1518, the abnormality presence/absence determination processing unit 152 acquires the standard work time of the work identified in S1514 from the per-work standard work times 117, and determines whether the work being performed by the worker 2 is abnormal by comparing the "work time" with the standard work time. For example, the abnormality presence/absence determination processing unit 152 determines that the work being performed by the worker 2 is normal if the difference between the "work time" and the standard work time is no more than a predetermined time, and abnormal if the difference exceeds the predetermined time. By setting the predetermined time appropriately, the determination can take into account time the worker 2 spends on things other than the work itself, such as breaks.
 In S1519, the abnormality presence/absence determination processing unit 152 outputs the determination result to the output device 15.
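 The loop S1511 to S1519 can be sketched in Python as follows, assuming helper functions and data that are not defined by the embodiment: frames given as (timestamp, image) pairs, an `identify_work` function wrapping S1513 to S1514, and a `standard_times` mapping standing in for the per-work standard work times 117.

```python
def monitor_work_time(frames, identify_work, standard_times, tolerance_s):
    """Sketch of the S1511-S1519 loop.

    frames: list of (timestamp_seconds, image) pairs in chronological order.
    identify_work: maps an image to a work label (wraps S1513-S1514).
    standard_times: work label -> standard work time in seconds (data 117).
    Yields (work, elapsed_seconds, is_normal) each time the work changes.
    """
    prev_work, started = None, None
    for t, image in frames:
        work = identify_work(image)                      # S1513-S1514
        if prev_work is None:                            # first frame
            prev_work, started = work, t
            continue
        if work == prev_work:                            # S1515: same work,
            continue                                     # time keeps accumulating (S1516)
        elapsed = t - started                            # accumulated "work time"
        is_normal = abs(elapsed - standard_times[prev_work]) <= tolerance_s  # S1518
        yield prev_work, elapsed, is_normal              # S1519
        prev_work, started = work, t                     # S1517
```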
 Thus, according to the above arrangement, by using the work monitoring mechanism for the worker 2 of the first embodiment, it is possible to realize a mechanism that monitors whether the work being performed by the worker 2 is normal or abnormal based on the work time of that work.
[Third Embodiment]
 The work monitoring device 100 of the third embodiment uses the work monitoring mechanism for the worker 2 of the first embodiment to monitor whether the worker 2 is performing work in the correct work order. The following description focuses on the differences from the first embodiment.
 Fig. 16 shows the main functions of the work monitoring device 100 of the third embodiment. As shown in the figure, in addition to the information stored by the storage unit 110 of the work monitoring device 100 of the first embodiment, the storage unit 110 of the work monitoring device 100 of the third embodiment stores work order information 118, which is data containing information indicating the correct work order. The abnormality presence/absence determination processing unit 152 of the work monitoring device 100 of the third embodiment also has a function of determining whether the worker 2 is performing work according to the preset regular work order.
 Fig. 17 is a flowchart explaining the work monitoring process S614 in the third embodiment. The other processing in the third embodiment (the other processing performed by the work monitoring device 100 described in the first embodiment) is the same as in the first embodiment, so its description is omitted.
 As shown in the figure, the monitoring processing unit 150 first initializes the working variable "previous work" used in the following processing (S1711). The "previous work" holds information indicating the work identified in the previous iteration of the loop from S1712 (for example, the work content information described above).
 Subsequently, the image data acquisition unit 121 acquires one or more pieces of image data 111 from the image acquisition device 3 (S1712). This image data 111 is a one-frame image of moving image data showing the worker 2. In the loop processing of S1712 to S1716, one-frame images of the moving image data are selected sequentially in chronological order.
 Subsequently, the determination result table acquisition unit 151 passes the acquired image data 111 to the determination result table generation unit 140 and acquires the determination result table 115 from the determination result table generation unit 140 (S1713). Since the processing of S1713 is the same as the determination result table generation process S912 described in the first embodiment, its description is omitted.
 Subsequently, the abnormality presence/absence determination processing unit 152 identifies the work being performed by the worker 2 by comparing the acquired determination result table 115 against the determination result pattern / work content correspondence information 116 (S1714).
 Subsequently, the abnormality presence/absence determination processing unit 152 compares the identified work, the "previous work", and the work order information 118 to determine whether the identified work follows the work order given in the work order information 118 (S1715). In the first iteration of the loop, the processing of S1715 is skipped and the process proceeds to S1716. If the identified work follows the work order in the work order information 118 (S1715: YES), the process proceeds to S1716. If not (S1715: NO), the process proceeds to S1717.
 In S1716, the abnormality presence/absence determination processing unit 152 sets the work identified in S1714 (for example, its work content information) as the "previous work". The process then returns to S1712.
 In S1717, the abnormality presence/absence determination processing unit 152 outputs the determination result (information indicating that the work is not following the correct work order) to the output device 15.
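 One possible Python sketch of the S1715 check is shown below. Treating a step as valid when it repeats the previous work or is its direct successor in the sequence is an assumption of this sketch, since the embodiment leaves the exact matching rule to the work order information 118.

```python
def follows_work_order(identified_work, prev_work, work_order):
    """Sketch of the S1715 check against the work order information 118.

    work_order: the correct sequence of work labels, e.g.
    ["pick part", "fasten screw", "inspect"] (labels are illustrative).
    """
    if prev_work is None or identified_work == prev_work:
        return True
    i = work_order.index(prev_work)  # assumes prev_work appears in the sequence
    return i + 1 < len(work_order) and work_order[i + 1] == identified_work
```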
 By thus using the work monitoring mechanism for the worker 2 of the first embodiment, it is possible to realize a mechanism that monitors whether the work being performed by the worker 2 follows the correct work order.
[Fourth Embodiment]
 The work monitoring device 100 of the fourth embodiment uses the work monitoring mechanism for the worker 2 of the first embodiment to identify, out of the period during which the worker 2 is working, the periods during which the work to be analyzed is performed. The following description focuses on the differences from the first embodiment.
FIG. 18 shows the main functions of the work monitoring device 100 of the fourth embodiment. As shown in the figure, the storage unit 110 of the work monitoring device 100 of the fourth embodiment stores, in addition to the information stored by the storage unit 110 of the first embodiment, analysis target work information 119, which is data including information indicating the work to be analyzed (for example, the work content information described above). The abnormality presence/absence determination processing unit 152 of the fourth embodiment also has a function of identifying, among the periods during which the worker 2 is working, the period during which the work to be analyzed is performed.
FIG. 19 is a flowchart illustrating the work monitoring processing S614 of the fourth embodiment. The other processing of the fourth embodiment (the other processing performed by the work monitoring device 100 and described in the first embodiment) is the same as in the first embodiment, so its description is omitted.
As shown in the figure, the image data acquisition unit 121 first acquires one or more pieces of image data 111 from the image acquisition device 3 (S1911). Each piece of image data 111 is one frame of moving image data showing the worker 2. In the loop of S1911 to S1916, the frames of the moving image data are selected one at a time in chronological order.
Next, the determination result table acquisition unit 151 passes the acquired image data 111 to the determination result table generation unit 140 and acquires the determination result table 115 from the determination result table generation unit 140 (S1912). The processing of S1912 is the same as the determination result table generation processing S912 described in the first embodiment, so its description is omitted.
Next, the abnormality presence/absence determination processing unit 152 identifies the work being performed by the worker 2 by comparing the acquired determination result table 115 with the determination result pattern/work content correspondence information 116 (S1913).
Next, the abnormality presence/absence determination processing unit 152 compares the identified work with the analysis target work information 119 and determines whether the identified work is an analysis target (S1914). If it is an analysis target (S1914: YES), processing proceeds to S1915; if not (S1914: NO), processing returns to S1911.
In S1915, the abnormality presence/absence determination processing unit 152 outputs, to the output device 15, information indicating that the worker 2 is performing the work to be analyzed at the time of the image data in question.
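Under the same illustrative assumptions as the sketch in the third embodiment (`generate_table`, `identify_work`, and `report` are hypothetical stand-ins; `analysis_targets` stands in for the analysis target work information 119), the per-frame check of S1911 to S1915 can be sketched as follows:

```python
def flag_analysis_target_frames(frames, generate_table, identify_work,
                                analysis_targets, report):
    """Sketch of S1911-S1915: report every frame in which the worker 2
    is performing a work registered as an analysis target."""
    for timestamp, frame in frames:        # S1911: frames in time order
        table = generate_table(frame)      # S1912 (cf. S912)
        work = identify_work(table)        # S1913
        if work in analysis_targets:       # S1914
            # S1915: this frame lies inside an analysis-target period
            report(f"{timestamp}: analysis-target work '{work}' in progress")
        # S1914: NO -> nothing to report; continue with the next frame
```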
According to the above mechanism, when the worker 2 performs a plurality of works in succession, information indicating the period during which each work is performed can be output. For example, a person who intends to analyze the work performed by the worker 2 (hereinafter referred to as the "analyst") can easily identify the period to be analyzed.
FIG. 20 shows examples of the period the analyst takes as the analysis target. For example, as shown in (a), the analyst can use the above mechanism to identify both the start time and the end time of the work to be analyzed. As shown in (b), the analyst can use the above mechanism to identify the start time and take a predetermined length of time from that start time as the analysis period. As shown in (c), the analyst can use the above mechanism to identify the end time and take the interval from a point a predetermined length of time before the end time up to the end time as the analysis period.
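The three delimitations of FIG. 20 can be expressed as follows. This is an illustrative sketch under the assumption that the start and/or end time of the target work has already been obtained with the mechanism above; `margin` is a hypothetical parameter representing the "predetermined time".

```python
from datetime import datetime, timedelta

def analysis_period(start=None, end=None, margin=None):
    """Return a (from, to) pair delimiting the analysis period of FIG. 20.

    (a) start and end given    -> the whole period of the target work
    (b) start and margin given -> a fixed-length window after the start
    (c) end and margin given   -> a fixed-length window ending at the end
    """
    if start is not None and end is not None:     # case (a)
        return start, end
    if start is not None and margin is not None:  # case (b)
        return start, start + margin
    if end is not None and margin is not None:    # case (c)
        return end - margin, end
    raise ValueError("need (start, end), (start, margin) or (end, margin)")

# Example of case (c): a 5-minute window ending at the detected end time.
t_end = datetime(2020, 2, 26, 10, 30)
print(analysis_period(end=t_end, margin=timedelta(minutes=5)))
```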
Although one embodiment of the present invention has been described in detail above, the present invention is of course not limited to the above embodiments and can be modified in various ways without departing from its gist. For example, the above embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to a device having all of the described components. It is also possible to add, delete, or replace part of the configuration of each embodiment with other configurations.
For example, the work monitoring processing S614 described in each of the above embodiments may be executed on moving image data acquired in real time, or executed after the fact on recorded moving image data.
Each of the above configurations, functional units, processing units, processing means, and the like may be realized in whole or in part by hardware, for example by designing them as integrated circuits. They may also be realized by software, with a processor interpreting and executing a program that implements each function. Information such as the programs, tables, and files implementing each function can be stored in a memory, a hard disk, a recording device such as an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
In each of the above figures, control lines and information lines are shown where they are considered necessary for the explanation, and do not necessarily represent all control lines and information lines of an implementation. In practice, almost all components may be considered to be interconnected.
The arrangement of the various functional units, processing units, and databases of each information processing device described above is merely an example. The arrangement can be changed to the optimum one from the viewpoint of the performance, processing efficiency, communication efficiency, and the like of the hardware and software provided in these devices.
The structure (schema, etc.) of the databases storing the various data described above can also be changed flexibly from the viewpoints of efficient use of resources and improvement of processing efficiency, access efficiency, search efficiency, and the like.
Instead of the marker-based position detection method described above, object positions may be detected using a sensor or an image recognition technique capable of automatically detecting the body of the worker 2. Object positions may likewise be detected using a sensor or an image recognition technique capable of automatically detecting the work object or the tools used for the work. In these cases, the same processing as in the above-described embodiments is performed using the automatically detected position information of the body, the work object, and so on.
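As a sketch of this variation: only the source of the position information changes, while the downstream determination processing stays the same. The interface below is an assumption introduced for illustration; any sensor or marker-free image recognition method (for example a pose estimator) that yields named positions could implement it.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

Position = Tuple[int, int]   # (x, y) in image coordinates

@dataclass
class Rect:
    """A determination target area set in the image (illustrative)."""
    x0: int
    y0: int
    x1: int
    y1: int

    def contains(self, p: Position) -> bool:
        return self.x0 <= p[0] <= self.x1 and self.y0 <= p[1] <= self.y1

# A position source maps one frame to named object positions. Whether the
# positions come from colored markers, a sensor, or a recognition model is
# invisible to the downstream processing.
PositionSource = Callable[[object], Dict[str, Position]]

def determination_entries(frame, source: PositionSource,
                          areas: Dict[str, Rect]) -> Dict[Tuple[str, str], bool]:
    """For every (object, area) pair, record whether the detected position
    lies inside the area -- the same processing whichever source is used."""
    positions = source(frame)
    return {(obj, area_id): rect.contains(pos)
            for obj, pos in positions.items()
            for area_id, rect in areas.items()}
```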
1 Work monitoring system
2 Worker
3 Image acquisition device
4 Various sensors
5 Communication means
100 Work monitoring device
111 Image data
112 Sensor data
113 Marker color data
114 Determination target area data
115 Determination result table
116 Determination result pattern/work content correspondence information
120 Data acquisition unit
121 Image data acquisition unit
122 Sensor data acquisition unit
130 Preparation setting processing unit
131 Marker color setting processing unit
132 Determination target area setting processing unit
133 Correspondence information setting processing unit
140 Determination result table generation unit
141 Marker position detection processing unit
142 Determination result table generation processing unit
150 Monitoring processing unit
151 Determination result table acquisition unit
152 Abnormality presence/absence determination processing unit
S600 Main processing
S611 Marker color setting processing
S612 Determination target area setting processing
S613 Correspondence information registration processing
S912 Determination result table generation processing
S1011 Marker position detection processing
S614 Work monitoring processing

Claims (15)

1.  A work monitoring device configured using an information processing device having a processor and a memory, the work monitoring device comprising:
     an image data acquisition unit that acquires image data showing a worker performing work;
     an object position detection unit that acquires information indicating the respective positions, in the image data, of a plurality of objects appearing in the image data;
     a determination result table generation unit that generates a determination result table containing information indicating the results of determining whether each of one or more determination target areas, which are areas set in the image data, contains the position of each of the plurality of objects;
     a storage unit that stores determination result pattern/work content correspondence information, which is information associating the determination result table generated based on the image data with work content information, which is information indicating the work in the image data; and
     a monitoring processing unit that generates the determination result table for newly acquired image data and identifies the work being performed by the worker by comparing the generated determination result table with the determination result pattern/work content correspondence information.
2.  The work monitoring device according to claim 1, wherein
     the determination result table further contains information indicating whether the position of each object is included in the image data.
3.  The work monitoring device according to claim 1, further comprising
     a sensor data acquisition unit that acquires sensor data, which is data output by a sensor provided at the site where the worker performs the work, wherein
     the determination result table further contains the sensor data acquired from the sensor, and
     the monitoring processing unit generates the determination result table based on newly acquired image data and the sensor data acquired from the sensor when the image data was acquired, and identifies the work a person is performing by comparing the generated determination result table with the determination result pattern/work content correspondence information.
4.  The work monitoring device according to any one of claims 1 to 3, wherein
     the storage unit stores information indicating a standard work time required for a predetermined work,
     the image data acquisition unit acquires time-series image data showing the worker performing work,
     the determination result table generation unit generates the determination result table for each piece of the time-series image data, and
     the monitoring processing unit identifies the work being performed by the worker by comparing each of the determination result tables with the determination result pattern/work content correspondence information, thereby obtains the work time the worker required for the predetermined work, and monitors the work being performed by the worker by comparing the obtained work time with the standard work time.
5.  The work monitoring device according to any one of claims 1 to 3, wherein
     the storage unit stores information indicating a standard work order for a plurality of works performed by the worker,
     the image data acquisition unit acquires time-series image data showing the worker performing work,
     the determination result table generation unit generates the determination result table for each piece of the time-series image data, and
     the monitoring processing unit identifies the work being performed by the worker by comparing each of the determination result tables with the determination result pattern/work content correspondence information, thereby identifies the order of the works performed by the worker, and monitors the work being performed by the worker by comparing the identified work order with the standard work order.
6.  The work monitoring device according to any one of claims 1 to 3, wherein
     the storage unit stores information indicating a predetermined work performed by the worker,
     the image data acquisition unit acquires time-series image data showing the worker performing work,
     the determination result table generation unit generates the determination result table for each piece of the time-series image data, and
     the monitoring processing unit identifies the work being performed by the worker by comparing each of the determination result tables with the determination result pattern/work content correspondence information, and thereby generates information indicating the start or end timing of the predetermined work.
7.  The work monitoring device according to claim 1, further comprising
     a determination target area setting processing unit that accepts settings of the determination target areas for the image data.
8.  The work monitoring device according to claim 1, wherein
     each object is at least one of a part of the worker's body, a work object, and a tool used by the worker in the work.
9.  The work monitoring device according to claim 1, wherein
     the object position detection unit acquires information indicating the position of the object in the image data by image recognition of a marker provided on the object.
10.  A work monitoring method, wherein an information processing device having a processor and a memory:
     acquires image data showing a worker performing work;
     acquires information indicating the respective positions, in the image data, of a plurality of objects appearing in the image data;
     generates a determination result table containing information indicating the results of determining whether each of one or more determination target areas, which are areas set in the image data, contains the position of each of the plurality of objects;
     stores determination result pattern/work content correspondence information, which is information associating the determination result table generated based on the image data with work content information, which is information indicating the work in the image data; and
     generates the determination result table for newly acquired image data and identifies the work being performed by the worker by comparing the generated determination result table with the determination result pattern/work content correspondence information.
11.  The work monitoring method according to claim 10, wherein
     the determination result table further contains information indicating whether the position of each object is included in the image data.
12.  The work monitoring method according to claim 10, wherein
     the information processing device acquires sensor data, which is data output by a sensor provided at the site where the worker performs the work,
     the determination result table further contains the sensor data acquired from the sensor, and
     the information processing device generates the determination result table based on newly acquired image data and the sensor data acquired from the sensor when the image data was acquired, and identifies the work a person is performing by comparing the generated determination result table with the determination result pattern/work content correspondence information.
13.  The work monitoring method according to any one of claims 10 to 12, wherein the information processing device:
     stores information indicating a standard work time required for a predetermined work;
     acquires time-series image data showing the worker performing work;
     generates the determination result table for each piece of the time-series image data; and
     identifies the work being performed by the worker by comparing each of the determination result tables with the determination result pattern/work content correspondence information, thereby obtains the work time the worker required for the predetermined work, and monitors the work being performed by the worker by comparing the obtained work time with the standard work time.
14.  The work monitoring method according to any one of claims 10 to 12, wherein the information processing device:
     stores information indicating a standard work order for a plurality of works performed by the worker;
     acquires time-series image data showing the worker performing work;
     generates the determination result table for each piece of the time-series image data; and
     identifies the work being performed by the worker by comparing each of the determination result tables with the determination result pattern/work content correspondence information, thereby identifies the order of the works performed by the worker, and monitors the work being performed by the worker by comparing the identified work order with the standard work order.
15.  The work monitoring method according to any one of claims 10 to 12, wherein the information processing device:
     stores information indicating a predetermined work performed by the worker;
     acquires time-series image data showing the worker performing work;
     generates the determination result table for each piece of the time-series image data; and
     identifies the work being performed by the worker by comparing each of the determination result tables with the determination result pattern/work content correspondence information, and thereby generates information indicating the start or end timing of the predetermined work.
PCT/JP2020/007755 2019-06-03 2020-02-26 Work monitoring device and work monitoring method WO2020246082A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-103551 2019-06-03
JP2019103551A JP7324618B2 (en) 2019-06-03 2019-06-03 Work monitoring device and work monitoring method

Publications (1)

Publication Number Publication Date
WO2020246082A1 true WO2020246082A1 (en) 2020-12-10

Family

ID=73649639

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/007755 WO2020246082A1 (en) 2019-06-03 2020-02-26 Work monitoring device and work monitoring method

Country Status (2)

Country Link
JP (1) JP7324618B2 (en)
WO (1) WO2020246082A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022244536A1 (en) * 2021-05-17 2022-11-24 株式会社日立製作所 Work recognition device and work recognition method
WO2023166605A1 (en) * 2022-03-02 2023-09-07 日本電気株式会社 Action determination device, action determination method, and non-transitory computer-readable medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009032033A (en) * 2007-07-27 2009-02-12 Omron Corp Operation boundary detection method and operation analysis system
JP2013145422A (en) * 2012-01-13 2013-07-25 Panasonic Corp Setting method of work detection system and work detection system using the same
JP2014115700A (en) * 2012-12-06 2014-06-26 Nec Corp Confirmation requirement degree determination device, confirmation requirement degree determination method, and program
JP2019032593A (en) * 2017-08-04 2019-02-28 オリンパス株式会社 Imaging system, imaging device and imaging method
JP2019053527A (en) * 2017-09-15 2019-04-04 キヤノン株式会社 Assembly work analysis device, assembly work analysis method, computer program, and storage medium


Also Published As

Publication number Publication date
JP2020197899A (en) 2020-12-10
JP7324618B2 (en) 2023-08-10

Similar Documents

Publication Publication Date Title
JP6814673B2 (en) Movement route prediction device and movement route prediction method
US9734393B2 (en) Gesture-based control system
WO2019245768A1 (en) System for predicting articulated object feature location
WO2020246082A1 (en) Work monitoring device and work monitoring method
CN111104820A (en) Gesture recognition method based on deep learning
JP2019101919A (en) Information processor, information processing method, computer program, and storage medium
US10372958B2 (en) In-field data acquisition and formatting
KR20190054702A (en) Method and apparatus for detecting action of object in viedio stream
CN108885469A (en) System and method for the initialized target object in tracking system
WO2017084319A1 (en) Gesture recognition method and virtual reality display output device
JP2017505965A (en) Real-time 3D gesture recognition and tracking system for mobile devices
KR20150039252A (en) Apparatus and method for providing application service by using action recognition
JP2018077644A (en) Information processing system and program
JP2021534480A (en) Face recognition methods, devices, electronics and computers Non-volatile readable storage media
WO2021020500A1 (en) Information processing device and marketing activity assistance device
JP2020135747A (en) Action analysis device and action analysis method
CN106529500A (en) Information processing method and system
US20160110909A1 (en) Method and apparatus for creating texture map and method of creating database
JPH04260979A Detecting and tracking system for moving object
WO2020137536A1 (en) Person authentication device, control method, and program
JP2007048232A (en) Information processing device, information processing method, and computer program
KR101447958B1 (en) Method and apparatus for recognizing body point
KR101520889B1 (en) Digilog Book System Using Distance Information Between Object and Hand Device and Implementation Method Therof
JP2012141884A (en) Evaluation support device and evaluation support system
WO2020241208A1 (en) Retrieval device, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20818444

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20818444

Country of ref document: EP

Kind code of ref document: A1