WO2022259690A1 - Work analysis device and method - Google Patents
Work analysis device and method
- Publication number
- WO2022259690A1 (PCT/JP2022/012833)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- work
- worker
- workers
- control unit
- recognized
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
Definitions
- the present disclosure relates to a work analysis device and method.
- Patent Document 1 discloses a video surveillance system that identifies a person and tracks the movement of that person.
- a video monitoring system detects a person and an abandoned object appearing in an image captured by any one of a plurality of imaging devices, and identifies a target person who has left the abandoned object.
- the video monitoring system searches for an image of a target person from among the images captured by each imaging device, based on the target person's face feature amount and clothing feature amounts such as the color and shape of clothing.
- the video monitoring system outputs a display indicating the movement of the target person on the screen based on the imaging device that captured the image of the target person and the imaging time.
- the present disclosure provides a work analysis device capable of estimating the worker of each work when a plurality of work is performed by a plurality of workers.
- a work analysis device generates information about multiple workers who perform multiple tasks in a workplace.
- the work analysis device includes an acquisition unit, a control unit, and a storage unit.
- the acquisition unit acquires image data representing a captured image of the workplace.
- based on the image data, the control unit generates work history information indicating the work performed in the workplace by each individual worker among the plurality of workers.
- the storage unit stores work history information.
- the control unit sequentially recognizes the positions and the work of the plurality of workers at each time in the workplace, based on the image data.
- the control unit detects crossing among a plurality of flow lines, each flow line including the positions of one of the plurality of workers at each time.
- when no crossing is detected, the control unit generates the work history information by associating the work recognized at each time with the individual worker based on the plurality of flow lines. When crossing is detected, the control unit associates the recognized work with the individual workers based on the recognized work and the past work history information.
- FIG. 1 is a diagram showing an overview of a work analysis system according to the first embodiment.
- FIG. 2 is a block diagram illustrating the configuration of a work analysis device in the work analysis system.
- FIG. 3 is a diagram for explaining map data in the work analysis device.
- FIG. 4 is a diagram for explaining work order information in the work analysis device.
- FIG. 5 is a diagram showing a first example for explaining a problem regarding the work analysis device.
- FIG. 6 is a diagram showing a second example for explaining the problem regarding the work analysis device.
- FIG. 7 is a flowchart for explaining the overall operation of the work analysis device.
- FIG. 8 is a flowchart illustrating worker discrimination processing in the work analysis device of the first embodiment.
- FIG. 9 is a diagram for explaining work combinations in the worker discrimination processing.
- A diagram for explaining work plan information in the work analysis device of the second embodiment.
- A flowchart illustrating worker determination processing according to the second embodiment.
- A diagram for explaining the worker determination processing of the second embodiment.
- FIG. 1 is a diagram showing an overview of a work analysis system 1 according to this embodiment.
- This system 1 includes a camera 2 and a work analysis device 5, as shown in FIG. 1. This system 1 is applied to analyze the efficiency of workers W1, W2, and W3 who perform a plurality of tasks in a workplace 6 such as a distribution warehouse. Workers W1 to W3 will also be referred to as worker W hereinafter.
- the system 1 may comprise a monitor 4 for presenting an analysis chart 7 for a given analysis period to a user 3, for example a manager of the workplace 6 or an analyst.
- the analysis period is a period to be analyzed by image recognition or the like using the camera 2 in the system 1, and is set in advance from one day to several months, for example.
- a transport line 61 and a shelf 62 are installed in the workplace 6.
- the plurality of tasks performed by each worker W1 to W3 while moving in the workplace 6 include "collection", i.e. picking up products from the shelves 62; "packing", i.e. packing the products into boxes on the transport line 61; and "box preparation", i.e. preparing the boxes.
- the analysis chart 7 of the system 1 classifies each work into "main work", "sub-work", and "non-work" according to the value added by the work, and shows the ratio of each category during the analysis period for each of the workers W1 to W3.
- packing is the main work.
- auxiliary work related to the main work, such as collection and box preparation, and movement toward the transport line 61 or the shelf 62, are sub-work; waiting states not related to the main work are classified as non-work.
- the work to be analyzed by the work analysis device 5 is not limited to main work and sub-work, and includes non-work.
- the user 3 can analyze the work contents of each of the workers W1 to W3, for example in order to consider improving the work efficiency of the workplace 6.
- the camera 2 of this system 1 is arranged, for example, so that the entire range in which the workers W1 to W3 move in the workplace 6 is captured.
- the camera 2 repeats the imaging operation at predetermined intervals, for example, in the workplace 6, and generates image data representing the captured image.
- the camera 2 is connected to the work analysis device 5 such that image data is transmitted to the work analysis device 5, for example.
- the camera 2 included in the system 1 is not limited to one camera, and may be two or more cameras.
- the work analysis device 5 is composed of an information processing device such as a server device, for example.
- the work analysis device 5 is communicably connected to an external information processing device such as a PC including the monitor 4.
- a configuration of the work analysis device 5 will be described with reference to FIG. 2 .
- FIG. 2 is a block diagram illustrating the configuration of the work analysis device 5 .
- the work analysis device 5 illustrated in FIG. 2 includes a control unit 50, a storage unit 52, an operation unit 53, a device interface 54, and an output interface 55.
- hereinafter, interface is abbreviated as "I/F".
- the control unit 50 includes, for example, a CPU or MPU that cooperates with software to realize a predetermined function, and controls the overall operation of the work analysis device 5.
- the control unit 50 reads the data and programs stored in the storage unit 52 and performs various arithmetic processing to realize various functions.
- the control unit 50 includes an image recognition unit 51 as a functional configuration.
- the image recognition unit 51 recognizes a preset position to be processed in the image indicated by the image data, and outputs the recognition result.
- a person such as the worker W is set as a processing target.
- the recognition result may include, for example, information indicating the time when the position to be processed was recognized.
- the image recognition unit 51 performs image recognition processing using, for example, a model trained as a neural network such as a convolutional neural network. The image recognition processing may instead be performed by various other image recognition algorithms.
- the control unit 50 executes a program containing a group of commands for realizing the functions of the work analysis device 5, for example.
- the above program may be provided from a communication network such as the Internet, or may be stored in a portable recording medium.
- the control unit 50 may include an internal memory as a temporary storage area for holding various data and programs.
- control unit 50 may be a hardware circuit such as a dedicated electronic circuit or a reconfigurable electronic circuit designed to achieve a predetermined function.
- the control unit 50 may be composed of various semiconductor integrated circuits such as a CPU, MPU, GPU, GPGPU, TPU, microcomputer, DSP, FPGA, and ASIC.
- the storage unit 52 is a storage medium that stores programs and data necessary for realizing the functions of the work analysis device 5 .
- the storage unit 52 is configured by, for example, a hard disk drive (HDD) or a semiconductor storage device such as a solid-state drive (SSD).
- the storage unit 52 stores the above programs and various information such as the flow line data D0, the map data D1, the work order information D2, and the authentication information D3.
- the flow line data D0 indicates the flow line of the worker W moving in the workplace 6.
- the flow line data D0 is generated, for example, based on the recognition result obtained by inputting the image data acquired from the camera 2 to the image recognition unit 51.
- the map data D1 indicates the arrangement of various facilities such as the transport line 61 and the shelf 62 in the workplace 6 in a predetermined coordinate system.
- the work order information D2 is information indicating the temporal execution order of a combination of works.
- the authentication information D3 is information for identifying individuals such as the workers W1 to W3. Details of each information will be described later.
- the storage unit 52 may include a temporary storage element configured by, for example, a DRAM or SRAM, and may function as a work area for the control unit 50.
- the storage unit 52 may temporarily store the image data received from the camera 2, the recognition result of the image recognition unit 51, and the like.
- the operation unit 53 is a general term for operation members that accept user operations.
- the operation unit 53 is composed of, for example, a keyboard, mouse, trackpad, touchpad, buttons, switches, or the like, or a combination thereof.
- the operation unit 53 acquires various information input by a user's operation.
- the device I/F 54 is a circuit for connecting an external device such as the camera 2 to the work analysis device 5.
- the device I/F 54 communicates according to a predetermined communication standard. Predetermined standards include USB, HDMI (registered trademark), IEEE 1394, IEEE 802.11, Bluetooth (registered trademark), and the like.
- the device I/F 54 is an example of an acquisition unit that receives various information from an external device in the work analysis device 5. In the work analysis system 1, the work analysis device 5 acquires image data representing the moving image captured by the camera 2 via the device I/F 54, for example.
- the output I/F 55 is a circuit for outputting information.
- the output I/F 55 outputs video signals and the like to an external display device such as a monitor and a projector for displaying various information in compliance with, for example, the HDMI standard.
- the configuration of the work analysis device 5 as described above is an example, and the configuration of the work analysis device 5 is not limited to this.
- the work analysis device 5 may be composed of various computers including a PC (personal computer).
- the work analysis device 5 may include a display section configured by a liquid crystal display or an organic EL display as a built-in display device, for example.
- the work analysis method of the present embodiment may be executed in distributed computing.
- the work analysis device 5 may have a configuration that communicates with an external information processing device via a communication network.
- the operation unit 53 may be configured to receive an operation by an external information processing device connected via a communication network.
- the output I/F 55 may transmit various types of information to an external information processing device via a communication network.
- the acquisition unit in the work analysis device 5 may be implemented in cooperation with various software in the control unit 50 and the like.
- the acquisition unit in the work analysis device 5 may acquire various information by reading various information stored in various storage media (for example, the storage unit 52 ) to the work area of the control unit 50 .
- the work analysis device 5 of this embodiment stores the flow line data D0, the map data D1, the work order information D2, and the authentication information D3 in the storage unit 52 as described above.
- An example of the structure of various data D0 to D3 will be described below.
- the flow line data D0 manages, for example, a time, a flow line ID that identifies the flow line of a worker W, and the position of that worker W in the workplace 6 recognized at that time by the image recognition unit 51, in association with each other.
- the flow line data D0 associates, for example, each flow line with the map data D1 based on the position of the worker W at each time.
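The record structure of the flow line data D0 described above can be sketched in code. The following is an illustrative model only, not the disclosed implementation; all class, field, and method names (FlowLinePoint, FlowLine, add_point, latest) are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FlowLinePoint:
    """One recognized position of a worker at one time."""
    time: float  # capture time of the frame
    x: float     # X coordinate in the workplace coordinate system
    y: float     # Y coordinate

@dataclass
class FlowLine:
    """One entry of the flow line data D0: a flow line ID, the worker
    eventually associated with it, and its positions over time."""
    flow_line_id: int
    worker_id: Optional[str] = None  # filled in once associated with a worker
    points: List[FlowLinePoint] = field(default_factory=list)

    def add_point(self, time: float, x: float, y: float) -> None:
        self.points.append(FlowLinePoint(time, x, y))

    def latest(self) -> Optional[FlowLinePoint]:
        return self.points[-1] if self.points else None

# Example: record two successive recognized positions for flow line 1.
line = FlowLine(flow_line_id=1)
line.add_point(0.0, 2.0, 5.0)
line.add_point(0.1, 2.1, 5.0)
```

Updating the flow line data at each frame (step S6 below) would then amount to appending the newly recognized position to the matching FlowLine entry.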
- FIG. 3 is a diagram for explaining the map data D1.
- the map data D1 manages data indicating a coordinate system of a map, such as a layout of various facilities in which the workplace 6 is viewed from above, in association with the arrangement of sections and work areas, which will be described later.
- the two directions perpendicular to each other on the horizontal plane in the workplace 6 are referred to as the X direction and the Y direction.
- a position on the workplace 6 is defined, for example, by an X coordinate indicating a position in the X direction and a Y coordinate indicating a position in the Y direction.
- a transport line 61 and a shelf 62 corresponding to the workplace 6 shown in FIG. 1 are shown spaced apart in the X direction.
- the transport line 61 extends in the Y direction and transports boxes in the direction from positive to negative in the Y direction.
- the map data D1 of this example divides the workplace 6 into a plurality of sections in the Y direction and manages them.
- FIG. 3 shows an example in which the workplace 6 is divided into sections Z1 and Z2. Each of the sections Z1 and Z2 is set in advance as a unit section where a worker W performs the main work in the workplace 6, for example.
- each section Z1, Z2 includes a work area indicating the area where the worker W works in the workplace 6.
- the section Z1 shown in FIG. 3 includes a work area A1 near the transport line 61 and a work area A2 near the shelf 62.
- each work area A1, A2 is set in advance as a region indicating the range of positions in the workplace 6 where work related to the transport line 61 or the shelf 62, respectively, is considered to be performed.
- the storage unit 52 stores work area information that associates positions in the workplace 6 with work.
- the work area information is managed by associating the work area for each section in the work place 6 with the work performed in each work area.
- packing and box preparation work are performed in the work area A1 near the transport line 61 in the section Z1.
- collection work is performed in the work area A2 near the shelf 62 in the section Z1.
- for a work area associated with a plurality of works, such as the work area A1, the work area information may include, for example, information indicating the correspondence between each work and the positional relationship in the Y direction.
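The work-area lookup described above can be illustrated as follows. This is a sketch only: the area coordinates, the boundary value, and all function names are hypothetical, and the rule that box preparation occurs on the +Y (upstream) side of area A1 follows the description of the transport line.

```python
# Hypothetical work-area information for section Z1 (coordinates are invented
# for illustration). A1 lies near the transport line 61, A2 near the shelf 62.
AREAS = {
    "A1": {"x": (0.0, 2.0), "y": (0.0, 10.0)},
    "A2": {"x": (6.0, 8.0), "y": (0.0, 10.0)},
}

def in_area(pos, area):
    (x0, x1), (y0, y1) = area["x"], area["y"]
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1

def recognize_work(pos, box_prep_y=7.0):
    """Map a recognized position to a work name using work-area information.

    Two works are possible in area A1; box preparation is assumed to occur
    on the upstream (+Y) side of the transport line, i.e. y >= box_prep_y.
    """
    if in_area(pos, AREAS["A1"]):
        return "box preparation" if pos[1] >= box_prep_y else "packing"
    if in_area(pos, AREAS["A2"]):
        return "collection"
    return "moving"  # outside any work area: treated as movement
```

For example, a position deep in area A1 on the upstream side would be classified as box preparation, while one in area A2 would be classified as collection.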
- FIG. 4 is a diagram for explaining the work order information D2 in the work analysis device 5 of this embodiment.
- the work order information D2 is managed by associating a "section" in the workplace 6 with an "abnormal order" indicating an execution order of work by one worker W that is assumed to be abnormal.
- the section Z1 is associated with an abnormal order of "packing, moving, packing" in chronological order.
- such an abnormal order is set in advance on the assumption that, in the section Z1 for example, it is abnormal for a worker W who has just packed a box to move and then pack again while not carrying any items.
- the abnormal order is an example of the predetermined order in this embodiment.
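A check against such a predetermined abnormal order can be sketched as a simple pattern match on a worker's recent work sequence. The table contents and function name below are illustrative assumptions, not the patented logic.

```python
# Hypothetical work order information D2: per section, sequences assumed
# abnormal for a single worker, in chronological order.
ABNORMAL_ORDERS = {
    "Z1": [("packing", "moving", "packing")],
}

def is_abnormal(section, work_sequence):
    """Return True if the worker's recent works end with a sequence
    registered as abnormal for the given section."""
    for pattern in ABNORMAL_ORDERS.get(section, []):
        n = len(pattern)
        if tuple(work_sequence[-n:]) == pattern:
            return True
    return False
```

A candidate work combination whose sequence ends in "packing, moving, packing" in section Z1 would then be rejected as abnormal during worker discrimination.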
- the work analysis device 5 of the present embodiment also stores, for example, work tendency information related to the workplace 6 in the storage unit 52, in addition to the work order information D2.
- the work tendency information includes, for example, information such as a standard work period set for each work of each worker W1 to W3.
- the work tendency information may include information indicating various tendencies of the various work performed by the workers W to be analyzed in the workplace 6.
- the work tendency information may also include information indicating the classification of main work and sub-work in the analysis chart 7.
- the work analysis device 5 of the present embodiment stores authentication information D3 for identifying individuals such as each of the workers W1 to W3 in the storage unit 52.
- the authentication information D3 is acquired in advance through an authentication operation performed by each worker W1 to W3 when entering the workplace 6, for example by a card reader or the like installed in the workplace 6, and is sent to the work analysis device 5.
- the authentication information D3 includes, for example, information indicating the time when the authentication operation by each of the workers W1 to W3 was accepted. The operation of the work analysis device 5 using these various types of information will be described later.
- the work analysis system 1 illustrated in FIG. 1 recognizes, by image recognition processing, the position (that is, the flow line) of each worker W at each time in the workplace 6, and recognizes the work performed at each position at each time. The system 1 accumulates information indicating the recognition results of the performed work and, based on the accumulated information, generates an analysis chart 7 that visualizes the work performed by each of the workers W1 to W3 during the analysis period.
- the work analysis device 5 of the present embodiment associates each flow line ID with one of the workers W1 to W3 based on, for example, the time when the position of that flow line ID was first recognized in the flow line data D0 and the authentication information D3 including the time when each of the workers W1 to W3 entered the workplace 6.
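The association between flow line IDs and authenticated workers described above can be sketched as a nearest-time matching. The matching rule, tolerance, and all names below are assumptions for illustration; the patent only states that the first-recognized time and the authenticated entry time are used.

```python
def associate_flow_lines(first_seen, entry_times, tolerance=5.0):
    """Match each flow line ID to the worker whose authenticated entry
    time is closest to the time the flow line was first recognized.

    first_seen:  {flow_line_id: time the flow line was first recognized}
    entry_times: {worker_id: authentication time on entering the workplace}
    tolerance:   maximum allowed gap (seconds) between the two times.
    """
    mapping = {}
    available = dict(entry_times)  # workers not yet matched
    for fid, t in sorted(first_seen.items(), key=lambda kv: kv[1]):
        if not available:
            break
        worker = min(available, key=lambda w: abs(available[w] - t))
        if abs(available[worker] - t) <= tolerance:
            mapping[fid] = worker
            del available[worker]  # each worker matches at most one flow line
    return mapping
```

For instance, a flow line first seen 0.5 s after worker W1's card authentication would be assigned to W1, provided the gap is within the tolerance.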
- the image recognition unit 51 of the work analysis device 5 performs image recognition on the image of the workplace 6 captured by the camera 2 to recognize the position of the worker W and the work to be performed.
- the work analysis device 5 updates the flow line data D0 so as to associate each newly recognized position with the past positions of the workers W1 to W3, thereby discriminating which of the workers W1 to W3 performed the work at each recognized position.
- FIG. 5 is a diagram showing a first example for explaining a problem with the work analysis device 5.
- FIG. 6 is a diagram showing a second example for explaining the problem regarding the work analysis device 5.
- FIGS. 5 and 6 are top views of the workers W in the workplace 6, showing workers W1 and W2 working in the section Z1.
- FIG. 5(A) shows a scene where worker W1 is “collecting” and worker W2 is “packing”.
- FIG. 5B shows a scene in which the workers W1 and W2 are "moving" from the scene in FIG. 5A.
- FIG. 5(C) shows a scene where the worker W1 who has moved from the scene of FIG. 5(B) is "packing” and the worker W2 is "collecting”.
- the position of each worker W is recognized by image recognition from the image captured by the camera 2, and the work performed at each position is recognized according to the work areas A1, A2, etc. in FIG. 3.
- an occlusion occurs in which the worker W2 is shielded by the worker W1 in the line-of-sight direction of the camera 2 and does not appear in the captured image.
- when the flow lines of the workers W1 and W2 cross in this way, even if the work performed at the two positions is recognized as in the example of FIG. 5(C), it is difficult to determine from image recognition of each position alone which worker moved to which position and performed the corresponding work.
- FIG. 6(A) shows a scene where worker W1 is “moving” and worker W2 is “packing".
- FIG. 6(B) shows a scene in which worker W1 "moves” from the scene in FIG. 6(A) and worker W2 continues “packing”.
- FIG. 6(C) shows a scene in which worker W1 has moved further from the scene in FIG. 6(B).
- in this embodiment, processing for estimating the worker of each work is executed based on work tendency information such as the work order information D2.
- thus, even in a situation where the flow lines of a plurality of workers cross and it is difficult to discriminate the workers W by image recognition of each position alone, the worker W of each performed work can be discriminated.
- FIG. 7 is a flowchart for explaining the overall operation of the work analysis device 5.
- The processing shown in this flowchart is executed by the control unit 50 of the work analysis device 5, for example.
- the control unit 50 acquires image data during the analysis period from the camera 2, for example via the device I/F 54 (S1). For example, while the workers W1 to W3 are working in the workplace 6, the camera 2 shoots a moving image, generates image data representing the captured image at each time at a predetermined cycle such as the frame period of the moving image, and records it in internal memory.
- the camera 2 transmits image data recorded during the analysis period to the work analysis device 5 .
- the control unit 50 stores the acquired image data in the storage unit 52, for example.
- control unit 50 selects one frame of image data representing captured images at each time, for example, in time order from the acquired image data in the analysis period (S2).
- the control unit 50 records the time when the selected one frame was captured as the time in the flow line data D0, for example.
- the control unit 50 functions as an image recognition unit 51 and recognizes the position and work of the worker W in the image indicated by the selected one frame of image data (S3).
- the control unit 50 converts, for example, the position recognized in the image into a coordinate system indicating the position in the workplace 6 based on the map data D1. For example, based on the work area information, the control unit 50 recognizes the work to be performed at each recognized position depending on whether it is in the work area A1, A2, or other area.
- for example, suppose the positions of two workers W are recognized in the work area A1, which corresponds to the two works of packing and box preparation.
- in this case, the control unit 50 recognizes the work performed at the position of each worker W based on, for example, the relationship that box preparation is performed on the upstream side (+Y direction in FIG. 3) of the transport line 61.
- the control unit 50 detects a crossed state of flow lines based on the recognition result of step S3, for example (S4). For example, the control unit 50 detects whether occlusion has occurred due to overlapping of the positions of a plurality of workers W in the captured image of the selected frame. When the control unit 50 determines that occlusion has occurred and the recognized position in the workplace 6 is within a predetermined range of the latest positions of a plurality of flow lines in the flow line data D0, it determines that the flow lines are in a crossed state.
- the predetermined range is set in advance as a small range that can be regarded as the distance a worker W moves in the workplace 6 within a time interval such as one frame period.
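The crossed-state check of step S4 can be sketched as follows. The radius value and all names are illustrative assumptions; the patent only specifies the combination of detected occlusion and proximity to the latest positions of multiple flow lines.

```python
import math

def detect_crossed_state(occlusion, recognized_pos, latest_positions, radius=0.5):
    """Crossed-state check sketched from step S4: occlusion has occurred
    AND the recognized position lies within a small radius (the
    'predetermined range') of the latest positions of two or more flow lines.

    latest_positions: {flow_line_id: (x, y) at the latest time in D0}
    """
    if not occlusion:
        return False
    near = [fid for fid, p in latest_positions.items()
            if math.dist(recognized_pos, p) <= radius]
    return len(near) >= 2
```

When this returns False, the recognized position can simply be appended to the single nearby flow line (step S6); when True, the worker discrimination processing (step S5) takes over.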
- when no crossed state is detected, the control unit 50 updates the flow line data D0 so as to add each position recognized in step S3 as the position of the corresponding flow line ID (S6). At this time, the control unit 50 associates the work recognized for each position in step S3 with that position and the corresponding flow line ID in the flow line data D0, thereby determining the worker W1 to W3 associated with each performed work (S6).
- Information that associates each position of the flow line data D0 with the work performed and the worker W is an example of the work history information in this embodiment.
- when a crossed state is detected, the control unit 50 of the present embodiment compares the work recognized in the crossed state with the past performed work of each worker associated with the flow line data D0, and determines the worker W associated with each recognized work (S5). With such worker discrimination processing (S5) for the crossed state, the worker W can be estimated even in a crossed state where the worker W of each work cannot be determined simply by associating the position recognized in step S3 with a past flow line.
- the control unit 50 of the present embodiment performs the worker discrimination processing (S5) for the crossed state with reference to work tendency information such as the work order information D2. The details of this processing will be described later.
- hereinafter, the worker discrimination processing for the crossed state is also simply referred to as the worker discrimination processing.
- after determining the worker W for each performed work (S5, S6), the control unit 50 proceeds to step S7.
- the control unit 50 repeats the processes of steps S2 to S6 for the image data at the next time.
- by repeating the above, the flow line data D0 based on the image data at each time during the analysis period is obtained. Note that the processing of steps S3 to S6 may be performed for each section of the workplace 6 illustrated in FIG. 3.
- when all the frames in the analysis period have been selected (YES in S7), the control unit 50 performs visualization processing (S8) to generate the analysis chart 7.
- the control unit 50 counts, for example, the number of operations determined for each worker W1 to W3 in the workplace 6 for each time interval such as a period of one frame.
- the control unit 50 calculates the ratio of each work for each worker and generates the analysis chart 7 .
- the ratio of each task in the analysis chart 7 is shown as, for example, the ratio of the time of each task to the analysis period.
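The counting and ratio calculation of the visualization processing (S8) can be sketched as a per-worker tally over the frames of the analysis period. The data shape and function name below are assumptions for illustration.

```python
from collections import Counter

def work_ratios(work_per_frame):
    """Compute, for each worker, the ratio of each work over the analysis
    period, counting one frame period per determined work.

    work_per_frame: {worker_id: [work determined at each frame, in order]}
    Returns {worker_id: {work: fraction of the analysis period}}.
    """
    ratios = {}
    for worker, works in work_per_frame.items():
        counts = Counter(works)
        total = len(works)
        ratios[worker] = {w: c / total for w, c in counts.items()}
    return ratios
```

The resulting fractions per worker are what an analysis chart like chart 7 would display, optionally after grouping the works into main work, sub-work, and non-work.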
- control unit 50 stores the analysis chart 7 generated by the visualization process (S8) in the storage unit 52, and ends the process shown in this flowchart.
- according to the above processing, the worker W who works at each recognized position is determined by associating that position with a flow line (S6).
- when a crossed state is detected, the worker is discriminated by the worker discrimination processing (S5).
- then, the analysis chart 7 is generated based on the work performed by each worker over all the time intervals in the analysis period (S8).
- In step S1, the image data generated by the camera 2 may be acquired sequentially.
- In this case, the control unit 50 may repeat the processes from step S1 onward until the flow line data D0 based on the image data of the number of frames in the analysis period is obtained.
- In step S4, the control unit 50 may detect the crossed state of flow lines according to, for example, either occlusion in the captured image or the positions of the workers W in the workplace 6.
- FIG. 8 is a flowchart illustrating the worker determination process (S5 in FIG. 7) in the work analysis device 5 of this embodiment.
- FIG. 9 is a diagram for explaining work combinations in the worker identification process.
- FIGS. 9A and 9B illustrate work combination tables T1 and T2 corresponding to the scenes shown in FIGS. 5A to 5C and FIGS. 6A to 6C, respectively.
- the work combination tables T1 and T2 store a plurality of work combination candidates representing combinations of a plurality of workers W and a plurality of works. Each candidate indicates a task combination associated with a task sequence including two or more tasks for each worker.
- In each candidate, the work column indicates a work sequence in which the past works performed by each worker and the performed work whose worker is to be determined are arranged in order of the time at which they were recognized.
- In the worker discrimination process, the control unit 50 refers to work tendency information such as the work order information D2 and performs a process of determining, from the plurality of candidates in the work combination tables T1 and T2, one candidate as the work combination of the determination result. The control unit 50 then determines the worker W for each work according to the determined work combination.
- First, the control unit 50 calculates combinations of the works recognized after the time at which the mixed-line state was detected and the workers W associated with the past works, and generates the work combination tables T1 and T2 (S11). For example, the control unit 50 generates work combinations each including at least two types of work by referring to the past works and the workers W associated with the flow line data D0.
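The candidate generation of step S11 can be sketched as assigning the newly recognized works to the crossed workers in every possible permutation, appending each assignment to that worker's past work sequence. The function name and data structures below are assumptions for illustration:

```python
from itertools import permutations

def generate_candidates(past_works, new_works):
    """past_works: {worker_id: [earlier works...]} for the workers whose
    flow lines crossed. new_works: works recognized after the crossing,
    not yet attributable to a worker. Returns one candidate work
    combination per way of assigning the new works to the workers."""
    workers = sorted(past_works)
    candidates = []
    for perm in permutations(new_works):
        candidates.append({
            w: past_works[w] + [work] for w, work in zip(workers, perm)
        })
    return candidates

# Two workers in a mixed-line state, two newly recognized works.
past = {"W1": ["packing", "moving"], "W2": ["collection", "moving"]}
cands = generate_candidates(past, ["packing", "collection"])
# Two workers and two works yield two candidates, as in table T1.
```

Each candidate here plays the role of one row of the work combination tables T1 and T2; the later exclusion steps then narrow this list down.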
- the flow lines of workers W1 and W2 are in a crossed state in the scene of FIG. 5(B).
- At this time, the works performed by the two workers W are both recognized as "moving" (S3 in FIG. 7), and workers W1 and W2 can be associated with this same type of work without any particular discrimination.
- Next, the control unit 50 selects, for example, the frame at the next time, and recognizes the positions and works of the workers W based on the image data of that frame, in the same manner as in step S3 of FIG. 7.
- In this example, the frame corresponding to the scene of FIG. 5(C) is selected, and the positions of the workers W and the works "packing" and "collection" at the respective positions are recognized.
- Based on the workers W1 and W2 related to the mixed-line state and the works recognized from the frames corresponding to the scenes of FIGS. 5(A) to 5(C), the control unit 50 generates the work combination table T1 illustrated in FIG. 9(A). For example, when detecting the mixed-line state (S4 in FIG. 7), the control unit 50 determines the workers W1 and W2 corresponding to the plurality of flow lines in which crossing has occurred, based on the positions recognized in step S3 and the past positions of the workers W in the flow line data D0. For the two workers W1 and W2, the control unit 50 generates, as candidates, the possible assignments of each work that cannot be associated with a worker W on the basis of the flow lines recognized after the time of FIG. 5(B) (S11).
- Next, the control unit 50 excludes candidates from the work combination tables T1 and T2 based on the work order information D2 (S12). For example, the control unit 50 determines whether each candidate corresponds to an abnormal order in the work order information D2, and excludes any corresponding candidate from the work combination tables T1 and T2.
- In this example, the work combination of worker W2 included in candidate C11 corresponds to an abnormal order in the work order information D2. Therefore, candidate C11 is excluded from the work combination table T1 (S12).
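The order-based exclusion of step S12 could be sketched as follows, assuming (for illustration only) that abnormal orders are encoded as forbidden pairs of consecutive works:

```python
def violates_order(work_sequence, abnormal_pairs):
    """True if any two consecutive works form an abnormal order pair."""
    return any((a, b) in abnormal_pairs
               for a, b in zip(work_sequence, work_sequence[1:]))

def exclude_by_order(candidates, abnormal_pairs):
    """Drop candidates in which any worker's work sequence contains
    an abnormal order (step S12)."""
    return [c for c in candidates
            if not any(violates_order(seq, abnormal_pairs)
                       for seq in c.values())]

# Illustrative abnormal order: the same work twice in a row.
abnormal = {("packing", "packing")}
cands = [
    {"W1": ["packing", "packing"], "W2": ["collection", "collection"]},
    {"W1": ["packing", "collection"], "W2": ["collection", "packing"]},
]
kept = exclude_by_order(cands, abnormal)  # first candidate is excluded
```

The actual abnormal orders would come from the work order information D2; the pair encoding here is only one plausible representation of it.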
- Next, the control unit 50 excludes, from the work combination tables T1 and T2, candidates in which the work period of a performed work exceeds the standard work period (S13). For example, the control unit 50 calculates the work period of the latest performed work in each candidate's work sequence, and when that work period exceeds a predetermined period that significantly exceeds the standard work period, excludes the candidate containing that work sequence.
- The standard work period is calculated, for example, by averaging periods measured in advance a plurality of times as the period required for each work by each worker.
- The margin by which the predetermined period exceeds the standard work period is set, for example, to three times the standard deviation of the measured periods.
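As a minimal sketch of the threshold just described, assuming the mean and standard deviation of each work's period have been measured beforehand (the numbers below are illustrative):

```python
def exceeds_threshold(observed_period, standard_period, std_dev, k=3.0):
    """True if the observed work period exceeds the standard work period
    by more than k standard deviations (k = 3 in the example above),
    i.e. the candidate containing this work would be excluded in S13."""
    return observed_period > standard_period + k * std_dev

# Packing measured beforehand: mean 30 s, standard deviation 4 s.
assert not exceeds_threshold(40.0, 30.0, 4.0)  # 40 <= 30 + 3*4 = 42, kept
assert exceeds_threshold(45.0, 30.0, 4.0)      # 45 > 42, excluded
```

A three-sigma margin keeps normally fluctuating work periods while flagging durations unlikely to belong to a single execution of the work.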
- Next, the control unit 50 determines whether a plurality of candidates that have not been excluded remain in the work combination tables T1 and T2 (S14).
- When no candidate remains, the control unit 50 may relax the exclusion conditions based on the work tendency information and execute the exclusion again, so that at least one candidate remains at step S14.
- For example, the predetermined period used in the exclusion by work period (S13) may be set longer than in the above example.
- When only one candidate remains, the control unit 50 determines that remaining candidate as the work combination of the determination result (S16).
- In the illustrated examples, the remaining candidates C12 and C22 are determined as the work combinations of the determination results.
- On the other hand, when a plurality of candidates remain, the control unit 50 selects, from among them, the candidate in which the difference between the duration of the latest performed work in its work sequence and the standard work period is smallest (S15). The control unit 50 then determines the selected candidate as the work combination of the determination result (S16).
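The tie-break of steps S15 and S16 could be sketched as below. Summing the gaps over the workers in each candidate is an assumption; the source only specifies that the candidate with the smallest deviation from the standard work period is selected:

```python
def select_candidate(candidates, durations, standard_periods):
    """Among remaining candidates, pick the one whose latest performed
    works deviate least from their standard work periods.
    durations: {worker: observed duration of that worker's latest work};
    standard_periods: {work: standard work period}."""
    def gap(candidate):
        return sum(abs(durations[w] - standard_periods[seq[-1]])
                   for w, seq in candidate.items())
    return min(candidates, key=gap)

standard = {"packing": 30.0, "collection": 20.0}
durations = {"W1": 29.0, "W2": 21.0}  # observed latest-work durations
cands = [
    {"W1": ["moving", "packing"], "W2": ["moving", "collection"]},
    {"W1": ["moving", "collection"], "W2": ["moving", "packing"]},
]
best = select_candidate(cands, durations, standard)
# The first candidate wins: total gap 2.0 s versus 18.0 s.
```

The intuition is that the assignment whose implied durations look most like normal executions of each work is the most plausible one.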
- Next, from the determined work combination of the determination result, the control unit 50 determines the worker W for each work recognized at the latest time that could not be associated with a worker W based on the flow lines (S17).
- Furthermore, the control unit 50 updates the flow line data D0 by adding the determined position of each worker W at the latest time as the position of the corresponding flow line ID (S17).
- In the example of the work combination table T1, the workers W who perform the collection and packing works are determined to be workers W2 and W1, respectively, from candidate C12 determined as the work combination of the determination result. The flow line data D0 is then updated with the positions corresponding to collection and packing as the positions of workers W2 and W1, respectively.
- Similarly, in the example of the work combination table T2, the workers W who perform the packing and box preparation works are determined to be workers W1 and W2, respectively, from candidate C22 determined as the work combination of the determination result. The flow line data D0 is then updated with the positions corresponding to packing and box preparation as the positions of workers W1 and W2, respectively.
- After determining the workers W and updating the flow line data D0 (S17), the control unit 50 ends the process shown in this flowchart and then proceeds to step S7 of FIG. 7.
- In step S11 above, an example was described in which the work sequences of the work combination tables T1 and T2 each include works for three frames.
- However, the work sequence is not limited to works for each of three frame times, and may include, for example, three works of different types.
- In this case, the control unit 50 generates the work combination tables T1 and T2 by referring to the past works associated with the flow line data D0 until, for example, three types of work are obtained.
- Furthermore, the work sequence is not limited to three types of work, and may be generated by arranging the works performed in each predetermined period.
- The number of works is also not limited to three; for example, a work sequence in which two works are arranged may be generated.
- In addition, the candidates for the worker W may be narrowed down using the coordinate information in the flow line data D0 and the movement distance per unit time.
- For example, the control unit 50 may determine the worker W related to the mixed-line state in a work combination based on the moving speed derived from the worker's past positions, in addition to the past positions of the workers W in the flow line data D0.
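A speed-based feasibility check of the kind just mentioned could look as follows; the function name, units, and the maximum-speed value are assumptions for illustration:

```python
import math

def feasible_by_speed(last_pos, candidate_pos, dt, max_speed):
    """Discard an assignment if reaching candidate_pos from the worker's
    last known position within dt seconds would require moving faster
    than max_speed (coordinates in meters, speed in m/s)."""
    dist = math.hypot(candidate_pos[0] - last_pos[0],
                      candidate_pos[1] - last_pos[1])
    return dist / dt <= max_speed

# A worker 1 m away one frame (0.5 s) ago could plausibly be here...
assert feasible_by_speed((0.0, 0.0), (1.0, 0.0), 0.5, 2.0)
# ...but a position 5 m away would require 10 m/s, so it is rejected.
assert not feasible_by_speed((0.0, 0.0), (5.0, 0.0), 0.5, 2.0)
```

Such a check prunes candidate worker-to-position assignments before the work-based narrowing, using only the coordinates already stored in the flow line data.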
- As described above, the work analysis device 5 of this embodiment generates information regarding the plurality of workers W who perform a plurality of works in the workplace 6.
- The work analysis device 5 includes the device I/F 54 as an example of an acquisition unit, the control unit 50, and the storage unit 52.
- the device I/F 54 acquires image data representing an image of the workplace 6 (S1).
- Based on the image data, the control unit 50 generates, as an example of work history information indicating the works performed in the workplace 6 by the individual workers W1 to W3 among the plurality of workers W, information in which a work and a worker W are associated with each position of the flow line data D0 (S5, S6).
- The storage unit 52 stores the work history information.
- the control unit 50 sequentially recognizes the positions and operations of the plurality of workers W based on the image data for each time in the workplace 6 (S2, S3, S7).
- The control unit 50 detects a mixed-line state as an example of crossing between a plurality of flow lines including the positions of the plurality of workers W at each time (S4).
- When no mixed-line state is detected (NO in S4), the control unit 50 associates the work recognized at each time with the individual workers W1 to W3 based on the plurality of flow lines, and generates the work history information (S6).
- When a mixed-line state is detected (YES in S4), the control unit 50 associates the recognized work with the individual workers W1 to W3 based on the recognized work and the past work history information (S5).
- In this embodiment, the storage unit 52 stores work tendency information indicating the tendency of works performed in the workplace 6.
- When the mixed-line state is detected, the control unit 50 refers to the work tendency information and associates the recognized work with the individual workers W1 to W3 (S5). Thus, even when the workers' flow lines cross, the worker W for each work can be estimated.
- When the mixed-line state is detected, the control unit 50 generates a plurality of work combinations as an example of calculating combinations of the plurality of workers W corresponding to the plurality of flow lines in which crossing occurred and the plurality of recognized works (S11). Based on the work tendency information, the control unit 50 determines one work combination from among the plurality of work combinations (S16), and associates the recognized works with the individual workers W1 to W3 according to the determined work combination (S17). In this way, the work combination tables T1 and T2 containing a plurality of work combinations as candidates C11 to C22 are generated (S11), and the work combination of the determination result is determined by narrowing down the candidates C11 to C22 based on the work tendency information (S16). The worker W for each work can thereby be determined from the determined work combination.
- the work trend information includes work order information D2 as an example of information indicating the order in a combination of two or more works out of a plurality of works.
- According to the order indicated by the work order information D2, the control unit 50 excludes, from the plurality of work combinations, a work combination corresponding to an abnormal order as an example of a predetermined order (S12), and determines the one work combination (S16). As a result, a work combination that does not correspond to an abnormal order in the work order information D2 can be determined as the determination result.
- In this embodiment, the work tendency information includes information indicating the standard work period set for a first work among the plurality of works.
- According to the period during which the work of a worker W is recognized as the first work, the control unit 50 excludes, from the plurality of work combinations, a work combination in which that period exceeds the standard work period (S13), and determines the one work combination (S16).
- In the example described above, the work combination of candidate C22 is excluded in this manner.
- As a result, a work combination in which the work period of a specific work is consistent with the standard work period can be determined as the determination result.
- In this embodiment, the control unit 50 detects the mixed-line state according to a state in which the positions of workers W1 and W2 (an example of two or more workers) among the plurality of workers W are superimposed in the image indicated by the acquired image data (S4). As a result, the crossed-line state can be detected based on the positions of the workers W in the image.
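The overlap-based detection just described can be sketched with axis-aligned bounding boxes; the box representation and names below are assumptions for illustration:

```python
def boxes_overlap(a, b):
    """Axis-aligned bounding boxes (x1, y1, x2, y2); True if they intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def crossed_line_state(worker_boxes):
    """Detect a possible mixed-line state: any two workers' detected
    regions overlap in the current frame."""
    items = list(worker_boxes.items())
    return any(boxes_overlap(b1, b2)
               for i, (_, b1) in enumerate(items)
               for _, b2 in items[i + 1:])

# W1 and W2 overlap in the image, W3 is far away.
boxes = {"W1": (0, 0, 10, 10), "W2": (8, 2, 18, 12), "W3": (30, 30, 40, 40)}
assert crossed_line_state(boxes)
assert not crossed_line_state({"W1": (0, 0, 10, 10), "W3": (30, 30, 40, 40)})
```

When this check fires, identity tracking by position alone becomes unreliable, which is exactly the condition under which the worker discrimination process (S5) takes over.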
- the storage unit 52 further stores authentication information D3 that identifies individual workers W1 to W3.
- At the first time at which the position of a worker in the workplace 6 is recognized, the control unit 50 associates the recognized position with the individual worker W1 to W3.
- For example, the control unit 50 associates and manages the flow line ID of each position in the flow line data D0 with each of the workers W1 to W3.
- Thereafter, the flow line data D0 is updated so as to associate the sequentially recognized positions of the workers W (S3) with the past positions of the workers W1 to W3, and the workers W1 to W3 performing the work at each recognized position can be discriminated (S6).
- Based on the work history information for the analysis period (an example of a predetermined period), the control unit 50 generates the analysis chart 7 as an example of information indicating the ratio of the plurality of works over the analysis period for each of the individual workers W1 to W3. As a result, the analysis chart 7 relating to the plurality of workers W performing a plurality of works in the workplace 6 can be presented to, for example, the user 3 of the work analysis system 1.
- the work analysis device 5 may further include an output I/F 55 and/or a monitor 4 as an example of a display section that displays generated information such as the analysis chart 7 .
- the work analysis method in this embodiment is a method of generating information about a plurality of workers W who perform a plurality of tasks in the workplace 6 .
- This method includes a step (S1) in which the control unit 50 of a computer acquires image data showing an image of the workplace 6, and steps (S2 to S7) of generating, based on the image data, work history information indicating the works performed in the workplace 6 by individual workers among the plurality of workers W.
- In generating the work history information, the control unit 50 of the computer sequentially recognizes the positions and works of the plurality of workers W based on the image data for each time in the workplace 6 (S2, S3, S7), and detects crossing between a plurality of flow lines including the positions of the plurality of workers W at each time (S4).
- When no crossing is detected (NO in S4), the control unit 50 associates the work recognized at each time with the individual workers W1 to W3 based on the plurality of flow lines, and generates the work history information (S6). When crossing is detected (YES in S4), the control unit 50 associates the recognized work with the individual workers W1 to W3 based on the recognized work and the past work history information (S5).
- The present embodiment also provides a program for causing the control unit of a computer to execute the work analysis method described above.
- According to the work analysis method of this embodiment, when a plurality of workers W perform a plurality of works, the worker W for each work can be estimated.
- FIG. 10 is a diagram for explaining the work plan information D4 in the work analysis device 5 of this embodiment.
- the work plan information D4 is an example of work plan information in the present embodiment, and indicates quotas in the workplace 6, allocation of workers W, and the like.
- The work plan information D4 illustrated in FIG. 10 manages, in association with each "worker" in the workplace 6, a "shipping number" indicating the quota of packing, which is the main work, and a "section in charge", which is the section where the worker W mainly works.
- The section in charge indicates a range of positions in the workplace 6 where the worker W does not perform auxiliary work such as box preparation.
- the auxiliary work is an example of the second work in this embodiment.
- the number of items to be carried out and the section in charge are set in advance by the user 3 or the like, for example.
- In the example of FIG. 10, the section in charge of workers W1 and W2 is set to section Z1, and the section in charge of worker W3 is set to section Z2.
- The operation of the work analysis device 5 of this embodiment using the work plan information D4 described above will be described with reference to FIGS. 11 to 13.
- FIG. 11 is a flowchart illustrating the worker determination process of this embodiment.
- In the worker determination process (S5) of this embodiment, in addition to the processes of steps S11 to S17 of the first embodiment, the control unit 50 excludes candidates from the work combination table based on the work plan information D4 (S21).
- FIG. 12 is a diagram for explaining the worker determination processing of this embodiment.
- FIG. 13 is a diagram for explaining work combinations in the worker identification process of this embodiment.
- FIG. 12 is a view of the workplace 6 viewed from above, similar to the preceding figures.
- FIG. 12 shows how workers W1 and W2, and worker W3, who is in charge of a different section, work in section Z1 of the workplace 6.
- FIG. 12(A) shows a scene in which worker W1 is “collecting”, worker W2 is “packing”, and worker W3 who has entered section Z1 from section Z2 is “moving”.
- FIG. 12(B) shows a scene in which each of the workers W1 to W3 is “moving" from the scene of FIG. 12(A).
- FIG. 12(C) shows a scene in which workers W1 and W3, who have moved from the scene of FIG. 12(B), perform "packing" and "box preparation", respectively, and worker W2 performs "collection".
- An example in which the flow lines of workers W1 to W3 are mixed due to occlusion in the scene of FIG. 12B will be described below.
- The flowchart shown in FIG. 11 is started when a crossed-line state is detected (YES in S4 of FIG. 7), for example based on the image data of the frame corresponding to the scene of FIG. 12(B).
- First, the control unit 50 recognizes the positions and works of the workers W from the frame at the next time, corresponding to the scene of FIG. 12(C), and generates a work combination table (S11).
- FIG. 13 illustrates the work combination table T3 generated in step S11 according to the example of FIGS. 12(A) to (C).
- the control unit 50 excludes candidates from the work combination table T3 based on the work plan information D4 (S21).
- For example, the control unit 50 refers to the section in charge in the work plan information D4, and excludes from the work combination table T3 any candidate in which the latest work performed by a worker within his or her section in charge is the auxiliary box preparation work.
- Such exclusion rules based on the work plan information D4 are set in advance.
- In the example of the work plan information D4 of FIG. 10, since worker W1 is in charge of section Z1, candidates C35 and C36, in which the latest performed work of worker W1 is box preparation, are excluded from the work combination table T3. Similarly, for worker W2, candidates C32 and C34 are excluded from the work combination table T3.
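The section-based exclusion of step S21 could be sketched as follows; the dictionaries and the treatment of box preparation as the only auxiliary work are assumptions for illustration:

```python
def exclude_by_section(candidates, positions, sections,
                       auxiliary=("box preparation",)):
    """Drop candidates whose latest work for a worker is an auxiliary
    work while that worker is inside his or her own section in charge.
    positions: {worker: section the worker currently occupies};
    sections:  {worker: section in charge from the work plan}."""
    def ok(candidate):
        return all(not (positions[w] == sections.get(w)
                        and seq[-1] in auxiliary)
                   for w, seq in candidate.items())
    return [c for c in candidates if ok(c)]

sections = {"W1": "Z1", "W2": "Z1", "W3": "Z2"}
positions = {"W1": "Z1", "W2": "Z1", "W3": "Z1"}  # W3 has entered Z1
cands = [
    {"W1": ["box preparation"], "W2": ["packing"], "W3": ["collection"]},
    {"W1": ["packing"], "W2": ["collection"], "W3": ["box preparation"]},
]
kept = exclude_by_section(cands, positions, sections)
# Only the second candidate survives: W3 is outside its own section,
# so it may do box preparation, while W1 inside Z1 may not.
```

This mirrors the idea that, within their own section, workers W1 and W2 perform the main work, so candidates assigning them the auxiliary work there are implausible.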
- the control unit 50 excludes candidates from the work combination table T3 based on the work order information D2 and the work period (S12-S13).
- In this example, candidate C33 is excluded from the work combination table T3 because the work combination of candidate C33 for worker W2 corresponds to an abnormal order in the work order information D2 (S12).
- The control unit 50 then determines candidate C31, which remains in the work combination table T3 without being excluded, as the work combination of the determination result (S16).
- In this embodiment, the work tendency information includes the work plan information D4 as an example of information associating the individual workers W1 to W3 with a section in charge (an example of a range of responsibility).
- When the position of a worker W is included in that worker's section in charge, the control unit 50 excludes, from the plurality of work combinations, any work combination in which the work of the worker W is an auxiliary work (S21), and determines the one work combination (S16). As a result, a work combination in which each worker W performs the main work within his or her assigned section can be determined as the determination result.
- The exclusion based on the work plan information D4 (S21) is not limited to the section in charge and may use information related to various work plans. For example, a worker W who has already performed packing the number of times corresponding to the predetermined shipping number may be excluded from candidates that include packing at subsequent times.
- Embodiments 1 and 2 have been described as examples of the technology disclosed in the present application. However, the technology in the present disclosure is not limited to this, and can also be applied to embodiments in which modifications, substitutions, additions, omissions, etc. are made as appropriate. Moreover, it is also possible to combine the constituent elements described in the above embodiments to form a new embodiment. Therefore, other embodiments will be exemplified below.
- In the above embodiments, the work order information D2 that associates abnormal orders with the sections of the workplace 6 has been described.
- The work order information D2 may instead include, for example, a standard order indicating the execution order of works by one worker W for each section of the workplace 6.
- In the exclusion process (S12) based on the work order information in the worker determination process (FIGS. 8 and 11), candidates that do not correspond to the standard order may then be excluded from the work combination tables T1 to T3.
- In the above embodiments, the image recognition unit 51 outputs recognition results of the positions of the workers W1 to W3 as persons, without particularly distinguishing between them.
- Alternatively, the image recognition unit 51 may distinguish between the workers W1 to W3 and recognize their positions by, for example, face recognition technology.
- The operation of the work analysis device 5 of the above embodiments can be applied, for example, when the work performed by a worker W whose face is not shown in the captured image of the camera 2 is to be recognized.
- The work analysis device 5 of this embodiment may exclude the positions of workers who are not analysis targets from the work-related processing (S4 to S5, S8).
- For example, in the workplace 6, workers at positions outside the work areas A1 and A2 and the area therebetween, such as an area for replenishment work, are excluded.
- A worker who performs process control or monitoring in the workplace 6 may also be excluded from the analysis targets based on the tendency of that worker's flow line to pass through the area between the work areas A1 and A2 for a long time.
- In the above embodiments, the work analysis device 5 was described as excluding candidates based on preset work period information (S13).
- Alternatively, for example, the collection work period may be set for each worker based on the worker's past stay periods in the work area A2 on the shelf 62 side.
- Furthermore, the work analysis device 5 of this embodiment may perform the worker determination process using not only the work period of each work but also information on a period during which work is estimated to be performed outside the angle of view of the camera 2, or on a worker's rest period.
- the work analysis system 1 is applied to a workplace 6 such as a distribution warehouse.
- The workplace, that is, the site, to which the work analysis system 1 and the work analysis device 5 are applied is not limited to the workplace 6 described above, and may be any of various sites such as a factory or the sales floor of a store.
- the work determined by the work analysis system 1 is not limited to the above-described example of packing boxes, and may be various works according to various sites.
- the worker to be analyzed by the work analysis system 1 is not limited to a person such as the worker W, and may be any moving body capable of executing various types of work.
- the mobile object may be a robot or various manned or unmanned vehicles.
- the present disclosure is applicable to data analysis applications for analyzing workers' work in various environments such as logistics sites or factories.
Abstract
Description
1. Configuration
The work analysis system according to Embodiment 1 will be described with reference to FIG. 1. FIG. 1 is a diagram showing an overview of the work analysis system 1 according to this embodiment.
As shown in FIG. 1, the present system 1 includes a camera 2 and a work analysis device 5. The system 1 is applied to analyzing the efficiency and the like of workers W1, W2, and W3 who perform a plurality of works in a workplace 6 such as a distribution warehouse. Hereinafter, the workers W1 to W3 are also referred to as workers W. The system 1 may include a monitor 4 for presenting an analysis chart 7 for a predetermined analysis period to a user 3 such as a manager of the workplace 6 or an analyst. The analysis period is the period to be analyzed by image recognition or the like using the camera 2 in the system 1, and is set in advance to, for example, one day to several months.
FIG. 2 is a block diagram illustrating the configuration of the work analysis device 5. The work analysis device 5 illustrated in FIG. 2 includes a control unit 50, a storage unit 52, an operation unit 53, a device interface 54, and an output interface 55. Hereinafter, "interface" is abbreviated as "I/F".
The operation of the work analysis system 1 and the work analysis device 5 configured as described above will be described below.
Scenes that pose a problem in identifying the worker W of each performed work in the work analysis system 1 of this embodiment will be described with reference to FIGS. 5 and 6.
The overall operation of the work analysis device 5 in the work analysis system 1 will be described with reference to FIG. 7.
The details of the worker determination process in step S5 of FIG. 7 will be described with reference to FIGS. 8 and 9.
As described above, the work analysis device 5 of this embodiment generates information regarding a plurality of workers W who perform a plurality of works in the workplace 6. The work analysis device 5 includes the device I/F 54 as an example of an acquisition unit, the control unit 50, and the storage unit 52. The device I/F 54 acquires image data showing a captured image of the workplace 6 (S1). Based on the image data, the control unit 50 generates, as an example of work history information indicating the works performed in the workplace 6 by the individual workers W1 to W3 among the plurality of workers W, information in which a work and a worker W are associated with each position of the flow line data D0 (S5, S6). The storage unit 52 stores the work history information. The control unit 50 sequentially recognizes the positions and works of the plurality of workers W based on the image data for each time in the workplace 6 (S2, S3, S7). The control unit 50 detects a mixed-line state as an example of crossing between a plurality of flow lines including the positions of the plurality of workers W at each time (S4). When no mixed-line state is detected (NO in S4), the control unit 50 associates the work recognized at each time with the individual workers W1 to W3 based on the plurality of flow lines and generates the work history information (S6). When a mixed-line state is detected (YES in S4), the control unit 50 associates the recognized work with the individual workers W1 to W3 based on the recognized work and the past work history information (S5).
In Embodiment 1, the work analysis device 5 that performs the worker determination process based on the work period information and the work order information D2 was described. In Embodiment 2, a work analysis device 5 that performs the worker determination process further based on work plan information regarding the workplace 6 will be described.
As described above, Embodiments 1 and 2 have been described as examples of the technology disclosed in the present application. However, the technology in the present disclosure is not limited thereto and is also applicable to embodiments in which modifications, substitutions, additions, omissions, and the like are made as appropriate. It is also possible to combine the components described in the above embodiments to form a new embodiment. Other embodiments are therefore exemplified below.
Claims (11)
- A work analysis device that generates information regarding a plurality of workers who perform a plurality of works in a workplace, the work analysis device comprising:
an acquisition unit that acquires image data showing a captured image of the workplace;
a control unit that generates, based on the image data, work history information indicating works performed in the workplace by individual workers among the plurality of workers; and
a storage unit that stores the work history information,
wherein the control unit:
sequentially recognizes positions and works of the plurality of workers based on the image data for each time in the workplace;
detects crossing between a plurality of flow lines including the positions of the plurality of workers at each time;
when the crossing is not detected, generates the work history information by associating the work recognized at each time with the individual workers based on the plurality of flow lines; and
when the crossing is detected, associates the recognized work with the individual workers based on the recognized work and past work history information. - The work analysis device according to claim 1, wherein the storage unit stores work tendency information indicating a tendency of works performed in the workplace, and
when the crossing is detected, the control unit refers to the work tendency information to associate the recognized work with the individual workers. - The work analysis device according to claim 2, wherein, when the crossing is detected, the control unit:
calculates a plurality of combinations of the plurality of workers corresponding to the plurality of flow lines in which the crossing occurred and a plurality of recognized works;
determines one combination from the plurality of combinations based on the work tendency information; and
associates the recognized works with the individual workers according to the determined one combination. - The work analysis device according to claim 3, wherein the work tendency information includes information indicating an order in a combination of two or more works among the plurality of works, and
the control unit determines the one combination by excluding, according to the order, a combination corresponding to a predetermined order from the plurality of combinations. - The work analysis device according to claim 3 or 4, wherein the work tendency information includes information indicating a standard work period set for a first work among the plurality of works, and
the control unit determines the one combination by excluding, from the plurality of combinations, a combination in which a period during which the work of a worker is recognized as the first work exceeds the standard work period. - The work analysis device according to any one of claims 3 to 5, wherein the work tendency information includes information associating the individual workers with a section in charge indicating a range of positions in the workplace where the individual worker does not perform a second work among the plurality of works, and
the control unit determines the one combination by excluding, from the plurality of combinations, a combination in which the work of a worker is the second work when the position of the worker is included in the section in charge of the worker. - The work analysis device according to any one of claims 1 to 6, wherein the control unit detects the crossing according to a state in which positions of two or more workers among the plurality of workers are superimposed in the image indicated by the acquired image data. - The work analysis device according to any one of claims 1 to 7, wherein the storage unit further stores authentication information identifying the individual workers, and
the control unit associates a recognized position with an individual worker at the first time at which the position of the worker in the workplace is recognized. - The work analysis device according to any one of claims 1 to 8, wherein the control unit generates, based on the work history information for a predetermined period, information indicating a ratio of the plurality of works over the predetermined period for each of the individual workers. - A work analysis method for generating information regarding a plurality of workers who perform a plurality of works in a workplace, the method comprising:
a step in which a control unit of a computer acquires image data showing a captured image of the workplace; and
a step of generating, based on the image data, work history information indicating works performed in the workplace by individual workers among the plurality of workers,
wherein, in the step of generating the work history information, the control unit of the computer:
sequentially recognizes positions and works of the plurality of workers based on the image data for each time in the workplace;
detects crossing between a plurality of flow lines including the positions of the plurality of workers at each time;
when the crossing is not detected, generates the work history information by associating the work recognized at each time with the individual workers based on the plurality of flow lines; and
when the crossing is detected, associates the recognized work with the individual workers based on the recognized work and past work history information. - A program for causing a control unit of a computer to execute the work analysis method according to claim 10.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023527523A JPWO2022259690A1 (ja) | 2021-06-11 | 2022-03-18 | |
EP22819880.0A EP4354388A1 (en) | 2021-06-11 | 2022-03-18 | Task analysis device and method |
CN202280040369.8A CN117441189A (zh) | 2021-06-11 | 2022-03-18 | 作业分析装置以及方法 |
US18/530,573 US20240112498A1 (en) | 2021-06-11 | 2023-12-06 | Image analysis device and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-098078 | 2021-06-11 | ||
JP2021098078 | 2021-06-11 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/530,573 Continuation US20240112498A1 (en) | 2021-06-11 | 2023-12-06 | Image analysis device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022259690A1 true WO2022259690A1 (ja) | 2022-12-15 |
Family
ID=84425176
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/012833 WO2022259690A1 (ja) | 2021-06-11 | 2022-03-18 | 作業分析装置及び方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240112498A1 (ja) |
EP (1) | EP4354388A1 (ja) |
JP (1) | JPWO2022259690A1 (ja) |
CN (1) | CN117441189A (ja) |
WO (1) | WO2022259690A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011034234A (ja) * | 2009-07-30 | 2011-02-17 | Kozo Keikaku Engineering Inc | 動作分析装置、動作分析方法及び動作分析プログラム |
JP2013196029A (ja) * | 2012-03-15 | 2013-09-30 | Fujitsu Ltd | 移動物体の動線補間装置、方法、及びプログラム |
JP2017010224A (ja) * | 2015-06-19 | 2017-01-12 | キヤノン株式会社 | 物体追尾装置、物体追尾方法及びプログラム |
WO2018198373A1 (ja) | 2017-04-28 | 2018-11-01 | 株式会社日立国際電気 | 映像監視システム |
JP2019185724A (ja) * | 2018-03-30 | 2019-10-24 | ダイキン工業株式会社 | 情報管理システム |
JP2020098590A (ja) * | 2018-12-13 | 2020-06-25 | 田中 成典 | 移動物追跡装置 |
JP2021056671A (ja) * | 2019-09-27 | 2021-04-08 | 三菱電機株式会社 | 作業状態判別装置、作業状態判別方法およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
CN117441189A (zh) | 2024-01-23 |
US20240112498A1 (en) | 2024-04-04 |
JPWO2022259690A1 (ja) | 2022-12-15 |
EP4354388A1 (en) | 2024-04-17 |