WO2023276332A1 - Work analysis device and method - Google Patents
Work analysis device and method
- Publication number
- WO2023276332A1 (PCT/JP2022/013283; JP2022013283W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- work
- detection
- worker
- detected
- control unit
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- FIG. 2 is a block diagram illustrating the configuration of the work analysis device 5.
- The work analysis device 5 illustrated in FIG. 2 includes a control unit 50, a storage unit 52, an operation unit 53, an equipment interface 54, and an output interface 55.
- Hereinafter, the interface is abbreviated as "I/F".
- FIG. 3 is a diagram for explaining the problem with the work analysis device 5.
- FIG. 3 shows a captured image Im, taken by the camera 2, of the worker W carrying out the transport work in the work place 6.
- FIG. 4 is a diagram for explaining the operation of the work analysis device 5.
- FIGS. 4A and 4B illustrate work timelines 70 and 7 before and after correction by the work analysis device 5, respectively, corresponding to the example of FIG. 3.
- Each work timeline 70, 7 is an example of work detection information in this embodiment.
- the worker W is carrying a load, but the load is hidden behind the worker W's body in the line-of-sight direction of the camera 2.
- the transport work may not be detected by image recognition.
- the target work is not detected in the period T1 including the time tb corresponding to FIG. 3B.
- The control unit 50 repeats the processing of steps S1 to S3 for the image data at the next time.
- When all frames in the analysis period have been selected (YES in S4), the control unit 50 performs visualization processing (S5) to generate, from the detection results for each frame, the work timeline 7 to be viewed by the user. In step S5, the corrected work timeline 7 is generated based on the detection results interpolated in the work detection process (S3).
- the work of the worker W is detected at each time (S2 to S4), and the detection result in the non-detection period T1 is interpolated with the target work (S3).
- the work timeline 7 can be accurately obtained from the detection results for each time.
- FIG. 6 is a flowchart illustrating the work detection process (S3 in FIG. 5) in the work analysis device 5 of this embodiment.
- FIG. 7 is a diagram for explaining work detection processing.
- the control unit 50 stores the time when the target work was detected based on the current detection result (S15).
- The time at which the target work is detected is recorded, for example, as the imaging time of the frame in which the transport work was detected.
- the control unit 50 stores, for example, in the storage unit 52, the time ta at which the "take" stroke was detected.
- the control unit 50 may retain the detected stroke in association with the time ta at which the transportation work was detected.
- When the control unit 50 has stored the current detection result in the storage unit 52 (S16), it ends the work detection process (S3) shown in this flowchart for the one frame selected in step S2. After that, the control unit 50 returns to the flowchart of FIG. 5 and proceeds to step S4. For example, after the work detection process (S3) is executed for the frame of FIG. 3A, the process branches to NO in step S4, the frame at time tb corresponding to the scene of FIG. 3B is selected (S2), and the work detection process (S3) is performed again.
- In the frame at time tb, the image recognition in step S11 does not detect the process of "carrying" the load in the transport work, and it is determined that the target work has not been detected (NO in S12). In this way, the non-detection period T1 illustrated in the pre-correction work timeline 70 of FIG. 4(A) occurs.
- FIG. 7 illustrates a captured image Im including a period during which the worker W does not carry out the transportation work.
- The work analysis device 5 of the present embodiment performs the work detection process (S3) so as not to erroneously interpolate the detection result while the worker W is not carrying out the transport work. This point will be described with reference to FIG. 7.
- FIG. 7(A) shows a scene where the worker W puts the luggage at the carry-out port 62, similar to the scene in FIG. 3(C).
- the target work is detected after time tc, for example, according to the captured image Im of FIG. 7A.
- In this case, the target work is not detected in step S11 (NO in S12).
- The control unit 50 then proceeds to step S16 to store the current detection result, and thereafter proceeds to NO in step S4 of FIG. 5.
- The control unit 50 determines that the target work has been detected, in the same manner as when the process of "taking" the load was detected in the example of FIG. 3, and the time is stored (S15).
- the control unit 50 stores in the storage unit 52 the time at which the “carrying” stroke was detected, in addition to the time at which the “taking” stroke was detected, for example.
- For example, the control unit 50 interpolates the detection result up to the time at which the immediately preceding "taking" stroke was detected (S14). Accordingly, even when the target work is detected at a time in the middle of the non-detection period T1, the detection result can be interpolated with high accuracy.
- When storing the current detection result (S16), the control unit 50 may store only whether the detected work is the transport work or non-work, that is, whether or not it is the target work, instead of the category of each process of the transport work.
- FIG. 8 is a flowchart illustrating the work detection process in the work analysis device 5 according to a modification of Embodiment 1. In this modification, instead of the determination of step S13 in FIG. 6, the control unit 50 determines whether or not the current process is "taking", based on the detection result acquired in step S11 (S13A).
- The plurality of processes include the process of picking up a load as an example of a first process that starts the target work, the process of carrying the load as an example of a second process that continues the target work, and the process of placing the load as an example of a third process that ends the target work.
- the accurately corrected work timeline 7 can be obtained without erroneously interpolating the detection result of the period in which the target work is not performed.
- the target work is a transport work of moving a load as an example of an object.
- The first process is the process of picking up the load (an example of a process of starting the movement of the object), the second process is the process of carrying the load (an example of a process of continuing the movement of the object), and the third process is the process of placing the load (an example of a process of ending the movement of the object).
- the objects in the transportation work are not limited to packages, and may be various articles such as parts.
- the target work is not limited to transportation work, and may be assembly work for attaching parts.
- Further, a program is provided for causing the control unit of a computer to execute the work analysis method described above. According to the work analysis method of the present embodiment, the target work, such as the transport work as an example of a specific work performed by the worker W, can be detected accurately.
- (Embodiment 2) Embodiment 1 described the work analysis device 5 that realizes the interpolation processing by machine learning, using the work detection model 51 to identify each process of the transport work; such machine learning is not essential, however. Embodiment 2 describes a work analysis device 5 that realizes the interpolation processing by using the moving direction of the worker W corresponding to a specific stroke of the transport work.
- FIG. 9 is a diagram for explaining work detection processing in the work analysis device 5 of this embodiment.
- FIG. 10 is a flowchart illustrating work detection processing according to the present embodiment.
- FIG. 9(A) shows a captured image Im of a worker W carrying a load from the transport line 61 to the carry-out port 62 as in FIG. 3(B).
- FIG. 9(B) shows a captured image Im of the worker W turning from the carry-out port 62 back to the transport line 61, in the same manner as in FIG. 7(B).
- A predetermined direction is set in advance on the captured image Im, corresponding to the process in which the worker W carries the load in the work place 6.
- For example, the direction from the transport line 61 to the carry-out port 62 is set as the predetermined direction.
- In the present embodiment, instead of the determination (S13) of the process identified by the work detection model 51 in the work detection process (S3) of Embodiment 1, the control unit 50 determines whether or not the moving direction of the worker W is the predetermined direction (S13B).
- the control unit 50 detects the transportation work based on the image data, for example (S11), and determines the movement direction of the worker W based on the position of the detection area in the obtained detection result.
- Specifically, the control unit 50 compares the position of the previous detection area with the position of the current detection area, and determines whether or not the moving direction of the worker W is the predetermined direction (S13B).
- In the example of FIG. 9(A), the worker W is moving in the predetermined direction, and it can be estimated that the worker W is carrying out the process of carrying a load.
- In this case (YES in S13B), the control unit 50 interpolates, with the target work, the detection results from the time of the current target work back to the time of the previous target work (S14).
- In the example of FIG. 9(B), on the other hand, the worker W is moving in a direction different from the predetermined direction, and it can be estimated that the worker W is not carrying out the process of carrying a load. When the control unit 50 determines that the moving direction of the worker W is not the predetermined direction (NO in S13B), it proceeds to step S15 without interpolating the detection result.
- As described above, in the present embodiment, when the target work as an example of the work is detected (YES in S12), the control unit 50 determines, based on the image data, whether or not the direction in which the worker W moved is the predetermined direction (S13B), and corrects the work detection information, such as the pre-correction work timeline 70, based on the result of the direction determination (S14 to S16).
- the predetermined direction is set on the captured image Im as an example of the image indicated by the image data, corresponding to the process of carrying the luggage as an example of the second process.
- the work analysis device 5 of the present embodiment can also interpolate the detection result of the non-detection period T1 to obtain an appropriately corrected work timeline 7 (S5).
- In step S11, the control unit 50 may acquire the detection results of the work detection model 51 and an image recognition model, and in step S13B may compare the positions of the worker W recognized in the previous and current frames to determine the direction of movement of the worker W.
- the work detection model 51 may be machine-learned so as to output the detection area of the worker W who is not performing the target work as a category different from the detection area of the worker W who is performing the target work.
- FIG. 11 is a diagram for explaining work detection processing in the work analysis device 5 of this embodiment.
- FIG. 12 is a flowchart illustrating work detection processing according to the present embodiment.
- The control unit 50 determines, based on the position of the detection area in the detection result of the work (S11), whether or not the current detection area is within the first exclusion area 81 (S13C). For example, the control unit 50 determines that the detection area is included in the first exclusion area 81 when the ratio of the detection area overlapping the first exclusion area 81 in the captured image Im is equal to or greater than a predetermined ratio.
- The predetermined ratio is set in advance as a ratio (for example, 80%) large enough that the worker W's body can be regarded as having entered the first exclusion area 81.
- The determination in step S13C may also be made according to the positional relationship with an area on a map corresponding to the first exclusion area 81.
- In this case, the control unit 50 determines that the detection region R1 is included in the first exclusion region 81 (YES in S13C), and therefore does not execute the interpolation processing (S14).
- Otherwise (NO in S13C), the control unit 50 performs the interpolation processing (S14).
- As described above, in the present embodiment, the control unit 50 determines, based on the image data, whether or not the detection position where the worker W was located when the target work (an example of work) was detected, defined by the position on the image, is included in the first exclusion region 81 as an example of at least one predetermined region (S13C), and corrects the work timeline 7 (an example of work detection information) based on the determination result of the detection position. The at least one predetermined area is set on the captured image Im, as an example of the image indicated by the image data, corresponding to at least one of the first and third processes. For example, the first exclusion area 81 is set corresponding to the process of picking up the load as an example of the first process. As a result, the work timeline 7 can be corrected based on the detected positions without using the identification result of each process of the transport work by the work detection model 51.
- When the detection area is not included in the first exclusion area 81 (NO in S13C), the control unit 50 interpolates the detection result of the target work in the non-detection period T1 (S14); when it is included (YES in S13C), the control unit 50 does not interpolate the detection result of the target work in a non-detection period such as the period T2.
- When the detection position is included in the first exclusion area 81, it can be estimated that the process after the non-detection period corresponds to the process of picking up a load, that is, the process of starting a new work, so the detection result of the period T2 is not interpolated, and an accurately corrected work timeline 7 can be obtained.
- In Embodiment 3, the work analysis device 5 uses the area corresponding to the process of starting the transport work for the interpolation processing in the work detection process. Embodiment 4 describes a work analysis device 5 that uses, in addition to the starting process, an area corresponding to the process of ending the transport work.
- FIG. 13 is a diagram for explaining work detection processing in the work analysis device 5 of this embodiment.
- FIG. 14 is a flowchart illustrating work detection processing according to the present embodiment.
- FIG. 13(A) shows a captured image Im of the worker W placing a load at the carry-out port 62, similarly to FIG. 7(A).
- FIG. 13(B) shows a captured image Im of the worker W who picks up the load from the transfer line 61 after placing the load from the scene of FIG. 13(A).
- FIG. 13 also illustrates the second exclusion area 82, which is preset as a region on the captured image Im. The second exclusion area 82 is, for example, an area including the position where the load taken from the transport line 61 and carried by the worker W is placed at the carry-out port 62 in the work place 6, that is, the position where the transport work is completed.
- the first exclusion area 81 and the second exclusion area 82 are examples of the predetermined area in this embodiment, and examples of the start area and end area in this embodiment, respectively.
- In the scene of FIG. 13(A), when the detection area R2 in which the transport work is detected is included in the second exclusion area 82, it can be presumed that the worker W is performing the process of placing the load, that is, the process of completing one transport work.
- In FIG. 13(B), when the detection region R1 is included in the first exclusion region 81, it can be estimated that the worker W is performing the process of picking up the next load, that is, the process of starting the next transport work.
- During the period between these scenes, it is considered that the worker W turns from the carry-out port 62 back to the transfer line 61 without holding a load, that is, does not perform the target work.
- On the other hand, it is considered that the worker W is performing some process of the transport work during the non-detection period T1 immediately before the transport work is detected.
- the work analysis device 5 interpolates the detection result of the non-detection period T1.
- In the present embodiment, in addition to determining whether the current detection area is within the first exclusion area 81, the control unit 50 determines whether the detection area of the previous target work is within the second exclusion area 82 (S13D).
- That is, the control unit 50 determines whether or not the current detection area is within the first exclusion area 81 and the detection area of the previous target work is within the second exclusion area 82 (S13D). For example, the control unit 50 refers to the time of the previous target work stored in step S15 and the past detection results stored in step S16 to identify the detection area of the previous target work. The control unit 50 then determines whether the current and previous detection areas are within the first exclusion area 81 and the second exclusion area 82, respectively, based on, for example, the same criteria as in step S13C of the work detection process of Embodiment 3.
- When both conditions are satisfied (YES in S13D), the control unit 50 skips the interpolation processing (S14) and proceeds to step S15.
- Otherwise (NO in S13D), the control unit 50 performs the interpolation processing (S14).
- In the present embodiment, the at least one predetermined area includes the first exclusion area 81 corresponding to the process of picking up the load, as an example of a start area corresponding to the first process, and the second exclusion area 82 corresponding to the process of placing the load, as an example of an end area corresponding to the third process.
- In at least one of the case where the immediately preceding detection area is not included in the second exclusion area 82 and the case where the current detection area, as an example of the detection position after the non-detection period, is not included in the first exclusion area 81, the control unit 50 interpolates the detection result of the target work (an example of work) in the non-detection period T1 (S14).
- When the detection region R2, as an example of the detection position before the non-detection period, is included in the second exclusion region 82, and the detection region R1, as an example of the detection position after the non-detection period, is included in the first exclusion region 81, the control unit 50 does not interpolate the detection result of the target work in a non-detection period such as the period T2.
- In other words, the control unit 50 determines whether or not the detection area of the previous target work, as an example of the detection position recognized before the non-detection period, is included in the second exclusion area 82 (an example of the end area) in the captured image Im, as an example of the image indicated by the image data, and whether or not the current detection area, as an example of the detection position recognized after the non-detection period, is included in the first exclusion area 81 (an example of the start area) (S13D).
- Embodiments 1 to 4 have been described as examples of the technology disclosed in the present application.
- the technology in the present disclosure is not limited to this, and can be applied to embodiments in which modifications, substitutions, additions, omissions, etc. are made as appropriate.
- In the above embodiments, the work analysis device 5 realizes the interpolation processing by determining the process after the non-detection period (S13, S13A) in the work detection process (S3).
- The work analysis device 5 of the present embodiment may further perform the interpolation processing according to the process before the non-detection period. For example, if the control unit 50 of the work analysis device 5 retains the process information together with the time of the target work in step S15, then even after a non-detection period occurs, when the target work is next detected, the above processing can be performed by referring to the information from the previous detection of the target work. Such modifications will be described with reference to FIGS. 15 and 16.
- FIG. 15 illustrates a flowchart of work detection processing in this modified example.
- In this modification, the control unit 50 determines whether or not the process of the target work detected immediately before is "placing" (S21). The control unit 50 performs the determination of step S21 by referring, for example, to the process information held in step S15 of the immediately preceding work detection process.
- When the immediately preceding process is "placing" (YES in S21) and the current process is "taking" (YES in S13A), the control unit 50 does not interpolate a non-detection period such as the period T2. The control unit 50 interpolates the non-detection period T1 (S14) in at least one of the case where the immediately preceding process is not "placing" (NO in S21) and the case where the current process is not "taking" (NO in S13A).
- FIG. 16 illustrates a flowchart of work detection processing in this modified example.
- In this modification, by the same judgment as in step S21 of FIG. 15, the control unit 50 does not interpolate the detection result of a non-detection period such as the period T2 when the immediately preceding process is "placing" (YES in S21).
- The control unit 50 interpolates the non-detection period T1 when the immediately preceding process is not "placing" (NO in S21), that is, when the immediately preceding process is "taking" or "carrying".
- As described above, the work analysis device 5 may have the same configuration as in each of the above embodiments, with the control unit 50 correcting the work detection information, such as the pre-correction work timeline 70, so as to interpolate the detection result of the target work in the non-detection period T1 according to the processes at the times when the target work was detected before and after the non-detection period. This also makes it possible to accurately detect the target work as an example of the specific work performed by the worker W.
- For example, when the process before the non-detection period is the third process and the process after the non-detection period is the first process, the control unit 50 may refrain from interpolating the detection result of a non-detection period such as the period T2. In at least one of the case where the process before the non-detection period is the first or second process and the case where the process after the non-detection period is the second or third process, the control unit 50 may interpolate the detection result of the work in the non-detection period T1.
- Alternatively, the control unit 50 may decide from the preceding process alone, not interpolating the non-detection period when the process preceding it is the third process, and interpolating the non-detection period when the process preceding it is the first or second process.
- In the above embodiments, the interpolation processing (S14) was performed according to the process (S13) identified at the time when the target work was detected.
- the work analysis device 5 of the present embodiment may perform interpolation processing (S14) according to the period from the current time to the time of the previous target work in addition to the process at the current time.
- For example, the interpolation may be performed only when the non-detection period of interest is shorter than a predetermined threshold.
- the predetermined threshold value is set in advance as a short period that can be regarded as a period required for the worker W to carry the load, for example.
- In the above embodiments, the target work of the work analysis system 1 is one type of transport work.
- the work analysis system 1 of this embodiment may be applied to a plurality of works as target works. For example, when detecting a plurality of types of transportation work for transporting different types of objects, interpolation processing may be performed on the detection results of each transportation work in the same manner as in the above-described embodiments.
- In the above embodiments, the target work of the work analysis system 1 is the transport work.
- the target work of the work analysis system 1 is not limited to the transportation work, and may be any work including a plurality of processes.
- the present disclosure is applicable to data analysis applications for analyzing workers' work in various environments such as logistics sites or factories.
Description
1. Configuration
A work analysis system according to Embodiment 1 will be described with reference to FIG. 1. FIG. 1 is a diagram showing an overview of the work analysis system 1 according to the present embodiment.
As shown in FIG. 1, the system 1 includes a camera 2 and a work analysis device 5. The system 1 is applied to uses such as analyzing the efficiency of a worker W who performs work such as transport work in a work place 6 such as a distribution warehouse. The system 1 may include a monitor 4 for presenting a work timeline 7 for a predetermined analysis period to a user 3 such as a manager of the work place 6 or an analyst. The analysis period is the period to be analyzed in the system 1 by image recognition or the like using the camera 2, and is set in advance to, for example, one day to several months.
FIG. 2 is a block diagram illustrating the configuration of the work analysis device 5. The work analysis device 5 illustrated in FIG. 2 includes a control unit 50, a storage unit 52, an operation unit 53, an equipment interface 54, and an output interface 55. Hereinafter, the interface is abbreviated as "I/F".
The operation of the work analysis system 1 and the work analysis device 5 configured as described above will be described below.
Scenes that pose a problem in identifying the work of the worker W at each time in the work analysis system 1 of the present embodiment will be described with reference to FIGS. 3 and 4. In the following, an example in which the target work is transport work is described.
The overall operation of the work analysis device 5 in the work analysis system 1 will be described with reference to FIG. 5.
Details of the work detection process in step S3 of FIG. 5 will be described with reference to FIGS. 6 and 7.
In Embodiment 1, the work analysis device 5 performs the interpolation processing (S14) when the current process is "placing" (YES in S13); however, the interpolation processing may instead be performed according to whether or not the current process is the "taking" process that starts the transport work. Such a modification will be described with reference to FIG. 8.
As described above, the work analysis device 5 in the present embodiment generates information about the work performed by the worker W. The work analysis device 5 includes the equipment I/F 54 as an example of an acquisition unit, and the control unit 50. The equipment I/F 54 acquires image data showing an image in which the worker W performing the work is captured (S1). Based on the image data, the control unit 50 sequentially detects the target work, as an example of the work performed by the worker W (S2 to S4), and generates the work timeline 7 as an example of work detection information indicating the detection result of the target work (S5). The target work includes a plurality of processes performed by the worker W. The control unit 50 corrects the work detection information, such as the pre-correction work timeline 70, so as to interpolate the detection result of the target work in the non-detection period T1 according to the process at the time when the target work was detected after the non-detection period in which the target work was not detected (S3).
In Embodiment 1, the work analysis device 5 realizes the interpolation processing by machine learning that identifies each process of the transport work using the work detection model 51; however, such machine learning is not essential. Embodiment 2 describes a work analysis device 5 that realizes the interpolation processing by using the moving direction of the worker W corresponding to a specific process of the transport work.
Embodiment 2 described a work analysis device 5 that realizes the interpolation processing in the work detection process by using the moving direction of the worker W corresponding to a specific process of the target work. Embodiment 3 describes a work analysis device 5 that realizes the interpolation processing by using an area in the work place 6 where the worker W is located in correspondence with a specific process.
Embodiment 3 described a work analysis device 5 that uses the area corresponding to the process of starting the transport work for the interpolation processing in the work detection process. Embodiment 4 describes a work analysis device 5 that uses, in addition to the starting process, an area corresponding to the process of ending the transport work.
As described above, Embodiments 1 to 4 have been described as examples of the technology disclosed in the present application. However, the technology in the present disclosure is not limited thereto, and is also applicable to embodiments in which modifications, substitutions, additions, omissions, and the like are made as appropriate. It is also possible to combine the constituent elements described in the above embodiments to form new embodiments. Other embodiments are therefore exemplified below.
Claims (10)
- 1. A work analysis device that generates information about work performed by a worker, the device comprising: an acquisition unit that acquires image data showing an image in which the worker performing the work is captured; and a control unit that sequentially detects, based on the image data, the work performed by the worker and generates work detection information indicating a detection result of the work, wherein the work includes a plurality of processes performed by the worker, and the control unit corrects the work detection information so as to interpolate the detection result of the work in a non-detection period in which the work was not detected, according to the processes at the times when the work was detected before and after the non-detection period.
- 2. The work analysis device according to claim 1, wherein the plurality of processes include a first process of starting the work, a second process of continuing the work, and a third process of ending the work, and the control unit interpolates the detection result of the work in the non-detection period in at least one of a case where the process after the non-detection period is the second process and a case where the process after the non-detection period is the third process.
- 3. The work analysis device according to claim 2, wherein the control unit does not interpolate the detection result of the work in the non-detection period when the process after the non-detection period is the first process.
- 4. The work analysis device according to claim 2 or 3, wherein the work is a transport work of moving an object, and the transport work includes a process of starting movement of the object as the first process, a process of continuing the movement of the object as the second process, and a process of ending the movement of the object as the third process.
- 5. The work analysis device according to any one of claims 1 to 4, wherein the control unit identifies, based on the image data, the process performed by the worker at the time of detection of the work, and corrects the work detection information based on an identification result of the process.
- 6. The work analysis device according to any one of claims 2 to 4, wherein the control unit determines, based on the image data, whether or not a direction in which the worker moved at the time of detection of the work is a predetermined direction, and corrects the work detection information based on a determination result of the direction, and the predetermined direction is set on the image indicated by the image data, corresponding to the second process.
- 7. The work analysis device according to any one of claims 2 to 4, wherein the control unit determines, based on the image data, whether or not a detection position where the worker was located at the time of detection of the work is included in at least one predetermined area, and corrects the work detection information based on a determination result of the detection position, and the at least one predetermined area is set on the image indicated by the image data, corresponding to at least one of the first and third processes.
- 8. The work analysis device according to claim 7, wherein the at least one predetermined area includes a start area corresponding to the first process and an end area corresponding to the third process, and the control unit interpolates the detection result of the work in the non-detection period upon determining that at least one of the following holds, namely that the detection position before the non-detection period is not included in the end area, or that the detection position after the non-detection period is not included in the start area, and does not interpolate the detection result of the work in the non-detection period upon determining that the detection position before the non-detection period is included in the end area and the detection position after the non-detection period is included in the start area.
- 9. A work analysis method for generating information about work performed by a worker, the method comprising, by a control unit of a computer: acquiring image data showing an image in which the worker is captured; and sequentially detecting, based on the image data, the work performed by the worker to generate work detection information indicating a detection result of the work, wherein the work includes a plurality of processes performed by the worker, and the control unit of the computer corrects the work detection information so as to interpolate the detection result of the work in a non-detection period in which the work was not detected, according to the processes at the times when the work was detected before and after the non-detection period.
- 10. A program for causing a control unit of a computer to execute the work analysis method according to claim 9.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280045627.1A CN117581271A (zh) | 2021-06-28 | 2022-03-22 | Work analysis device and method |
JP2023531430A JPWO2023276332A1 (ja) | 2021-06-28 | 2022-03-22 | |
EP22832495.0A EP4365835A1 (en) | 2021-06-28 | 2022-03-22 | Work analysis device and method |
US18/536,634 US20240112499A1 (en) | 2021-06-28 | 2023-12-12 | Image analysis device and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-106645 | 2021-06-28 | | |
JP2021106645 | 2021-06-28 | | |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/536,634 Continuation US20240112499A1 (en) | 2021-06-28 | 2023-12-12 | Image analysis device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023276332A1 (ja) | 2023-01-05 |
Family
ID=84692613
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/013283 WO2023276332A1 (ja) | Work analysis device and method |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240112499A1 (ja) |
EP (1) | EP4365835A1 (ja) |
JP (1) | JPWO2023276332A1 (ja) |
CN (1) | CN117581271A (ja) |
WO (1) | WO2023276332A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009032033A | 2007-07-27 | 2009-02-12 | Omron Corp | Motion boundary detection method and work analysis system |
JP2019193019A * | 2018-04-20 | 2019-10-31 | Canon Inc. | Work analysis device and work analysis method |
JP2021077177A * | 2019-11-11 | 2021-05-20 | Ricoh Co., Ltd. | Motion recognition device, motion recognition method, and motion recognition program |
JP2021082137A * | 2019-11-21 | 2021-05-27 | Canon Inc. | Action recognition device, control method therefor, and program |
2022
- 2022-03-22 EP EP22832495.0A patent/EP4365835A1/en active Pending
- 2022-03-22 CN CN202280045627.1A patent/CN117581271A/zh active Pending
- 2022-03-22 WO PCT/JP2022/013283 patent/WO2023276332A1/ja active Application Filing
- 2022-03-22 JP JP2023531430A patent/JPWO2023276332A1/ja active Pending
2023
- 2023-12-12 US US18/536,634 patent/US20240112499A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
EP4365835A1 (en) | 2024-05-08 |
CN117581271A (zh) | 2024-02-20 |
JPWO2023276332A1 (ja) | 2023-01-05 |
US20240112499A1 (en) | 2024-04-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019090268A1 (en) | Contextual training systems and methods | |
US10334965B2 (en) | Monitoring device, monitoring system, and monitoring method | |
US11908293B2 (en) | Information processing system, method and computer readable medium for determining whether moving bodies appearing in first and second videos are the same or not using histogram | |
CN108629284A (zh) | 基于嵌入式视觉系统的实时人脸跟踪和人脸姿态选择的方法及装置 | |
JP2011170684A (ja) | 対象物追跡装置、対象物追跡方法、および対象物追跡プログラム | |
US20110246122A1 (en) | Motion determination apparatus, method and computer readable medium | |
WO2014050432A1 (ja) | 情報処理システム、情報処理方法及びプログラム | |
US20120020521A1 (en) | Object position estimation apparatus, object position estimation method, and object position estimation program | |
Urgo et al. | A human modelling and monitoring approach to support the execution of manufacturing operations | |
US11176700B2 (en) | Systems and methods for a real-time intelligent inspection assistant | |
Knoch et al. | Technology-enhanced process elicitation of worker activities in manufacturing | |
WO2023276332A1 (ja) | Work analysis device and method | |
KR20200068709A (ko) | 인체 식별 방법, 장치 및 저장 매체 | |
JP7428769B2 (ja) | 柔軟で適応的なロボット学習のための人間ロボット協働 | |
JP2019159885A (ja) | 動作分析装置、動作分析方法、動作分析プログラム及び動作分析システム | |
US20220198802A1 (en) | Computer-implemental process monitoring method, device, system and recording medium | |
Arita et al. | Maneuvering assistance of teleoperation robot based on identification of gaze movement | |
WO2022209082A1 (ja) | Work analysis device | |
CN114596239A (zh) | Loading and unloading event detection method and apparatus, computer device, and storage medium | |
US10712725B2 (en) | Non-transitory computer-readable storage medium, robot transfer time estimation method, and robot transfer time estimation device | |
EP4354388A1 (en) | Task analysis device and method | |
CN112567401B (zh) | Motion analysis device, motion analysis method, and recording medium for program thereof | |
WO2022202178A1 (ja) | Training data generation device for machine learning, training data generation system, and training data generation method | |
Ravichandar et al. | Gyro-aided image-based tracking using mutual information optimization and user inputs | |
US20220230333A1 (en) | Information processing system, information processing method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22832495; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2023531430; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 202280045627.1; Country of ref document: CN |
| WWE | Wipo information: entry into national phase | Ref document number: 2022832495; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2022832495; Country of ref document: EP; Effective date: 20240129 |