WO2021210112A1 - Motion analysis device, motion analysis method, and motion analysis program

Info

Publication number
WO2021210112A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion
order
basic
work
cycles
Prior art date
Application number
PCT/JP2020/016640
Other languages
English (en)
Japanese (ja)
Inventor
士人 新井
勝大 草野
尚吾 清水
奥村 誠司
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to JP2022514935A (patent JP7086322B2)
Priority to PCT/JP2020/016640 (WO2021210112A1)
Priority to TW109131846A (TW202141328A)
Publication of WO2021210112A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Definitions

  • This disclosure relates to a motion analyzer, a motion analysis method, and a motion analysis program.
  • In motion analysis using machine learning, labeling work is required to create teacher data.
  • The labeling work breaks down into the designation of work delimiters and the labeling of work elements.
  • Conventionally, all of these tasks are done manually, and the labeling target is all of the data used for learning, so the conventional manual labeling work requires an enormous amount of time.
  • When the work delimiters are judged manually, as in the conventional case, the judgment fluctuates. For this reason, it is difficult to specify work delimiters according to a fixed standard, and it is difficult to keep the labeling quality constant.
  • The fluctuation in judgment is attributed to differences in judgment criteria between the people performing the labeling work, or to changes in concentration caused by conditions such as fatigue.
  • Patent Document 1 discloses a technique of dividing trajectory data at the point where the hand position enters a specific area, and estimating the content of the operation by matching each divided trajectory segment against a defined model trajectory.
  • In motion analysis technology, it is important to create teacher data with high accuracy.
  • The technique of Patent Document 1 can reduce the time required to analyze the work, but cannot improve the accuracy of that analysis.
  • In the creation of teacher data, there is therefore the problem that it is difficult to create the teacher data with high accuracy while reducing the creation time.
  • The purpose of this disclosure is to create the teacher data used in motion analysis technology with high accuracy while reducing the creation time.
  • The motion analysis device includes a detection unit that detects operation delimiters from video data in which multiple cycles, each including multiple works, are captured;
  • a basic motion extraction unit that classifies the plurality of motion elements separated by the operation delimiters and, based on the classification results of the plurality of motion elements, extracts basic operations in which similar motion elements are grouped;
  • an order determination unit that determines the order of basic operations in one cycle as the standard basic operation order, based on the order of basic operations in each of the plurality of cycles;
  • and a teacher data output unit that accepts work labeling for the standard basic operation order, executes the work labeling for each of the plurality of cycles based on the labeled standard basic operation order, and outputs each labeled cycle as teacher data.
  • The basic motion extraction unit extracts the basic operations based on the classification results of the plurality of motion elements.
  • The order determination unit determines the order of basic operations in one cycle as the standard basic operation order, based on the order of basic operations for each cycle.
  • The teacher data output unit accepts the work labeling for the standard basic operation order and outputs the data of all cycles in which the work is labeled as the teacher data. Therefore, according to the motion analysis device of the present disclosure, when teacher data for motion analysis is created, it can be created with high accuracy while reducing the creation time.
  • The drawings show: a configuration example of the motion analysis device according to the first embodiment; the definitions of terms in the motion analysis process by the motion analysis device according to the first embodiment; an outline of the operation of the motion analysis device according to the first embodiment; and the learning phase of the motion analysis process of the motion analysis device according to the first embodiment.
  • The motion analysis device 100 is a device having a motion analysis function that automatically analyzes the work time and work procedure of a worker, such as a worker on a factory line.
  • The motion analysis device 100 is a computer.
  • the motion analysis device 100 includes a processor 910 and other hardware such as a memory 921, an auxiliary storage device 922, an input interface 930, an output interface 940, and a communication device 950.
  • the processor 910 is connected to other hardware via a signal line and controls these other hardware.
  • the motion analysis device 100 includes a detection unit 110, a basic motion extraction unit 120, an order determination unit 130, a teacher data output unit 140, a learning model generation unit 170, an analysis unit 150, and a storage unit 160 as functional elements.
  • the storage unit 160 stores video data 21 in which a plurality of cycles including a plurality of tasks are captured, teacher data 23, and a learning model 22 used for motion analysis.
  • The data format of the video data is not limited. It may be a color image captured by an RGB camera or a monochrome image. An image having depth information captured by a depth camera may also be used.
  • the functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150 are realized by software.
  • the storage unit 160 is provided in the memory 921.
  • the storage unit 160 may be provided in the auxiliary storage device 922, or may be provided in the memory 921 and the auxiliary storage device 922 in a distributed manner.
  • the processor 910 is a device that executes a motion analysis program.
  • the motion analysis program is a program that realizes the functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150.
  • the processor 910 is an IC (Integrated Circuit) that performs arithmetic processing. Specific examples of the processor 910 are a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit).
  • the memory 921 is a storage device that temporarily stores data.
  • a specific example of the memory 921 is a SRAM (Static Random Access Memory) or a DRAM (Dynamic Random Access Memory).
  • the auxiliary storage device 922 is a storage device that stores data.
  • a specific example of the auxiliary storage device 922 is an HDD.
  • the auxiliary storage device 922 may be a portable storage medium such as an SD (registered trademark) memory card, CF, NAND flash, flexible disk, optical disk, compact disc, Blu-ray (registered trademark) disk, or DVD.
  • HDD is an abbreviation for Hard Disk Drive.
  • SD® is an abbreviation for Secure Digital.
  • CF is an abbreviation for CompactFlash®.
  • DVD is an abbreviation for Digital Versatile Disc.
  • the input interface 930 is a port connected to an input device such as a mouse, keyboard, or touch panel. Specifically, the input interface 930 is a USB (Universal Serial Bus) terminal. The input interface 930 may be a port connected to a LAN (Local Area Network).
  • LAN is an abbreviation for Local Area Network.
  • the output interface 940 is a port to which a cable of an output device such as a display is connected.
  • the output interface 940 is a USB terminal or an HDMI (registered trademark) (High Definition Multimedia Interface) terminal.
  • the display is an LCD (Liquid Crystal Display).
  • the output interface 940 is also referred to as a display interface.
  • the communication device 950 has a receiver and a transmitter.
  • the communication device 950 is connected to a communication network such as a LAN, the Internet, or a telephone line.
  • the communication device 950 is a communication chip or a NIC (Network Interface Card).
  • the motion analysis program is executed in the motion analysis device 100.
  • the motion analysis program is read into processor 910 and executed by processor 910.
  • OS is an abbreviation for Operating System.
  • the processor 910 executes the motion analysis program while executing the OS.
  • the motion analysis program and the OS may be stored in the auxiliary storage device 922.
  • the motion analysis program and the OS stored in the auxiliary storage device 922 are loaded into the memory 921 and executed by the processor 910. A part or all of the motion analysis program may be incorporated in the OS.
  • the motion analyzer 100 may include a plurality of processors that replace the processor 910. These plurality of processors share the execution of the motion analysis program.
  • Each processor like the processor 910, is a device that executes a motion analysis program.
  • Data, information, signal values and variable values used, processed or output by the motion analysis program are stored in the memory 921, the auxiliary storage device 922, or the register or cache memory in the processor 910.
  • The "unit" of each of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150 may be read as "process," "procedure," or "processing."
  • the motion analysis program causes a computer to execute detection processing, basic motion extraction processing, order determination processing, teacher data output processing, learning model generation processing, and analysis processing.
  • The "processing" of the detection processing, basic motion extraction processing, order determination processing, teacher data output processing, learning model generation processing, and analysis processing may be read as "program," "program product," "computer-readable storage medium storing the program," or "computer-readable recording medium on which the program is recorded."
  • the motion analysis method is a method performed by the motion analysis device 100 executing a motion analysis program.
  • the motion analysis program may be provided stored in a computer-readable recording medium. Further, the motion analysis program may be provided as a program product.
  • The screw tightening work process described here refers to a series of work steps in which a processed product serving as the main body is prepared, screws are tightened in two places, and the product is sent to the next process.
  • Cycle 10 refers to one work process.
  • one series of work steps of "stationary”, “screw tightening”, “screw tightening”, and “feeding” is referred to as cycle 10.
  • each element of “stationary”, “screw tightening”, and “feeding” is referred to as work 11.
  • the work delimiter 12 refers to a time point at which the work 11 starts or ends.
  • the work 11 refers to the elements constituting the cycle 10 separated by the work division 12. Work 11 is also called a work element.
  • each of “stationary”, “screw tightening”, and “feeding” is work 11.
  • The granularity of the work 11 may be determined arbitrarily by the user. As shown in FIG. 2,
  • each of "stationary" and "screw tightening" may be used as a separate work element, or "stationary" and "screw tightening" may be combined into a single "screw tightening" work.
  • Each of the operations 11 is composed of a plurality of basic operations 15 described later.
  • The motion described here refers to a movement of a person whose start time or end time can be mechanically specified by video analysis or signal processing. Specifically, the speed or direction of the movement changes clearly at the start point and end point of the motion, as in "reaching out a hand" or "pulling back a hand."
  • the operation delimiter 14 refers to the start time point or the end time point of the operation. It is assumed that the operation break 14 can be automatically detected. The operation delimiter 14 and the work delimiter 12 partially match.
  • The motion element 13 is a motion that constitutes the work 11 and is separated by the operation delimiters 14. Taking the "stationary" work of FIG. 2 as an example, each of "reaching for the main body" and "lowering the main body" becomes a motion element 13. Note that even when the same motion appears more than once, each occurrence is a distinct motion element: as shown in FIG. 2, the motion "lifting the screwdriver" appears twice, but the first and second occurrences are different motion elements 13.
  • the basic operation 15 is a collection of similar operation elements 13.
  • the basic operation 15 represents a basic operation constituting the work 11. In FIG. 2, each basic operation is represented by the same pattern.
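The term hierarchy defined above can be pictured as a small data structure: a cycle 10 contains works 11, each work contains motion elements 13, and each motion element is assigned to a basic operation 15. The sketch below is purely illustrative; the class and field names are not from this disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MotionElement:        # motion element 13, bounded by operation delimiters 14
    start_frame: int
    end_frame: int
    basic_motion: int = -1  # id of the basic operation 15 assigned by clustering

@dataclass
class Work:                 # work 11, bounded by work delimiters 12
    label: str
    elements: List[MotionElement] = field(default_factory=list)

@dataclass
class Cycle:                # cycle 10: one pass through the work process
    works: List[Work] = field(default_factory=list)

cycle = Cycle(works=[
    Work("stationary", [MotionElement(0, 30), MotionElement(30, 55)]),
    Work("screw tightening", [MotionElement(55, 90)]),
])
```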
  • the operation procedure of the motion analysis device 100 corresponds to the motion analysis method.
  • the program that realizes the operation of the motion analysis device 100 corresponds to the motion analysis program.
  • the motion analysis process by the motion analyzer 100 includes a learning phase S100 and a recognition phase S200.
  • The learning phase S100 includes (1) a detection process (step S101), (2) a basic motion extraction process (step S102), (3) an order determination process (step S103), (4) a teacher data output process (step S104), and (5) a learning model generation process (step S105). Note that (1), (2), (3), and (5) are executed automatically.
  • The detection unit 110 detects the operation delimiters 14 from the video data 21 in which a plurality of cycles 10, each including a plurality of works 11, are captured.
  • To detect the operation delimiters, a feature obtained by image processing, such as the motion information of the video, may be used, or the trajectory of the worker's skeleton information extracted from the video may be used.
  • The basic motion extraction unit 120 classifies the plurality of motion elements 13 separated by the operation delimiters 14 and, based on the classification results of the plurality of motion elements 13, extracts the basic operations 15 in which similar motion elements 13 are grouped.
  • the order determination unit 130 determines the order of the basic operations 15 in one cycle as the standard basic operation order 31 based on the order of the basic operations 15 for each cycle of the plurality of cycles 10.
  • the teacher data output unit 140 accepts work labeling for the standard basic operation order 31.
  • the teacher data output unit 140 automatically executes the work labeling for each cycle of the plurality of cycles 10 based on the standard basic operation order 31 in which the work is labeled.
  • the teacher data output unit 140 outputs each cycle of the plurality of cycles 10 to which the work is labeled as the teacher data 23.
  • the learning model generation unit 170 performs learning using the teacher data 23 and outputs the learning model.
  • the format of the learning model is designed according to the architecture used in the recognition phase S200.
  • In the recognition phase S200, the analysis process (6) is performed.
  • the analysis unit 150 analyzes the operation of the worker in the work process by using the learning model 22.
  • In step S201, when the worker performs the work, the operation delimiters 14 of the worker during the work are detected from the video data 21 and the motion elements 13 are acquired, as in step S101. To detect the operation delimiters 14, for example, the trajectory of the worker's skeleton information may be used.
  • In step S202, the analysis unit 150 determines, from the motion elements 13, the time-series order of the basic operations of the worker who is working.
  • The analysis unit 150 then uses the learning model 22 to determine the works 11 from the time-series order of the basic operations of the worker during the work. In this way, the works 11 of the worker who is working are recognized.
  • The algorithm used in the recognition phase is not limited to the one described above.
  • For example, a deep learning algorithm such as a CNN (Convolutional Neural Network) or an RNN (Recurrent Neural Network) may be used.
  • Alternatively, a matching process such as DP (Dynamic Programming) matching or nearest neighbor search may be used.
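As a minimal illustration of the matching alternative, the sketch below aligns an observed basic-motion sequence against the labeled standard order and transfers the work labels. Python's stdlib difflib stands in for DP matching, and all motion IDs and work labels are invented for the example.

```python
from difflib import SequenceMatcher

# Illustrative labeled standard basic operation order (not from the patent).
standard = ["a", "b", "a", "c", "d", "c", "e"]
work_labels = ["place", "place", "screw", "screw", "screw", "feed", "feed"]

def recognize(observed):
    """Return the work label for each observed basic motion ('?' if unmatched)."""
    result = ["?"] * len(observed)
    m = SequenceMatcher(None, observed, standard, autojunk=False)
    for blk in m.get_matching_blocks():   # blocks of exactly matching motions
        for k in range(blk.size):
            result[blk.a + k] = work_labels[blk.b + k]
    return result

recognized = recognize(["a", "b", "c", "d", "c", "e"])  # one motion omitted
```

Because the alignment tolerates gaps, a cycle with an omitted basic motion still receives labels for the motions it does contain.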
  • In step S101, the detection unit 110 detects the operation delimiters 14 from the video data 21 in which a plurality of cycles 10, each including a plurality of works 11, are captured.
  • FIG. 5 is a schematic diagram showing the detection process (1) and the basic operation extraction process (2) according to the present embodiment.
  • The detection unit 110 obtains time-series information, namely the motion trajectory, from the video data 21 in which the plurality of cycles are captured.
  • The detection unit 110 automatically detects the operation delimiters 14 by detecting change points of the motion based on the motion trajectory.
  • the skeleton information may be extracted from the video data 21 in order to obtain the trajectory of the operation.
  • the skeletal information is coordinate data of each joint of a person.
  • The method of acquiring the skeleton information and the data format may be arbitrary. As an example, the information is acquired from a camera or depth sensor installed on the factory line.
  • the coordinate data of the joint may be two-dimensional or three-dimensional.
  • the change point of motion is a change point of speed or direction.
  • The detection of the operation delimiters 14 is performed for all cycles of the video data 21. Since the purpose of the learning phase S100 is to generate the learning model 22, the video data 21 is preferably the data of a model worker, such as an expert.
  • The work 11 is composed of a plurality of motions, and it is difficult to automatically detect the work delimiters 12. Therefore, in the present embodiment, the operation delimiters 14, which are comparatively easy to detect automatically, are detected first.
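A minimal sketch of such automatic delimiter detection, assuming a 2-D joint trajectory (e.g. one wrist position per frame) and illustrative thresholds: change points are flagged where the speed drops below a threshold or the direction turns sharply.

```python
import math

def detect_breaks(trajectory, speed_thresh=0.5, angle_thresh=math.pi / 3):
    """Return frame indices where the motion nearly stops or turns sharply,
    i.e. candidate operation delimiters (change points of speed/direction)."""
    # Per-frame velocity vectors between consecutive 2-D positions.
    v = [(x2 - x1, y2 - y1)
         for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:])]
    speed = [math.hypot(vx, vy) for vx, vy in v]
    breaks = []
    for t in range(1, len(v)):
        # Speed change point: the motion slows below the threshold.
        if speed[t] < speed_thresh <= speed[t - 1]:
            breaks.append(t)
        # Direction change point: large angle between successive velocities.
        elif speed[t] > 0 and speed[t - 1] > 0:
            dot = v[t][0] * v[t - 1][0] + v[t][1] * v[t - 1][1]
            cos_a = dot / (speed[t] * speed[t - 1])
            if math.acos(max(-1.0, min(1.0, cos_a))) > angle_thresh:
                breaks.append(t)
    return breaks

# A wrist that moves right, then turns and moves up: one direction change.
wrist = [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2)]
```

In practice the thresholds would be tuned to the frame rate and the noise level of the skeleton estimates.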
  • In step S102, the basic motion extraction unit 120 classifies the plurality of motion elements 13 separated by the operation delimiters 14 and, based on the classification results, extracts the basic operations 15, each of which is a collection of similar motion elements 13. Specifically, the basic motion extraction unit 120 classifies the plurality of motion elements 13 into a plurality of classes and extracts each of the classes as a basic operation 15.
  • In other words, the basic motion extraction unit 120 classifies the motion elements 13 and thereby specifies the basic operations 15. Specifically, the basic motion extraction unit 120 realizes automatic classification of the motion elements 13 by applying clustering to all the motion elements 13 obtained as a result of detecting the operation delimiters 14. As a result of the classification, the basic motion extraction unit 120 obtains the basic operations 15 constituting the works 11.
  • Any clustering method may be used; as an example, a method such as x-means, which automatically determines the number of clusters, is used.
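Since x-means itself is not in the Python standard library, the sketch below substitutes a simple greedy clustering that also determines the number of clusters automatically; the feature vectors and the radius threshold are illustrative, not part of this disclosure.

```python
import math

def cluster_elements(features, radius=1.0):
    """Greedy clustering of motion-element feature vectors: open a new
    cluster whenever an element is far from every existing centroid, so the
    number of clusters (basic motions) is determined automatically."""
    centroids, counts, labels = [], [], []
    for x in features:
        best, best_d = None, None
        for k, c in enumerate(centroids):
            d = math.dist(x, c)
            if best_d is None or d < best_d:
                best, best_d = k, d
        if best is not None and best_d <= radius:
            counts[best] += 1
            # Update the running mean of the matched cluster.
            centroids[best] = [ci + (xi - ci) / counts[best]
                               for ci, xi in zip(centroids[best], x)]
            labels.append(best)
        else:
            centroids.append(list(x))
            counts.append(1)
            labels.append(len(centroids) - 1)
    return labels

# Two repetitions of three motions, with small noise in the features:
feats = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0),
         (0.1, 0.0), (5.1, 0.1), (0.0, 5.1)]
labels = cluster_elements(feats)
```

Each cluster id then plays the role of one basic operation 15.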
  • In step S103, the order determination unit 130 determines the order of the basic operations 15 in one cycle as the standard basic operation order 31, based on the order of the basic operations 15 for each of the plurality of cycles 10.
  • Specifically, the order determination unit 130 determines the standard basic operation order 31 based on the result of comparing the order of the basic operations across the plurality of cycles 10.
  • FIG. 6 is a schematic view showing the order determination process (3) according to the present embodiment.
  • The order determination process (3) is represented by three panels: a left figure, a center figure, and a right figure.
  • In this example, the video data 21 includes seven cycles of work video.
  • The left figure of FIG. 6 shows the result of dividing the seven cycles into basic operations.
  • The order determination unit 130 compares the order of the basic operations between the cycles and specifies the correct order of the basic operations.
  • Specifically, the order determination unit 130 automatically identifies the correct order of the basic operations by using a multiple sequence alignment (MSA) algorithm to compare the order of the basic operations between the cycles.
  • MSA is an abbreviation for multiple sequence alignment.
  • With MSA, even if some cycle includes omissions or mistakes, as shown in the central figure of FIG. 6, the standard basic operation order 31, which is the correct order of the basic operations, can still be identified.
  • The order determination unit 130 may compare the standard basic operation order 31 with the order of the basic operations in each cycle and extract, from the order of the basic operations in each cycle, any basic operation that differs from the standard basic operation order 31 as a difference operation. In the central figure of FIG. 6, the sixth basic operation of cycle 6 differs from the other cycles, so this basic operation is extracted as the difference operation 15x.
  • The order determination unit 130 may determine whether or not to include the difference operation 15x in the standard basic operation order 31 based on the proportion of cycles in which the difference operation 15x exists. Alternatively, the order determination unit 130 may generate two types of standard basic operation order 31: one including the difference operation 15x and one not including it.
  • The order determination unit 130 may also determine whether or not to include an order of basic operations different from the standard basic operation order 31 in the standard basic operation order 31 by clustering the order of the basic operations for each cycle.
  • The standard basic operation order 31, which is the correct order of the basic operations, is not limited to one.
  • In the example of FIG. 6, one cycle is composed of a combination of the basic operations 15 a, b, c, d, e, and f.
  • The first three basic operations of the standard basic operation order 31 are "a, b, a".
  • The head portion may be blank, that is, the first operation may be missing; therefore "blank, a, b" may also be treated as a correct basic operation order.
  • Similarly, the 5th, 6th, and 7th positions are "d, c, e", but "d, b, e" may also be a correct basic operation order. Whether such an order is correct may be judged manually, or may be determined automatically based on frequency, as described above. By allowing a plurality of correct basic operation orders to exist, learning and recognition remain accurate even when the work content changes between cycles, for example when the work order is changed.
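A full MSA implementation is beyond a short sketch, but the idea can be approximated with a "center star" scheme: align every cycle to the longest cycle with an edit-distance DP and take a column-wise majority vote. All sequences below are illustrative, not from the patent's figures.

```python
from collections import Counter

def align_to(ref, seq):
    """Global alignment with unit costs (Needleman-Wunsch style); returns
    seq's symbols placed against ref's positions (None at gaps)."""
    n, m = len(ref), len(seq)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i
    for j in range(m + 1):
        dp[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if ref[i - 1] == seq[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1, dp[i][j - 1] + 1,
                           dp[i - 1][j - 1] + cost)
    out, i, j = [None] * n, n, m
    while i > 0 and j > 0:   # trace back along an optimal path
        cost = 0 if ref[i - 1] == seq[j - 1] else 1
        if dp[i][j] == dp[i - 1][j - 1] + cost:
            out[i - 1] = seq[j - 1]
            i, j = i - 1, j - 1
        elif dp[i][j] == dp[i - 1][j] + 1:
            i -= 1
        else:
            j -= 1
    return out

def standard_order(cycles):
    """Column-wise majority vote after aligning every cycle to the longest."""
    ref = max(cycles, key=len)
    columns = [align_to(ref, c) for c in cycles]
    return [Counter(col[p] for col in columns if col[p] is not None)
            .most_common(1)[0][0] for p in range(len(ref))]

cycles = [["a", "b", "c", "d"],
          ["a", "b", "c", "d"],
          ["a", "c", "d"],        # one basic motion omitted
          ["a", "b", "x", "d"]]   # one basic motion mistaken
consensus = standard_order(cycles)
```

With more irregular data, a proper MSA tool or iterative refinement would replace the single center alignment, but the majority vote already tolerates the omission and the mistake above.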
  • In step S104, the teacher data output unit 140 receives the work labeling for the standard basic operation order 31 and reflects the labeling result in all of the cycles.
  • FIG. 7 is a schematic diagram showing the teacher data output process (4) according to the present embodiment.
  • the teacher data output unit 140 receives the labeling of work elements with respect to the standard basic operation order 31 by the user via the input interface 930.
  • The user labels the work elements with respect to the standard basic operation order 31, which is the correct order of the basic operations.
  • The labeling work only needs to be performed for the one cycle of the standard basic operation order 31.
  • Specifically, the user labels while viewing the video frame corresponding to a representative time point of each basic operation.
  • The teacher data output unit 140 executes the work labeling for each of the plurality of cycles based on the standard basic operation order 31 in which the work is labeled.
  • The teacher data output unit 140 then outputs each of the plurality of cycles in which the work is labeled as the teacher data 23.
  • In other words, the teacher data output unit 140 reflects the labeled standard basic operation order 31 in all cycles and outputs the teacher data 23.
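A minimal sketch of this label propagation, assuming (as with a model worker) that each cycle follows the standard basic operation order positionally; a deviating position is flagged rather than aligned, and all motion IDs and labels are invented for the example.

```python
# Assumed one-cycle labeling of the standard basic operation order
# (motion IDs and work labels are illustrative only).
std_order = ["a", "b", "c", "d", "c", "e"]
work_labels = ["place", "place", "screw", "screw", "feed", "feed"]

def make_teacher_data(cycles):
    """Attach the user's one-cycle labels to every cycle positionally;
    positions that deviate from the standard order are flagged with None."""
    teacher = []
    for cycle in cycles:
        labeled = []
        for pos, motion in enumerate(cycle):
            if pos < len(std_order) and motion == std_order[pos]:
                labeled.append((motion, work_labels[pos]))
            else:
                labeled.append((motion, None))  # left for manual review
        teacher.append(labeled)
    return teacher

cycles = [["a", "b", "c", "d", "c", "e"],
          ["a", "b", "x", "d", "c", "e"]]   # second cycle has one wrong motion
teacher = make_teacher_data(cycles)
```

In this way, the one-cycle labeling effort fans out over all cycles of the learning data.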
  • In step S105, the learning model generation unit 170 performs the learning process using the teacher data 23 and generates the learning model 22 used for the analysis of the motion.
  • That is, the learning model generation unit 170 performs a learning process with the teacher data 23 as input and generates the learning model 22.
  • The learning method and the format of the learning model are not limited. For example, machine learning such as a CNN or an RNN may be used to generate the learning model. Alternatively, the set consisting of the motion trajectory of each basic operation, the standard basic operation order, and the labeling result may itself be used as the learning model.
  • the functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150 are realized by software.
  • the functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150 may be realized by hardware.
  • the motion analyzer 100 includes an electronic circuit 909 instead of the processor 910.
  • FIG. 8 is a diagram showing the configuration of the motion analysis device 100 according to the modified example of the present embodiment.
  • the electronic circuit 909 is a dedicated electronic circuit that realizes the functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150.
  • the electronic circuit 909 is specifically a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a GA, an ASIC, or an FPGA.
  • GA is an abbreviation for Gate Array.
  • ASIC is an abbreviation for Application Specific Integrated Circuit.
  • FPGA is an abbreviation for Field-Programmable Gate Array.
  • The functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150 may be realized by one electronic circuit, or may be distributed across and realized by a plurality of electronic circuits.
  • Alternatively, some of the functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150 may be realized by electronic circuits, and the remaining functions may be realized by software.
  • some or all the functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150 may be realized by the firmware.
  • Each of the processor and the electronic circuit is also called a processing circuit. That is, the functions of the detection unit 110, the basic motion extraction unit 120, the order determination unit 130, the teacher data output unit 140, the learning model generation unit 170, and the analysis unit 150 are realized by the processing circuit.
  • FIG. 9 is a diagram illustrating the effect of the motion analyzer 100 according to the present embodiment.
  • Conventionally, the work delimiters are specified manually.
  • The motion analysis device 100 according to the present embodiment automatically detects the operation delimiters, unlike the method of manually designating the work delimiters.
  • As a result, the delimiters are set according to a fixed standard, so the labeling quality is homogenized.
  • In general, a work consists of a plurality of motions, and it is difficult to automatically detect the work delimiters. The motion analysis device 100 according to the present embodiment instead focuses on the motions, which can easily be detected automatically, and can therefore reliably and automatically detect the delimiter positions of the motions.
  • Conventionally, in order to specify the delimiter positions, the user needs to review the learning data over its entire duration.
  • With the motion analysis device 100, the user does not need to review the learning data over its entire duration.
  • The review of the learning data can be limited to a narrow range, which reduces the cost of the labeling work.
  • In the motion analysis device 100, the motion elements are automatically extracted from all of the learning data, and the basic motions are extracted by classifying the motion elements.
  • By specifying the basic motions, the basic motions constituting each cycle and their order become known.
  • The order of the basic operations of each cycle is then compared, and the correct order of the basic operations is specified automatically. Therefore, with the motion analysis device 100 according to the present embodiment, it is not necessary to perform the labeling work for all cycles: labeling is required only for the one cycle of the standard basic operation order, which reduces the cost of the labeling work.
  • According to the motion analysis device 100, it is therefore possible to homogenize the labeling quality while reducing the cost and labor of the labeling work required for motion analysis.
  • each part of the motion analyzer has been described as an independent functional block.
  • the configuration of the motion analyzer does not have to be the configuration as in the above-described embodiment.
  • the functional block of the motion analyzer may have any configuration as long as it can realize the functions described in the above-described embodiment.
  • the motion analysis device may be a system composed of a plurality of devices instead of one device.
  • A plurality of the parts may be combined and implemented.
  • Alternatively, only one part of this embodiment may be implemented.
  • This embodiment may be implemented in any combination, as a whole or in part. That is, it is possible to freely combine the features of the embodiment, modify any component of the embodiment, or omit any component of the embodiment.


Abstract

In the present invention, a detection unit (110) detects work delimiters from video data (21) obtained by filming a plurality of cycles, each of which comprises a plurality of tasks. A basic motion extraction unit (120) classifies the plurality of motion elements separated by the work delimiters and, on the basis of the classification results of the plurality of motion elements, extracts basic motions obtained by consolidating similar motion elements. An order determination unit (130) determines the order of the basic motions within a cycle as the standard basic motion order, on the basis of the order of the basic motions in each of the plurality of cycles. A training data output unit (140) receives task labeling for the standard basic motion order, applies task labels to each of the plurality of cycles on the basis of the task-labeled standard basic motion order, and outputs each of the task-labeled cycles as training data (23).
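As a rough illustration of the abstract's "consolidation of similar motion elements" step, the sketch below greedily groups motion-element feature vectors by Euclidean distance. The feature representation, the greedy strategy, the `threshold` value, and the name `extract_basic_motions` are all assumptions made for the example; the application does not specify this particular algorithm.

```python
import math

def extract_basic_motions(elements, threshold=1.0):
    # Assign each motion-element feature vector to the first existing
    # group whose representative lies within `threshold`; otherwise
    # start a new group. Each group stands in for one basic motion.
    centers, assignments = [], []
    for vec in elements:
        for idx, center in enumerate(centers):
            if math.dist(vec, center) < threshold:
                assignments.append(idx)
                break
        else:
            centers.append(vec)
            assignments.append(len(centers) - 1)
    return assignments

# Two near-identical elements collapse into one basic motion;
# the distant third element forms a second one.
groups = extract_basic_motions([(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)])
```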
PCT/JP2020/016640 2020-04-15 2020-04-15 Motion analysis device, motion analysis method, and motion analysis program WO2021210112A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2022514935A JP7086322B2 (ja) 2020-04-15 2020-04-15 Motion analysis device, motion analysis method, and motion analysis program
PCT/JP2020/016640 WO2021210112A1 (fr) 2020-04-15 2020-04-15 Motion analysis device, motion analysis method, and motion analysis program
TW109131846A TW202141328A (zh) 2020-04-15 2020-09-16 Motion analysis device, motion analysis method, and motion analysis program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/016640 WO2021210112A1 (fr) 2020-04-15 2020-04-15 Motion analysis device, motion analysis method, and motion analysis program

Publications (1)

Publication Number Publication Date
WO2021210112A1 (fr) 2021-10-21

Family

ID=78083617

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/016640 WO2021210112A1 (fr) 2020-04-15 2020-04-15 Motion analysis device, motion analysis method, and motion analysis program

Country Status (3)

Country Link
JP (1) JP7086322B2 (fr)
TW (1) TW202141328A (fr)
WO (1) WO2021210112A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020050111A1 (fr) * 2018-09-03 2020-03-12 The University of Tokyo Motion recognition method and device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020050111A1 (fr) * 2018-09-03 2020-03-12 The University of Tokyo Motion recognition method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MATSUBARA, YASUKO ET AL.: "Automatic Feature Extraction from Large-Scale Time Series Data", DEIM Forum 2012 (12th Annual Meeting of the Database Society of Japan), 2012 *

Also Published As

Publication number Publication date
JPWO2021210112A1 (fr) 2021-10-21
TW202141328A (zh) 2021-11-01
JP7086322B2 (ja) 2022-06-17

Similar Documents

Publication Publication Date Title
TWI709919B (zh) Vehicle insurance image processing method, apparatus, server and system
US7233699B2 (en) Pattern matching using multiple techniques
Kwon et al. Practical guide to machine vision software: an introduction with LabVIEW
US10769427B1 (en) Detection and definition of virtual objects in remote screens
CN111902712B (zh) Abnormality inspection device and abnormality inspection method
US11521312B2 (en) Image processing apparatus, image processing method, and storage medium
Deng et al. Binarizationshop: a user-assisted software suite for converting old documents to black-and-white
US20170148487A1 (en) Video Manipulation for Privacy Enhancement
JP2023134688A (ja) System and method for detecting and classifying patterns in an image with a vision system
US20200103353A1 (en) Information processing apparatus, information processing method, and storage medium
Suarez et al. OpenCV Essentials
US20200279359A1 (en) Inspection apparatus, inspection method, and non-volatile storage medium
US20230053085A1 (en) Part inspection system having generative training model
KR102011212B1 (ko) Method for extracting and storing objects used as training data for an artificial intelligence neural network
WO2021210112A1 (fr) Motion analysis device, motion analysis method, and motion analysis program
WO2022060408A1 (fr) Generating human-machine interfaces from a visual design tool using multistage machine learning
JP6736988B2 (ja) Image search system, image processing system, and image search program
WO2016117564A1 (fr) Program, information storage medium, and recognition device
AU2022203118A1 (en) Computer-implemented method for extracting content from a physical writing surface
CN113793349A (zh) Target detection method and apparatus, computer-readable storage medium, and electronic device
KR20210026176A (ko) Method for generating labeled images for deep learning
CN113379742B (zh) Artificial-intelligence-based structure detection method and apparatus for devices, and electronic device
JP7488467B2 (ja) Part identification program, part identification method, and information processing device
Jain et al. Getting Started with OpenCV
Ashaduzzaman et al. An Automated Testing Framework for Gesture Recognition System using Dynamic Image Pattern Generation with Augmentation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20931075

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022514935

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20931075

Country of ref document: EP

Kind code of ref document: A1