CN112784667A - Motion analysis device, motion analysis method, and computer-readable storage medium - Google Patents

Motion analysis device, motion analysis method, and computer-readable storage medium

Info

Publication number
CN112784667A
Authority
CN
China
Prior art keywords
motion
evaluation
time
unit
display
Prior art date
Legal status
Pending
Application number
CN202011089015.1A
Other languages
Chinese (zh)
Inventor
宫崎雅
和田洋贵
Current Assignee
Omron Corp
Original Assignee
Omron Corp
Priority date
Filing date
Publication date
Application filed by Omron Corp
Publication of CN112784667A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/23 Recognition of whole body movements, e.g. for sport training
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/681 Motion detection
    • H04N 23/6812 Motion detection based on additional sensors, e.g. acceleration sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • General Factory Administration (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a motion analysis device, a motion analysis method, and a computer-readable storage medium storing a motion analysis program that enable an operator to improve his or her motions more smoothly. The motion analysis device (10) includes: an acquisition unit (11) that acquires time-series data relating to the motions of a plurality of parts of an operator with respect to a work performed by the operator; an analysis unit (12) that analyzes the time-series data and generates motion data indicating the type of each element motion and its execution time from start time to end time; an evaluation unit (14) that evaluates the element motions performed by the plurality of parts based on the execution timing of the element motions; and a display control unit (15) that controls a display unit to display the evaluation together with the motion data while distinguishing the periods corresponding to different element motions.

Description

Motion analysis device, motion analysis method, and computer-readable storage medium
Technical Field
The present invention relates to a motion analysis device, a motion analysis method, and a computer-readable storage medium.
Background
Conventionally, motion sensors that measure motion data of an operator, or techniques that generate motion data by analyzing a moving image of the operator at work, have been used. The motion data is sometimes used to evaluate whether the operator is performing appropriate motions.
For example, Patent Document 1 below discloses a system for evaluating motor ability that measures the position and/or angle of a subject's body parts during body movements in daily life, calculates the motor ability of those body parts in time series based on the measurement results, calculates representative values of motor ability for at least three body parts (the upper body, the left lower body, and the right lower body), evaluates these representative values as the motor ability during the subject's daily body movements, and outputs the evaluation separately for the upper body, the left lower body, and the right lower body.
[Prior Art Documents]
[Patent Documents]
Patent Document 1: Japanese Patent No. 6535778
Disclosure of Invention
Problems to be solved by the invention
In Patent Document 1, motor ability is evaluated for a plurality of body parts, but no feedback is given as to which motion is associated with which evaluation. It is therefore difficult for an operator to know which aspects of his or her motions are good and which are bad, making the motions hard to improve.
Therefore, the present invention provides a motion analysis device, a motion analysis method, and a motion analysis program that enable an operator to improve a motion more smoothly.
Means for solving the problems
A motion analysis device according to an aspect of the present invention includes: an acquisition unit that acquires time-series data relating to the motions of a plurality of parts of an operator with respect to a work performed by the operator; an analysis unit that analyzes the time-series data and generates motion data indicating the type of each element motion and its execution time from start time to end time; an evaluation unit that evaluates the element motions performed by the plurality of parts based on the execution timing of the element motions; and a display control unit that controls a display unit to display the evaluation together with the motion data while distinguishing the periods corresponding to different element motions.
According to the above aspect, by displaying the evaluation relating to the element motion together with the motion data, it is possible to give feedback to the operator as to which element motion is related to which evaluation, and thereby the operator can improve the motion more smoothly.
In the above aspect, the time-series data relating to the plurality of parts may include time-series data relating to a left hand and a right hand, and the evaluation unit may give a favorable evaluation when an element motion is executed in parallel by the left hand and the right hand.
According to the above aspect, giving a favorable evaluation when element motions are executed in parallel with the left hand and the right hand encourages the operator to perform the work in a shorter time.
In the above aspect, the evaluation unit may give an unfavorable evaluation when no motion related to the work is being performed at any of the plurality of parts.
According to the above aspect, giving an unfavorable evaluation when the operator is not performing a motion related to the work encourages the operator to reduce wasted time during the work.
In the above aspect, the display control unit may control the display unit to display an icon indicating the evaluation along the motion data.
According to the above-described aspect, the evaluation relating to the element action can be grasped at a glance by the icon.
In the above aspect, the display control unit may control the display unit to display the motion data in a display form that differs according to the evaluation.
According to the above aspect, by displaying the motion data in a different display form according to the evaluation, the evaluation relating to the element motion can be grasped at a glance.
In the above aspect, the display control unit may control the display unit to display a comment indicating the evaluation together with the motion data.
According to the above-described aspect, by displaying the comment indicating the evaluation, the details of the evaluation relating to the element action can be grasped.
Another aspect of the present invention provides a motion analysis method including: acquiring time-series data on a plurality of parts of a worker with respect to a work performed by the worker; analyzing the time-series data to generate motion data indicating the type and execution time of the element motion; evaluating the element actions performed by the plurality of parts based on the execution timing of the element actions; and controlling the display unit to display the evaluation and the motion data together so as to distinguish periods corresponding to different element motions.
According to the above-described aspect, feedback can be given to the operator as to what element action is associated with what evaluation, so that the operator can improve the action more smoothly.
A computer-readable storage medium according to another aspect of the present invention stores a motion analysis program for causing a motion analysis device to function as: an acquisition unit that acquires time-series data relating to a plurality of parts of a worker with respect to a work performed by the worker; an analysis unit that analyzes the time-series data and generates motion data indicating the type and execution time of each element motion; an evaluation unit that evaluates the element motions performed by the plurality of parts based on the execution timing of the element motions; and a display control unit that controls a display unit to display the evaluation together with the motion data so that the periods corresponding to different element motions can be distinguished.
According to the above-described aspect, feedback can be given to the operator as to what element action is associated with what evaluation, so that the operator can improve the action more smoothly.
Advantageous Effects of Invention
According to the present invention, it is possible to provide a motion analysis device, a motion analysis method, and a motion analysis program that enable an operator to improve his or her motions more smoothly.
Drawings
Fig. 1 is a diagram showing an outline of a motion analysis system according to an embodiment of the present invention.
Fig. 2 is a diagram showing functional blocks of the motion analysis device according to the present embodiment.
Fig. 3 is a diagram showing the physical configuration of the motion analysis device according to the present embodiment.
Fig. 4 is a diagram showing motion data generated by the motion analysis device of the present embodiment.
Fig. 5 is a diagram showing an example of a screen whose display is controlled by the motion analysis device according to the present embodiment.
Fig. 6 is a flowchart of a first example of the display control process executed by the motion analysis device according to the present embodiment.
Fig. 7 is a flowchart of a second example of the display control process executed by the motion analysis device according to the present embodiment.
Fig. 8 is a flowchart of a third example of the display control process executed by the motion analysis device according to the present embodiment.
Description of the symbols
10: motion analysis device
10a:CPU
10b:RAM
10c:ROM
10 d: communication unit
10 e: input unit
10 f: display unit
11: acquisition unit
12: analysis section
13: storage unit
13 a: motion data
14: evaluation unit
15: display control unit
20: video camera
21: photoelectric sensor
22: pressure sensor
100: motion analysis system
Detailed Description
Embodiments of the present invention are explained with reference to the drawings. In the drawings, the same or similar structures are denoted by the same reference numerals.
§ 1 Application Example
First, an example of a scenario to which the present invention is applied will be described with reference to fig. 1. Fig. 1 is a diagram showing an outline of an operation analysis system 100 according to an embodiment of the present invention. The motion analysis system 100 of the present embodiment includes: a camera 20 that captures a moving image relating to the movement of the operator performed in the work area R; a photoelectric sensor 21 for detecting the entry and exit of the hand of the operator into and out of a predetermined area; and a pressure sensor 22 for measuring a pressure applied to the predetermined region. Here, a moving image captured by the camera 20, a signal measured by the photoelectric sensor 21, and a signal measured by the pressure sensor 22 are examples of time-series data of the present invention, respectively. The working area R in this example is an area including the entire manufacturing line, but the working area R may be any area, and may be, for example, an area where a predetermined process is performed or an area where a predetermined element operation is performed. The element operation is a unit operation performed by an operator, and includes operations such as grasping a component, conveying the component, assembling and adjusting the component, and storing an assembled product.
In this example, a case where the first operator A1 and the second operator A2 perform predetermined work in the work area R will be described. Hereinafter, the first operator A1 and the second operator A2 are collectively referred to as the operator A.
The motion analysis system 100 includes a motion analysis device 10. The motion analysis device 10 acquires time-series data, such as a moving image, regarding the motions of a plurality of parts of the operator A, analyzes the time-series data, and generates motion data indicating the type of each element motion and its execution time from start time to end time. The motion analysis device 10 evaluates the element motions performed at the plurality of parts based on the execution timing of the element motions, and controls the display unit 10f to display the evaluation together with the motion data while distinguishing the periods corresponding to different element motions.
The display unit 10f displays the evaluation together with the motion data while distinguishing the periods corresponding to different element motions. The display unit 10f may also display a comment indicating the evaluation. Further, the display unit 10f may reproduce the moving image of the motions of the operator A for each of the plurality of element motions.
According to the motion analysis device 10 of the present embodiment, by displaying the evaluation relating to the element motion together with the motion data, it is possible to give feedback to the operator as to what element motion is related to what evaluation, and the operator can improve the motion more smoothly.
§ 2 Configuration Example
[Functional Configuration]
Fig. 2 is a diagram showing functional blocks of the motion analysis device 10 according to the present embodiment. The motion analysis device 10 includes an acquisition unit 11, an analysis unit 12, a storage unit 13, an evaluation unit 14, and a display control unit 15.
<Acquisition Unit>
The acquisition unit 11 acquires time-series data relating to the motions of a plurality of parts of the operator A with respect to the work performed by the operator A. The time-series data includes the moving image captured by the camera 20, the signal measured by the photoelectric sensor 21, and the signal measured by the pressure sensor 22.
<Analysis Unit>
The analysis unit 12 analyzes the time-series data and generates motion data indicating the type of each element motion and its execution time from start time to end time. The types of element motions include, for example, grasping, conveying, adjusting, and storing parts, but may include other types of motions, and the element motions can be set arbitrarily. The start time and end time of an element motion may be expressed as a time of day or as the elapsed time from the start of the time-series data.
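As a rough illustration of such motion data, the following Python sketch models one record as the type of an element motion plus its start and end times; the class name, field names, and sample values are assumptions for illustration only, not a format specified by this disclosure.

```python
from dataclasses import dataclass

@dataclass
class ElementMotion:
    """One record of motion data: the type of an element motion and its execution time."""
    kind: str      # e.g. "grasp", "convey", "adjust", "store", or "NA"
    start: float   # start time (or elapsed time from the start of the time-series data)
    end: float     # end time

    @property
    def duration(self) -> float:
        """Execution time from the start time to the end time of the element motion."""
        return self.end - self.start

# Motion data for one part (e.g. the left hand) is a time-ordered list of such records.
left_hand = [ElementMotion("grasp", 2.0, 4.0), ElementMotion("convey", 4.0, 5.0)]
```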
<Storage Unit>
The storage unit 13 stores the motion data 13a generated by the analysis unit 12. The storage unit 13 may also store the time-series data.
<Evaluation Unit>
The evaluation unit 14 evaluates the element motions performed by the plurality of parts based on the execution timing of the element motions. When the time-series data on the plurality of parts includes time-series data on the left hand and the right hand, the evaluation unit 14 can give a favorable evaluation when an element motion is executed in parallel by the left hand and the right hand. Giving a favorable evaluation when element motions are executed in parallel with the left hand and the right hand encourages the operator to perform the work in a shorter time.
The evaluation unit 14 may give an unfavorable evaluation when no motion related to the work is performed at any of the plurality of parts. Giving an unfavorable evaluation when the operator is not performing a motion related to the work encourages the operator to reduce wasted time during the work.
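A minimal sketch of these two evaluation rules follows. It treats motion data for each hand as a list of (kind, start, end) tuples, gives a favorable evaluation for any span where the left and right hands both execute a work-related element motion, and an unfavorable one where neither does; the function names, the tuple format, and the "NA" convention are assumptions for illustration, not the actual implementation.

```python
def overlapping_spans(a, b):
    """Yield (start, end, kind_a, kind_b) for every time span where a motion in a and a
    motion in b overlap. Each motion is a (kind, start, end) tuple; "NA" means that no
    work-related motion is being performed."""
    for kind_a, sa, ea in a:
        for kind_b, sb, eb in b:
            start, end = max(sa, sb), min(ea, eb)
            if start < end:
                yield start, end, kind_a, kind_b

def evaluate(left, right):
    """Return (start, end, evaluation) spans for left-hand and right-hand motion data."""
    spans = []
    for start, end, kl, kr in overlapping_spans(left, right):
        if kl != "NA" and kr != "NA":
            spans.append((start, end, "favorable"))    # both hands work in parallel
        elif kl == "NA" and kr == "NA":
            spans.append((start, end, "unfavorable"))  # neither hand performs work-related motion
    return spans
```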
<Display Control Unit>
The display control unit 15 controls the display unit 10f to display the evaluation together with the motion data while distinguishing the periods corresponding to different element motions. Distinguishing the periods corresponding to different element motions includes displaying them in different display forms. Displaying the evaluation together with the motion data includes displaying the motion data in time series and displaying the evaluation so that it corresponds to the execution timing of the element motion represented by the motion data.
The display control unit 15 may control the display unit 10f to display an icon representing the evaluation along the motion data. The icon may visually indicate whether the evaluation is favorable or unfavorable, and may include, for example, a circle indicating a favorable evaluation and a cross indicating an unfavorable evaluation. In this way, the evaluation relating to the element motion can be grasped at a glance from the icon.
The display control unit 15 may perform control so that the motion data is displayed on the display unit 10f in a display form that differs according to the evaluation. For example, the display control unit 15 may display the motion data corresponding to a highly evaluated element motion in blue and the motion data corresponding to a poorly evaluated element motion in red. By displaying the motion data in a display form that differs according to the evaluation, the evaluation relating to the element motion can be grasped at a glance.
The display control unit 15 may also perform control so that the display unit 10f displays a comment indicating the evaluation together with the motion data. The comment may include praise, criticism, the reason for the evaluation, and suggestions for improving the evaluation. By displaying a comment indicating the evaluation in this manner, the details of the evaluation relating to the element motion can be grasped.
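The three display options just described (icon, display form, and comment) can be pictured as a simple mapping from each evaluation to presentation attributes. In the sketch below, the circle/cross icons, blue/red colors, and comment texts are only examples taken from the description; the dictionary and function names are assumptions.

```python
# Hypothetical presentation attributes for each evaluation; values are illustrative only.
PRESENTATION = {
    "favorable":   {"icon": "O", "color": "blue",
                    "comment": "Two-handed work: element motions executed in parallel."},
    "unfavorable": {"icon": "X", "color": "red",
                    "comment": "Wasted time: no work-related motion on either hand."},
}

def describe(evaluation: str) -> str:
    """Return a one-line caption combining the icon and the comment for an evaluated span."""
    p = PRESENTATION.get(evaluation)
    return f"[{p['icon']}] {p['comment']}" if p else ""
```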
[Hardware Configuration]
Fig. 3 is a diagram showing the physical configuration of the motion analysis device 10 according to the present embodiment. The motion analysis device 10 includes a Central Processing Unit (CPU) 10a corresponding to an arithmetic unit, a Random Access Memory (RAM) 10b corresponding to a storage unit, a Read Only Memory (ROM) 10c corresponding to a storage unit, a communication unit 10d, an input unit 10e, and a display unit 10f. These components are connected to one another via a bus so that data can be transmitted and received. In this example, a case where the motion analysis device 10 consists of one computer is described, but the motion analysis device 10 may be implemented by combining a plurality of computers. The configuration shown in Fig. 3 is an example; the motion analysis device 10 may have components other than these, or may lack some of them.
The CPU 10a is a control unit that controls the execution of the programs stored in the RAM 10b or the ROM 10c and performs data calculation and processing. The CPU 10a is an arithmetic unit that executes a program (the motion analysis program) for analyzing time-series data relating to the work performed by an operator to generate motion data, and for controlling the display unit to display the motion data and its evaluation. The CPU 10a receives various data from the input unit 10e or the communication unit 10d, and displays the calculation results on the display unit 10f or stores them in the RAM 10b.
The RAM 10b is the part of the storage unit in which data can be rewritten, and may include, for example, semiconductor memory elements. The RAM 10b can store data such as the programs executed by the CPU 10a and the motion data. These are examples; the RAM 10b may store other data, or may not store some of the above.
The ROM 10c is the part of the storage unit from which data can be read, and may include, for example, semiconductor memory elements. The ROM 10c may store, for example, the motion analysis program or data that is not to be rewritten.
The communication unit 10d is an interface for connecting the motion analysis device 10 to other devices. The communication unit 10d can be connected to a communication network such as the Internet.
The input unit 10e receives data input from a user, and may include a keyboard and a touch panel, for example.
The display unit 10f visually displays the calculation results obtained by the CPU 10a, and may include, for example, a Liquid Crystal Display (LCD). The display unit 10f can display the motion data and its evaluation.
The motion analysis program may be stored in a computer-readable storage medium such as the RAM 10b or the ROM 10c, or may be provided via a communication network connected through the communication unit 10d. In the motion analysis device 10, the CPU 10a executes the motion analysis program to realize the various operations described with reference to Fig. 2. These physical components are examples and need not be independent components. For example, the motion analysis device 10 may include a Large-Scale Integrated circuit (LSI) in which the CPU 10a is integrated with the RAM 10b or the ROM 10c.
§ 3 Operation Example
Fig. 4 is a diagram showing motion data generated by the motion analysis device 10 of the present embodiment. The figure shows an example of left-hand motion data D1 and right-hand motion data D2. The characters t1 to t12 in the figure indicate time points arranged in time series.
The left-hand motion data D1 and the right-hand motion data D2 include a column of "element motion" indicating the type of element motion, a column of "start time" indicating the start time of the element motion, and a column of "end time" indicating the end time of the element motion.
For example, in the left-hand motion data D1, the element motion with start time "t2" and end time "t4" is "grasping". In the right-hand motion data D2, the element motion with start time "t1" and end time "t3" is "grasping". This shows that the grasping motion starts with the right hand and then with the left hand, and that the right hand finishes grasping before the left hand does.
In the right-hand motion data D2, the element motion with start time "t7" and end time "t8" is "NA", indicating that no motion related to the work is being performed. An element motion of "NA" includes cases where the operator A has stopped or is performing a motion unrelated to the predetermined element motions.
Fig. 5 is a diagram showing an example of a screen on which display control is performed by the operation analysis device 10 according to the present embodiment. In this example, the motion analysis device 10 evaluates the left-hand motion data D1 and the right-hand motion data D2 shown in fig. 4, and displays the evaluation and motion data together.
For example, bars representing grasping show that the grasping motion is performed with the right hand from time t1 to time t3 and with the left hand from time t2 to time t4. From time t2 to time t3, grasping is therefore performed in parallel with the right hand and the left hand. The motion analysis device 10 detects that the element motion of grasping is executed by the right hand and the left hand in parallel, and gives a favorable evaluation. Corresponding to this evaluation, a comment "two-handed work", a circle icon, and an icon representing two-handed work are displayed. The same icons are also displayed from time t4 to time t5, during which the element motions of conveyance are executed in parallel by the right hand and the left hand.
On the other hand, from time t7 to time t8, neither the right hand nor the left hand performs a motion related to the work; in other words, the motion data for both hands are NA from time t7 to time t8. The motion analysis device 10 gives an unfavorable evaluation when neither the right hand nor the left hand performs a motion related to the work. Corresponding to this evaluation, a comment indicating wasted time, a cross icon, and an icon representing idle hands are displayed.
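Applying the evaluate sketch from the configuration example to the intervals described for Fig. 4 and Fig. 5 reproduces the favorable spans t2 to t3 and t4 to t5 and the unfavorable span t7 to t8. Here t1 to t8 are represented by the numbers 1.0 to 8.0 purely for illustration, and the evaluate function is the assumed sketch above, not the device's actual implementation.

```python
# Times t1..t8 are represented as 1.0..8.0 for illustration only.
left_hand = [("grasp", 2.0, 4.0), ("convey", 4.0, 5.0), ("NA", 7.0, 8.0)]
right_hand = [("grasp", 1.0, 3.0), ("convey", 4.0, 5.0), ("NA", 7.0, 8.0)]

print(evaluate(left_hand, right_hand))
# [(2.0, 3.0, 'favorable'),    grasping with both hands from t2 to t3
#  (4.0, 5.0, 'favorable'),    conveying with both hands from t4 to t5
#  (7.0, 8.0, 'unfavorable')]  neither hand works from t7 to t8
```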
Fig. 6 is a flowchart of a first example of the display control process executed by the motion analysis device 10 according to the present embodiment. The first example of the display control process controls the display unit 10f to display the motion data.
First, the motion analysis device 10 reads flow data from the storage unit 13 (S10). Here, the flow data is motion data indicating an ideal flow of the work of the operator A. The flow data may be read from the storage unit 13, or from an external storage device.
Next, the motion analysis device 10 repeatedly executes the following processing S111 and processing S112 until the end of the flow data (S11). The motion analysis device 10 sets a display width in accordance with the execution time from the start time to the end time of the element motion (S111), and displays data in left alignment while dividing the periods corresponding to different element motions (S112).
By executing the first example of the display control processing, the flow data is displayed on the display unit 10f in time series and with a width corresponding to the execution time.
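The following is a minimal sketch of this first display-control process: segment widths are proportional to the execution time of each element motion, and segments are laid out left-aligned in time order. The pixel scale, the function name, and the sample flow data are illustrative assumptions.

```python
def layout_segments(flow_data, pixels_per_unit_time=20):
    """Lay out left-aligned display segments whose widths are proportional to the
    execution time of each element motion (steps S111-S112, roughly).
    flow_data: a time-ordered list of (kind, start, end) tuples."""
    segments, x = [], 0
    for kind, start, end in flow_data:
        width = int((end - start) * pixels_per_unit_time)  # S111: width from execution time
        segments.append({"kind": kind, "x": x, "width": width})
        x += width                                         # S112: left-aligned, no gaps
    return segments

# A hypothetical ideal flow: grasp for 2 time units, convey for 1, adjust for 3.
print(layout_segments([("grasp", 0.0, 2.0), ("convey", 2.0, 3.0), ("adjust", 3.0, 6.0)]))
```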
Fig. 7 is a flowchart of a second example of the display control process executed by the motion analysis device 10 according to the present embodiment. The second example of the display control process is executed after the first example and controls the display unit to display the evaluation of the element motions together with the motion data.
The motion analysis device 10 reads the motion data of the right hand (S12) and the motion data of the left hand (S13). The right-hand motion data and the left-hand motion data may be read in either order.
Next, the motion analysis device 10 repeatedly executes the following processing S141 to S145 until the end of the motion data (S14). The motion analysis device 10 sets a display width in accordance with the execution time from the start time to the end time of the element motion (S141), and displays data in left alignment by dividing the periods corresponding to different element motions (S142).
The motion analysis device 10 evaluates the element motions performed by both hands (S143). If the evaluation of an element motion is favorable or unfavorable (S144: yes), an icon indicating the evaluation is displayed along the data (S145). Otherwise (S144: no), the motion analysis device 10 proceeds to the next iteration without displaying an icon.
By executing the second example of the display control processing, the right-hand and left-hand motion data are displayed on the display unit 10f in time series and with a width corresponding to the execution time, and an icon indicating the evaluation is displayed on the display unit 10f together with the motion data.
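A rough sketch of the loop S141 to S145 for this second example is shown below: each segment of motion data gets an icon wherever its time span received a favorable or unfavorable evaluation. The segment dictionary keys, icon strings, and sample values are assumptions for illustration only.

```python
ICONS = {"favorable": "O", "unfavorable": "X"}   # circle / cross, as in the description

def annotate_with_icons(segments, evaluated_spans):
    """Attach an icon to every display segment whose time span received a favorable or
    unfavorable evaluation (steps S143-S145 of the second example, roughly).
    segments: dicts with "kind", "start", "end"; evaluated_spans: (start, end, evaluation)."""
    for seg in segments:
        seg["icon"] = None                          # S144: no -> no icon is displayed
        for start, end, evaluation in evaluated_spans:
            if max(seg["start"], start) < min(seg["end"], end) and evaluation in ICONS:
                seg["icon"] = ICONS[evaluation]     # S145: display the icon along the data
    return segments

# Example using spans like those evaluated for Fig. 5 (times are illustrative numbers).
right = [{"kind": "grasp", "start": 1.0, "end": 3.0}, {"kind": "NA", "start": 7.0, "end": 8.0}]
print(annotate_with_icons(right, [(2.0, 3.0, "favorable"), (7.0, 8.0, "unfavorable")]))
```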
Fig. 8 is a flowchart of a third example of the display control process executed by the motion analysis device 10 according to the present embodiment. The third example of the display control process is executed after the first example and controls the display unit to change the display form of the motion data in accordance with the evaluation of the element motions.
The motion analysis device 10 reads the motion data of the right hand (S12) and the motion data of the left hand (S13). The right-hand motion data and the left-hand motion data may be read in either order.
Next, the motion analysis device 10 repeatedly executes the following processing S141 to S145 until the end of the motion data (S14). The motion analysis device 10 sets a display width in accordance with the execution time from the start time to the end time of the element motion (S141), and displays data in left alignment by dividing the periods corresponding to different element motions (S142).
The motion analysis device 10 then evaluates the element motions performed by both hands (S143), and if the evaluation of an element motion is favorable or unfavorable (S144: yes), changes the display form of the motion data according to the evaluation (S145). For example, the motion analysis device 10 may display the motion data corresponding to a highly evaluated element motion in blue and the motion data corresponding to a poorly evaluated element motion in red. Otherwise (S144: no), the motion analysis device 10 does not change the display form of the motion data, displaying it in black, for example, and proceeds to the next iteration.
In the third example of the display control process, the motion data of the right hand and the left hand are displayed on the display unit 10f in time series and with a width corresponding to the execution time, and the motion data is displayed on the display unit 10f in a display form corresponding to the evaluation of the element motion.
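For this third example, the same loop changes the display form of each segment instead of attaching an icon. The sketch below colors a segment blue for a favorable evaluation, red for an unfavorable one, and black otherwise, as in the description; the colors, names, and segment structure are illustrative assumptions.

```python
EVALUATION_COLORS = {"favorable": "blue", "unfavorable": "red"}

def color_segments(segments, evaluated_spans):
    """Set the display form (here, a color) of each segment from the evaluation of its
    time span, as in steps S143-S145 of the third example."""
    for seg in segments:
        seg["color"] = "black"                      # S144: no -> default display form
        for start, end, evaluation in evaluated_spans:
            if max(seg["start"], start) < min(seg["end"], end):
                seg["color"] = EVALUATION_COLORS.get(evaluation, "black")
    return segments
```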
Embodiments of the present invention can also be described as in the following notes. However, embodiments of the present invention are not limited to the forms described in these notes, and may take forms in which the descriptions of the notes are substituted or combined.
[Note 1]
A motion analysis apparatus 10 comprising:
an acquisition unit 11 that acquires time-series data relating to operations of a plurality of parts of a worker with respect to a work performed by the worker;
an analysis unit 12 that analyzes the time-series data and generates motion data indicating the type of the element motion and the execution time from the start time to the end time of the element motion;
an evaluation unit 14 that evaluates the element motion performed by the plurality of parts based on an execution timing of the element motion; and
a display control unit 15 that controls the display unit to display the evaluation together with the motion data while distinguishing periods corresponding to different element motions.
[Note 2]
The motion analyzing apparatus 10 according to note 1, wherein
The time-series data associated with the plurality of sites comprises time-series data associated with a left hand and a right hand,
the evaluation unit 14 gives a favorable evaluation when the element motion is executed in parallel by the left hand and the right hand.
[Note 3]
The motion analyzing apparatus 10 according to note 1 or 2, wherein
The evaluation unit 14 gives an unfavorable evaluation when none of the plurality of parts performs a motion related to the work.
[Note 4]
The motion analysis device 10 according to any one of notes 1 to 3, wherein
The display control unit 15 controls the display unit to display an icon representing the evaluation along the motion data.
[Note 5]
The motion analysis device 10 according to any one of notes 1 to 3, wherein
The display control unit 15 controls the display unit to display the motion data in a display form that differs according to the evaluation.
[Note 6]
The motion analysis device 10 according to any one of notes 1 to 5, wherein
The display control unit 15 controls the display unit to display the comment indicating the evaluation together with the motion data.
[Note 7]
A motion analysis method, comprising:
acquiring time-series data on a plurality of parts of a worker with respect to a work performed by the worker;
analyzing the time-series data to generate motion data indicating the type and execution time of the element motion;
evaluating the element actions performed by the plurality of parts based on execution timings of the element actions; and
and controlling a display unit to display the evaluation together with the motion data so that the periods corresponding to different element motions can be distinguished.
[Note 8]
A motion analysis program for causing an arithmetic unit included in a motion analysis device 10 to function as:
an acquisition unit 11 that acquires time-series data relating to a plurality of sites of a worker with respect to a work performed by the worker;
an analysis unit 12 for analyzing the time-series data and generating motion data indicating the type and execution time of the element motion;
an evaluation unit 14 that evaluates the element motion performed by the plurality of parts based on an execution timing of the element motion; and
a display control unit 15 that controls the display unit to display the evaluation together with the motion data so that the periods corresponding to different element motions can be distinguished.

Claims (9)

1. A motion analysis apparatus comprising:
an acquisition unit that acquires time-series data relating to the operations of a plurality of parts of a worker with respect to a work performed by the worker;
an analysis unit that analyzes the time-series data and generates motion data indicating a type of a component motion and an execution time from a start time to an end time of the component motion;
an evaluation unit that evaluates the element motion performed by the plurality of parts based on an execution timing of the element motion; and
and a display control unit that controls the display unit to display the evaluation together with the motion data while distinguishing periods corresponding to different element motions.
2. The motion analysis apparatus according to claim 1, wherein
The time-series data associated with the plurality of sites comprises time-series data associated with a left hand and a right hand,
the evaluation unit gives a favorable evaluation when the element motion is executed in parallel by the left hand and the right hand.
3. The motion analysis apparatus according to claim 1, wherein
The evaluation unit gives an unfavorable evaluation when none of the plurality of parts performs a motion related to the work.
4. The motion analysis apparatus according to claim 2, wherein
The evaluation unit gives an unfavorable evaluation when none of the plurality of parts performs a motion related to the work.
5. The motion analysis apparatus according to any one of claims 1 to 4, wherein
The display control unit controls the display unit to display an icon representing the evaluation along the motion data.
6. The motion analysis apparatus according to any one of claims 1 to 4, wherein
The display control unit performs control so that the motion data is displayed on the display unit in a display form that differs according to the evaluation.
7. The motion analysis apparatus according to any one of claims 1 to 4, wherein
The display control unit controls the display unit to display a comment indicating the evaluation together with the motion data.
8. A motion analysis method, comprising:
acquiring time-series data on a plurality of parts of a worker with respect to a work performed by the worker;
analyzing the time-series data to generate motion data indicating the type and execution time of the element motion;
evaluating the element actions performed by the plurality of parts based on execution timings of the element actions; and
and controlling a display unit to display the evaluation together with the motion data so that the periods corresponding to different element motions can be distinguished.
9. A computer-readable storage medium storing a motion analysis program for causing an arithmetic unit included in a motion analysis device to function as:
an acquisition unit that acquires time-series data relating to a plurality of sites of a worker with respect to a work performed by the worker;
an analysis unit that analyzes the time-series data and generates motion data indicating the type and execution time of the element motion;
an evaluation unit that evaluates the element motion performed by the plurality of parts based on an execution timing of the element motion; and
and a display control unit that controls a display unit to display the evaluation together with the motion data so that the periods corresponding to different element motions can be distinguished.
CN202011089015.1A 2019-11-08 2020-10-13 Motion analysis device, motion analysis method, and computer-readable storage medium Pending CN112784667A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019203039A JP7385826B2 (en) 2019-11-08 2019-11-08 Motion analysis device, motion analysis method, and motion analysis program
JP2019-203039 2019-11-08

Publications (1)

Publication Number Publication Date
CN112784667A true CN112784667A (en) 2021-05-11

Family

ID=75750472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011089015.1A Pending CN112784667A (en) 2019-11-08 2020-10-13 Motion analysis device, motion analysis method, and computer-readable storage medium

Country Status (3)

Country Link
US (1) US11700450B2 (en)
JP (1) JP7385826B2 (en)
CN (1) CN112784667A (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7339604B2 (en) * 2019-11-12 2023-09-06 オムロン株式会社 Motion recognition device, motion recognition method, motion recognition program, and motion recognition system
JP2023140036A (en) * 2022-03-22 2023-10-04 パナソニックIpマネジメント株式会社 Operation analysis device and operation analysis method
JP2023140047A (en) * 2022-03-22 2023-10-04 パナソニックIpマネジメント株式会社 Operation analysis device and operation analysis method

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003006286A (en) 2001-06-19 2003-01-10 Sharp Corp Work added-value evaluation system, work added-value evaluation method, and program for implementing method with computer
US8560267B2 (en) * 2009-09-15 2013-10-15 Imetrikus, Inc. Identifying one or more activities of an animate or inanimate object
JP5318834B2 (en) 2010-09-29 2013-10-16 日立電線株式会社 Optical fiber end processing method and optical fiber end processing apparatus
JP5856456B2 (en) 2011-12-02 2016-02-09 株式会社日立製作所 Human flow prediction apparatus and method
CN105828894A (en) * 2013-12-27 2016-08-03 索尼公司 Analysis Device, Recording Medium, And Analysis Method
JP2015197847A (en) 2014-04-02 2015-11-09 富士電機株式会社 Work analysis system, work analysis method, and work analysis program
JP6326701B2 (en) * 2014-11-04 2018-05-23 国立大学法人宇都宮大学 Cooperative movement evaluation device
JP6897037B2 (en) 2016-09-15 2021-06-30 オムロン株式会社 Workability evaluation device
JP6710644B2 (en) * 2017-01-05 2020-06-17 株式会社東芝 Motion analysis device, motion analysis method and program
JP2019101919A (en) 2017-12-06 2019-06-24 キヤノン株式会社 Information processor, information processing method, computer program, and storage medium
JP6561305B2 (en) 2017-12-24 2019-08-21 Gva Tech株式会社 Legal document review program, legal document review method, and legal document review system
JP6535778B1 (en) 2018-03-07 2019-06-26 社会福祉法人兵庫県社会福祉事業団 Motor ability evaluation system
JP7385825B2 (en) * 2019-11-07 2023-11-24 オムロン株式会社 Motion analysis device, motion analysis method, and motion analysis program
JP7421745B2 (en) * 2019-11-12 2024-01-25 オムロン株式会社 Motion analysis device, motion analysis method, and motion analysis program

Also Published As

Publication number Publication date
US11700450B2 (en) 2023-07-11
US20210144303A1 (en) 2021-05-13
JP2021077070A (en) 2021-05-20
JP7385826B2 (en) 2023-11-24

Similar Documents

Publication Publication Date Title
CN112784667A (en) Motion analysis device, motion analysis method, and computer-readable storage medium
US5881288A (en) Debugging information generation system
JP2020194242A (en) Learning device, learning method, learning program, automatic control device, automatic control method, and automatic control program
CN112749615A (en) Skill evaluation device, skill evaluation method, and recording medium
CN112784666A (en) Motion analysis device, motion analysis method, and computer-readable storage medium
CN110490034B (en) Action analysis device, action analysis method, recording medium, and action analysis system
CN112861597A (en) Motion analysis device, motion analysis method, and storage medium
JP2005032015A (en) Electronic device and program
JP2009237672A (en) Stress analysis device and method
JP2008165488A (en) Unit, method and program for assembly operability evaluation
US20090058858A1 (en) Electronic apparatus having graph display function
CN112784668A (en) Element work division device, method, system, and storage medium
CN112861596A (en) Motion recognition device, motion recognition method, storage medium, and motion recognition system
EP2477096A1 (en) Gesture determination device and method of same
JP4811177B2 (en) Graph display device and graph display processing program
CN102955763B (en) Display packing and display device
JP2002056037A (en) Contact analyzing device
WO2022153597A1 (en) Task level conversion device, task level conversion method, and task level conversion program
EP4088894A1 (en) Robot control device, robot control system, robot control method, and robot control program
JP7068941B2 (en) Program comparison device, program comparison method and comparison program
JP3257656B2 (en) Debug device
JP4165453B2 (en) Electronics
WO2020136924A1 (en) Motion analysis device, motion analysis method, and motion analysis program
JP2785950B2 (en) 2D graph drawing device
JP4333266B2 (en) Graph display control device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination