WO2019138877A1 - Motion-analyzing system, motion-analyzing device, motion analysis method, and motion analysis program - Google Patents


Info

Publication number
WO2019138877A1
WO2019138877A1 (PCT/JP2018/047812)
Authority
WO
WIPO (PCT)
Prior art keywords: operation information, motion, worker, extracted
Application number: PCT/JP2018/047812
Other languages: French (fr), Japanese (ja)
Inventors: 浩臣 音田, 修一 後藤, 直浩 河合
Original Assignee: オムロン株式会社 (OMRON Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by オムロン株式会社 (OMRON Corporation)
Publication of WO2019138877A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems electric
    • G05B 19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/50: Depth or shape recovery
    • G06T 7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • The present invention relates to a motion analysis system, a motion analysis device, a motion analysis method, and a motion analysis program.
  • On a product manufacturing line, one or more cameras may be installed so that a worker's motion can be recorded as images or moving images.
  • In addition, motion capture that can measure the three-dimensional positions and movement speeds of joints without attaching a tracker has begun to spread, so that, besides images and moving images, motion information quantitatively indicating the worker's motion may also be obtained.
  • Patent Document 1 describes a work motion analysis system that acquires a moving image of a worker and an operation signal representing changes in feature amounts of the worker's motion, compares the operation signal obtained when the work result is good with the operation signal obtained when it is poor to extract a difference, and displays the moving image corresponding to the difference.
  • The present invention therefore provides a motion analysis system, a motion analysis device, a motion analysis method, and a motion analysis program that can easily extract a scene in which a specific motion is performed.
  • A motion analysis system according to one aspect of the present invention includes: a measurement unit that measures motion information quantitatively indicating the motion of one or more workers performed in a work area; an imaging unit that captures a moving image including a scene in which a worker is performing a motion; a storage unit that stores reference motion information indicating a reference motion serving as a comparison reference for the worker's motion; an extraction unit that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition; and a display unit that uses the moving image to display a scene in which a worker is performing the motion indicated by the extracted motion information.
  • Here, the motion information is information indicating the movement of the worker's body, and may be information indicating the displacement of representative positions of the worker's body.
  • A representative position of the worker's body may be a single position on the body, but typically there is more than one.
  • The motion information can be measured, for example, by projecting pattern light onto the worker and extracting feature points from a captured moving image.
  • The reference motion may be a standard motion that the worker should follow, or a non-standard motion of the worker such as a mistake.
  • The predetermined condition may be a condition that the difference between the motion information and the reference motion information is equal to or greater than a threshold.
  • Alternatively, the predetermined condition may be a condition that the difference between the motion information and the reference motion information is equal to or less than a threshold.
  • The reference motion information only needs to be comparable with the motion information and need not have the same data format as the motion information.
  • The reference motion information and the motion information may be time-series data.
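As an illustrative sketch only (not part of the claimed subject matter), the two threshold conditions above could be evaluated over time-series data as follows; the function name, the Euclidean deviation measure, and the sample layout are assumptions:

```python
def extract_by_threshold(motion_info, reference_info, threshold, above=True):
    """Return indices of motion samples whose deviation from the reference
    meets the predetermined condition (>= threshold if above, else <= threshold)."""
    extracted = []
    for i, (m, r) in enumerate(zip(motion_info, reference_info)):
        # Euclidean distance between a measured sample and the reference sample
        diff = sum((a - b) ** 2 for a, b in zip(m, r)) ** 0.5
        if (diff >= threshold) if above else (diff <= threshold):
            extracted.append(i)
    return extracted
```

With `above=True` the sketch corresponds to extracting deviations from a standard motion; with `above=False`, to extracting samples close to a mistake-case reference.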
  • Here, “a scene in which the worker is executing the motion indicated by the extracted motion information” may be the scene of the time zone, within one moving image, during which the worker is executing the motion indicated by the extracted motion information, or it may be the scene of the subset of a plurality of moving images in which the worker is executing that motion.
  • The motion indicated by the motion information may be a motion corresponding to the quantitative information on the worker's motion indicated by the motion information.
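As a sketch of how such a time-zone scene could be located in a moving image (illustrative only; the frame rate and the function name are assumptions):

```python
def scene_frame_range(start_sec, end_sec, fps=30):
    """Map the time zone [start_sec, end_sec] of extracted motion
    information onto the frame indices of the corresponding moving image."""
    return range(int(start_sec * fps), int(end_sec * fps) + 1)
```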
  • According to this aspect, from the motion information quantitatively indicating the worker's motion, the motion information satisfying the predetermined condition in comparison with the reference motion information is extracted, and the scene in which the worker is executing the motion indicated by the extracted motion information is displayed using the moving image, so that a scene in which a specific motion is performed can be easily extracted.
  • In the above aspect, the measurement unit may measure a plurality of pieces of motion information respectively indicating the motions of a plurality of workers, and the display unit may display information identifying the worker for whom the extracted motion information was measured.
  • According to this aspect, the motion information satisfying the predetermined condition in comparison with the reference motion information is extracted, and by displaying information identifying the worker for whom the extracted motion information was measured, it becomes possible to identify which of the plurality of workers executed the specific motion satisfying the predetermined condition. Therefore, for example, the burden of confirming who performed a particular motion can be reduced.
  • In the above aspect, the imaging unit may include a first imaging unit that captures a first moving image of the work area and a second imaging unit that captures a second moving image of a portion of the work area, and the display unit may use the first moving image and the second moving image to display the scene of the time zone in which the worker is executing the motion indicated by the extracted motion information.
  • According to this aspect, by displaying the scene of that time zone from the first moving image of the work area, the specific motion satisfying the predetermined condition can be confirmed as a whole, and by displaying the scene of that time zone from the second moving image of a portion of the work area, the details of the specific motion can be confirmed. Therefore, for example, even when the moving images become long, the scene of the time zone in which the specific motion was performed can be easily extracted, and the load of the extraction work can be reduced.
  • In the above aspect, the second imaging unit may capture a plurality of second moving images of a plurality of portions of the work area, an estimation unit may estimate the position in the work area at which the motion indicated by the extracted motion information was performed, and the display unit may display, among the plurality of second moving images, the second moving image captured at the estimated position.
  • In the above aspect, the reference motion information may be determined for each of a plurality of processes, the extraction unit may compare the motion information with the reference motion information for each of the plurality of processes and extract motion information satisfying the predetermined condition, and the display unit may display information identifying the process indicated by the extracted motion information.
  • According to this aspect, the motion information satisfying the predetermined condition in comparison with the reference motion information determined for each of the plurality of processes is extracted, and by displaying information identifying the process indicated by the extracted motion information, it is possible to confirm in which process the specific motion satisfying the predetermined condition was performed. Therefore, for example, the burden of confirming in which process a specific motion was performed can be reduced.
  • In the above aspect, the estimation unit may estimate the position in the work area at which the worker is executing the motion indicated by the extracted motion information, based on the process indicated by the extracted motion information.
  • According to this aspect, by estimating that position based on the process indicated by the extracted motion information, the second moving image captured at the position where that process is performed can be displayed among the plurality of second moving images captured at a plurality of positions in the work area, and the details of the motion performed in that process can be confirmed. Therefore, for example, the burden of confirming where, and in which process, a specific motion was performed can be reduced.
  • In the above aspect, the motion information and the reference motion information may each include coordinate values of the worker's joints, and the extraction unit may compare the coordinate values included in the motion information with the coordinate values included in the reference motion information to extract motion information satisfying the predetermined condition.
  • According to this aspect, the degree to which the motion information deviates from the reference motion information can be accurately evaluated, and motion information satisfying the predetermined condition can be appropriately extracted.
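A minimal sketch of such a joint-coordinate comparison (illustrative only; the mean Euclidean deviation is one possible measure, not the one prescribed by the specification):

```python
def joint_deviation(motion_joints, reference_joints):
    """Mean Euclidean deviation over corresponding joints, each joint
    given as an (x, y, z) coordinate value."""
    total = 0.0
    for (mx, my, mz), (rx, ry, rz) in zip(motion_joints, reference_joints):
        total += ((mx - rx) ** 2 + (my - ry) ** 2 + (mz - rz) ** 2) ** 0.5
    return total / len(motion_joints)
```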
  • In the above aspect, the estimation unit may estimate the position in the work area at which the worker is executing the motion indicated by the extracted motion information, based on the coordinate values included in the extracted motion information.
  • According to this aspect, by estimating that position based on the joint coordinate values included in the extracted motion information, the position at which the motion indicated by the extracted motion information was performed can be accurately estimated.
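One way such an estimation could be sketched (illustrative only; the centroid-in-region rule and the region layout are assumptions):

```python
def estimate_position(joint_coords, regions):
    """Estimate where in the work area a motion was performed: take the
    centroid of the joint coordinates and find the enclosing region.
    `regions` maps a region name to an (x_min, x_max, y_min, y_max) box."""
    xs = [p[0] for p in joint_coords]
    ys = [p[1] for p in joint_coords]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    for name, (x0, x1, y0, y1) in regions.items():
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return name
    return None  # centroid fell outside every known region
```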
  • In the above aspect, the reference motion information may be determined for each of a plurality of element motions, and the estimation unit may estimate the position in the work area at which the worker is executing the motion indicated by the extracted motion information, based on the coordinate values included in the extracted motion information and the element motion indicated by the extracted motion information.
  • Here, an element motion is a unit motion performed by the worker and includes, for example, motions such as picking parts, arranging parts, fixing parts, and packing products.
  • An element motion may be a component of the worker's motion, and a combination of one or more element motions may constitute a series of the worker's motions.
  • According to this aspect, by estimating the position in the work area at which the worker is executing the motion indicated by the extracted motion information, the second moving image captured at the position where the element motion indicated by the extracted motion information was performed can be displayed among the plurality of second moving images captured at a plurality of positions in the work area, and the details of the element motion can be confirmed. Therefore, for example, the burden of confirming where, and for which element motion, a particular motion was performed can be reduced.
  • In the above aspect, the display unit may display both the scene in which the worker is executing the motion indicated by the extracted motion information, using the moving image, and a moving image in which a worker is executing the reference motion indicated by the reference motion information.
  • The display unit may display the two side by side on one screen at the same time, superimpose them on one screen at the same time, or display them alternately in succession.
  • According to this aspect, by displaying both the scene of the extracted motion and the moving image of the reference motion, it becomes easy to compare the specific motion satisfying the predetermined condition that the worker executed with the reference motion.
  • In the above aspect, the display unit may display a graph indicating the extracted motion information together with the scene of the moving image in which the worker is executing the motion indicated by the extracted motion information.
  • According to this aspect, the scene in which the specific motion satisfying the predetermined condition was executed can be confirmed from the two different viewpoints of moving image and graph. Therefore, for example, the burden of confirming a specific motion both qualitatively and quantitatively can be reduced.
  • In the above aspect, the display unit may superimpose a plurality of frames included in the scene of the moving image in which the worker is executing the motion indicated by the extracted motion information and display them as a single image.
  • According to this aspect, the whole of the scene in which the specific motion satisfying the predetermined condition was performed can be seen at a glance. Therefore, for example, the entire specific motion can be confirmed in a short time, and the burden of confirmation can be reduced.
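A sketch of one possible superimposition (illustrative only; a per-pixel maximum over grayscale frames is an assumption, not the method prescribed by the specification):

```python
def superimpose_frames(frames):
    """Overlay the frames of an extracted scene into a single image by
    taking the per-pixel maximum, so the whole motion is visible at a glance.
    Each frame is a 2-D list of grayscale pixel values of equal size."""
    result = [row[:] for row in frames[0]]
    for frame in frames[1:]:
        for y, row in enumerate(frame):
            for x, value in enumerate(row):
                if value > result[y][x]:
                    result[y][x] = value
    return result
```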
  • A motion analysis device according to another aspect of the present invention includes: a first acquisition unit that acquires motion information indicating the motion of one or more workers performed in a work area; a second acquisition unit that acquires a moving image including a scene in which a worker is executing a motion; a third acquisition unit that acquires reference motion information indicating a reference motion serving as a comparison reference for the worker's motion; an extraction unit that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition; and an identification unit that uses the moving image to identify a scene in which a worker is executing the motion indicated by the extracted motion information.
  • According to this aspect, from the motion information indicating the worker's motion, the motion information satisfying the predetermined condition in comparison with the reference motion information is extracted, and the scene in which the motion indicated by the extracted motion information is being executed is identified using the moving image, so that a scene in which a specific motion is performed can be easily extracted.
  • A motion analysis method according to another aspect of the present invention includes: acquiring motion information indicating the motion of one or more workers performed in a work area; acquiring a moving image including a scene in which a worker is performing a motion; acquiring reference motion information indicating a reference motion serving as a comparison reference for the worker's motion; comparing the motion information with the reference motion information and extracting motion information satisfying a predetermined condition; and identifying, using the moving image, a scene in which the worker is executing the motion indicated by the extracted motion information.
  • According to this aspect as well, the motion information satisfying the predetermined condition in comparison with the reference motion information is extracted, and the scene in which the motion indicated by the extracted motion information is being executed is identified using the moving image, so that a scene in which a specific motion is performed can be easily extracted.
  • A motion analysis program according to another aspect of the present invention causes an arithmetic device provided in a motion analysis device to function as: a first acquisition unit that acquires motion information indicating the motion of one or more workers performed in a work area; a second acquisition unit that acquires a moving image including a scene in which a worker is executing a motion; a third acquisition unit that acquires reference motion information indicating a reference motion serving as a comparison reference for the worker's motion; an extraction unit that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition; and an identification unit that uses the moving image to identify a scene in which a worker is executing the motion indicated by the extracted motion information.
  • According to this aspect as well, the motion information satisfying the predetermined condition in comparison with the reference motion information is extracted, and the scene in which the motion indicated by the extracted motion information is being executed is identified using the moving image, so that a scene in which a specific motion is performed can be easily extracted.
  • According to the present invention, it is possible to provide a motion analysis system, a motion analysis device, a motion analysis method, and a motion analysis program capable of easily extracting a scene in which a specific motion is performed.
  • Hereinafter, an embodiment according to one aspect of the present invention (hereinafter referred to as “the present embodiment”) will be described with reference to the drawings.
  • Elements given the same symbol in the figures have the same or similar configuration.
  • The motion analysis system 100 includes a measurement unit 30 that measures motion information quantitatively indicating a worker's motion performed in a certain work area R, and a first imaging unit 20a, a second imaging unit 20b, and a third imaging unit 20c that capture moving images in which the workers are performing motions.
  • Although the work area R in this example is an area including the entire manufacturing line, the work area R may be any area; for example, it may be an area where a predetermined process is performed or an area where a predetermined element motion is performed.
  • Here, an element motion is a unit motion performed by the worker and includes, for example, motions such as picking parts, arranging parts, fixing parts, and packing products.
  • The first worker A1 may perform, for example, motions such as picking, arranging, and fixing a first part.
  • The second worker A2 may perform, for example, motions such as picking, arranging, and fixing a second part.
  • The motion analysis system 100 also includes a motion analysis device 10.
  • The motion analysis device 10 includes a storage unit that stores reference motion information quantitatively indicating a reference motion serving as a comparison reference for the worker's motion, an extraction unit that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition, and a display unit 10f that uses the moving images to display a scene in which a worker is performing the motion indicated by the extracted motion information.
  • Here, the motion indicated by the motion information is a motion corresponding to the quantitative information on the worker's motion indicated by the motion information.
  • The storage unit may be separate from the motion analysis device 10 as long as it can communicate with the motion analysis device 10.
  • The reference motion may be a standard motion to be followed by the worker, or may be a mistaken or otherwise non-standard motion of the worker.
  • The predetermined condition may be a condition that the difference between the motion information and the reference motion information is equal to or greater than a threshold; in this case, the extraction unit extracts motion information indicating a non-standard motion.
  • Alternatively, the predetermined condition may be a condition that the difference between the motion information and the reference motion information is equal to or less than a threshold; in this case, the extraction unit extracts motion information indicating a mistaken or non-standard motion.
  • In the following, the case where the reference motion is a standard motion and the predetermined condition is that the difference between the motion information and the reference motion information is equal to or greater than a threshold will be described.
  • Here, a non-standard motion may include a motion that may affect product quality, such as the worker coughing or rubbing his or her nose.
  • The display unit 10f may display the scene of the time zone, within one moving image, during which the worker is executing the motion indicated by the extracted motion information, or may display the scene of the subset of a plurality of moving images in which the worker is executing the motion indicated by the extracted motion information.
  • The motion analysis system 100 measures, by the measurement unit 30, motion information indicating displacements of representative positions of the worker's body, such as displacements of the coordinate values of a plurality of the worker's joints. Motion information obtained when a standard motion to be followed by the worker is executed is stored in advance in the storage unit as reference motion information.
  • The measurement unit 30 measures the worker's motion information, and in parallel with this measurement, the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c capture moving images in which the worker is performing motions.
  • The extraction unit compares the measured motion information with the reference motion information stored in the storage unit and extracts motion information whose difference from the reference motion information is equal to or greater than a threshold.
  • The comparison of the motion information with the reference motion information may be performed, for example, by specifying the start time point and the end time point of motion information indicating a certain motion and finding the difference between the motion information and the reference motion information recorded at the corresponding timings.
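The start/end-point comparison described above could be sketched as follows (illustrative only; the summed Euclidean deviation over the segment and the sample layout are assumptions):

```python
def segment_deviation(motion_ts, reference_ts, start, end):
    """Compare motion and reference time series over the segment
    [start, end] (sample indices) and return the deviation summed
    over corresponding timings."""
    total = 0.0
    for t in range(start, end + 1):
        m, r = motion_ts[t], reference_ts[t]
        total += sum((a - b) ** 2 for a, b in zip(m, r)) ** 0.5
    return total
```

The extraction unit would then keep the segment when `segment_deviation(...)` is at or above the threshold.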
  • Then, the display unit 10f displays the scene of the moving image in which the worker is executing the motion indicated by the extracted motion information, that is, the motion information whose deviation from the reference motion information obtained when the standard motion to be followed was executed is equal to or greater than the threshold.
  • In this way, from the motion information quantitatively indicating the worker's motion, the motion information satisfying the predetermined condition in comparison with the reference motion information is extracted, and the scene in which the worker is executing the motion indicated by the extracted motion information is displayed using the moving images.
  • It is therefore possible to easily extract the scene in which the specific motion satisfying the predetermined condition was performed, regardless of the number or length of the captured moving images. For example, by displaying the scene of the time zone of one moving image in which the motion indicated by the extracted motion information was executed, a scene suitable for confirming the specific motion can be extracted from that moving image.
  • When the reference motion is a standard motion and the predetermined condition is that the difference between the motion information and the reference motion information is equal to or greater than a threshold, the motion analysis system 100 according to the present embodiment can extract moving-image scenes in which non-standard motions were performed. As a result, in the short term, it becomes possible to detect early that a non-standard motion was performed and to reduce, for example, the possibility that defective products are produced on the production line. Even when a non-defective product is manufactured despite a non-standard motion being performed, the system has technical significance in that it can catch signs of a problem and make subsequent confirmation work more efficient. These merits can be obtained in real time when the motion analysis system 100 is used.
  • The motion analysis system 100 includes the first imaging unit 20a, the second imaging unit 20b, the third imaging unit 20c, the measurement unit 30, and the motion analysis device 10.
  • The motion analysis device 10 includes a first acquisition unit 11, a second acquisition unit 12, a third acquisition unit 13, a storage unit 14, an extraction unit 15, an identification unit 16, an estimation unit 17, and a display unit 10f.
  • The first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c may be configured by general-purpose cameras and may capture moving images including scenes in which the first worker A1 and the second worker A2 are executing motions in the work area R.
  • The first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c may each capture a portion of the work area R, that is, a moving image of an area smaller than the work area R. Specifically, they may capture close-up moving images of the motions performed by the first worker A1 and the second worker A2.
  • For example, they may capture moving images in which the hands of the first worker A1 and the second worker A2 are shown in close-up.
  • The first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c may capture a plurality of moving images of a plurality of portions of the work area R.
  • For example, the first imaging unit 20a may mainly capture a moving image in which the first worker A1 is performing a motion, the third imaging unit 20c may mainly capture a moving image in which the second worker A2 is performing a motion, and the second imaging unit 20b may capture both a moving image in which the first worker A1 is performing a motion and a moving image in which the second worker A2 is performing a motion.
  • The first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c may capture moving images in which different processes are performed at a plurality of positions in the work area R.
  • The measurement unit 30 may be configured by motion capture and may measure a plurality of pieces of motion information quantitatively indicating the motions of the first worker A1 and the second worker A2 performed in the work area R.
  • Although the configuration of the measurement unit 30 is arbitrary, for example, pattern light may be projected onto the first worker A1 and the second worker A2, moving images of the two workers with the pattern light projected may be captured, and the coordinate values of a plurality of joints of the first worker A1 and the second worker A2 may be measured based on the captured moving images.
  • The measurement unit 30 may add information identifying the worker to the motion information.
  • The measurement unit 30 may also measure quantities other than the coordinate values of the workers' joints. For example, it may measure the coordinate values of representative positions of the body that are not necessarily joints, such as the fingertips and the head, and coordinate values of joints and of other representative body positions may be measured together. In addition, the measurement unit 30 may measure the coordinate values of the positions of trackers worn by the first worker A1 and the second worker A2; in this case, the representative positions of the body are the tracker positions. The representative positions of the body may also be reworded as the positions of feature points of the body.
  • The motion analysis system 100 may include a plurality of measurement units 30.
  • When the motion information of a plurality of workers is measured by a plurality of measurement units 30, the motion information of the same worker may be measured in duplicate; in that case, information identifying the worker may be added to the motion information, after which duplicates may be removed, or motion information measured by different measurement units 30 may be combined.
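The deduplication and merging step above could be sketched as follows (illustrative only; the record layout with `worker` and `t` keys is an assumption):

```python
def merge_measurements(batches):
    """Merge motion information measured by several measurement units.
    Each record carries a worker id and a timestamp; duplicates (the same
    worker measured at the same time by two units) are removed."""
    merged = {}
    for batch in batches:
        for record in batch:
            key = (record['worker'], record['t'])
            merged.setdefault(key, record)  # keep the first copy seen
    return [merged[k] for k in sorted(merged)]
```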
  • the measurement unit 30 may also serve as a fourth imaging unit that captures a moving image in which the first worker A1 and the second worker A2 are performing an operation.
  • the fourth imaging unit may capture a moving image of the entire work area R. That is, the fourth imaging unit may capture the state in which the first worker A1 and the second worker A2 are performing operations so that both the first worker A1 and the second worker A2 are included.
  • the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c may each capture a moving image of the first worker A1 and the second worker A2 performing operations so that one of the first worker A1 and the second worker A2 is included.
  • the moving image of the work area R captured by the measurement unit 30 corresponds to the "first moving image" of the present invention, and the moving images captured by the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c correspond to the "second moving image" of the present invention.
  • likewise, the measurement unit 30 (fourth imaging unit) of the present embodiment corresponds to the "first imaging unit" of the present invention, and the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c correspond to the "second imaging unit" of the present invention.
  • the first acquisition unit 11 acquires, from the measurement unit 30, operation information that quantitatively indicates the operations of the first worker A1 and the second worker A2 performed in the work area R.
  • the operation information acquired by the first acquisition unit 11 is transmitted to the storage unit 14 and stored as an operation information history 14b.
  • the first acquisition unit 11 may acquire operation information from each of the plurality of measurement units 30, add information identifying from which measurement unit 30 the operation information was acquired, and transmit the operation information to the storage unit 14. A specific example of the operation information will be described in detail later with reference to FIG. 4.
  • the operation information may be, for example, information obtained by measuring the coordinate values of representative positions of the worker's body at one-second intervals.
  • the second acquisition unit 12 acquires moving images of the first worker A1 and the second worker A2 performing operations from the first imaging unit 20a, the second imaging unit 20b, the third imaging unit 20c, and the measurement unit 30 (fourth imaging unit).
  • the moving image acquired by the second acquisition unit 12 is transmitted to the storage unit 14 and stored as a moving image history 14a.
  • the second acquisition unit 12 may transmit the moving image to the storage unit 14 by adding information that identifies the moving image acquired from any of the plurality of imaging units.
  • the third acquisition unit 13 acquires, from the measurement unit 30, reference operation information that quantitatively indicates a reference operation serving as a comparison reference for the operation of the worker.
  • the reference operation information acquired by the third acquisition unit 13 is transmitted to the storage unit 14 and stored as reference operation information 14c.
  • the third acquisition unit 13 may acquire, as reference operation information, operation information that was not extracted by the extraction unit 15 described later from the already stored operation information history 14b, add information indicating that it is a reference operation, and store it in the storage unit 14.
  • the third acquisition unit 13 may acquire, as reference operation information, the operation information specified by the user from the already stored operation information history 14b, add information indicating that it is a reference operation, and store it in the storage unit 14.
  • the reference operation information may be, for example, information obtained by measuring, at one-second intervals, the coordinate values of representative positions of the worker's body while the standard operation that the worker should follow is performed.
  • the storage unit 14 stores at least reference operation information 14c quantitatively indicating a reference operation that serves as a comparison reference for the operation of the worker.
  • the storage unit 14 stores a moving image history 14a, an operation information history 14b, reference operation information 14c, and a correspondence table 14d.
  • the correspondence table 14d is used to estimate the position in the work area R at which the operation indicated by the operation information extracted by the extraction unit 15 was performed.
  • the correspondence table 14d will be described later by showing a specific example.
  • the extraction unit 15 compares the operation information measured by the measurement unit 30 with the reference operation information 14c, and extracts operation information that satisfies a predetermined condition.
  • the operation information and the reference operation information 14c may each include the coordinate values of the worker's joints, and the extraction unit 15 may extract the operation information satisfying the predetermined condition by comparing the coordinate values included in the operation information with the coordinate values included in the reference operation information 14c.
  • the predetermined condition may be a condition that the degree of deviation between the coordinate value included in the operation information and the coordinate value included in the reference operation information 14c is equal to or greater than a threshold.
  • the extraction unit 15 may extract the operation information satisfying the predetermined condition by comparing the operation information measured by the measurement unit 30 with the reference operation information 14c determined for each of a plurality of steps; in this case, the extracted operation information corresponds to one process. Further, the extraction unit 15 may extract the operation information satisfying the predetermined condition by comparing the operation information measured by the measurement unit 30 with the reference operation information 14c determined for each of a plurality of element operations; in this case, the extracted operation information corresponds to one element operation.
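The extraction described above can be sketched in Python as follows. The frame layout (a dict mapping a joint name to an (x, y, z) coordinate in mm) and the use of the largest per-joint distance as the degree of deviation are assumptions for illustration; the embodiment only requires that frames whose deviation from the reference is at or above a threshold be extracted:

```python
import math

def frame_deviation(frame, reference_frame):
    """Largest per-joint Euclidean distance between two frames.

    Each frame maps a joint name to an (x, y, z) coordinate in mm.
    """
    return max(math.dist(frame[j], reference_frame[j]) for j in reference_frame)

def extract(operation_info, reference_info, threshold):
    """Indices of frames satisfying the predetermined condition, i.e. whose
    degree of deviation from the reference frame is >= threshold."""
    return [i for i, (f, r) in enumerate(zip(operation_info, reference_info))
            if frame_deviation(f, r) >= threshold]
```

Running the same comparison against per-process or per-element-operation reference sequences yields extracted spans corresponding to one process or one element operation, as described above.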
  • the identifying unit 16 identifies a scene in which the operator is executing the operation indicated by the extracted operation information, using the moving image history 14a.
  • the identifying unit 16 may compare the time at which the operation information was measured with the time at which the moving image was captured, and identify, in the moving image history 14a, the scene of the time zone in which the operation indicated by the extracted operation information was performed.
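A minimal sketch of this time-based matching follows. The field name `time` and the assumption that measurement times and the video start time share one clock (in seconds) are illustrative, not part of the embodiment:

```python
def scene_window(extracted_rows, video_start):
    """Offsets, in seconds, of the identified scene within a moving image.

    extracted_rows: rows of extracted operation information, each carrying a
    'time' field on the same clock as the video.
    video_start: time at which the moving image started recording.
    Returns (start_offset, end_offset) into the moving image.
    """
    times = [row["time"] for row in extracted_rows]
    return (min(times) - video_start, max(times) - video_start)
```

The identifying unit could then cut the stored moving image between these two offsets to obtain the scene.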
  • the identifying unit 16 may identify the worker for which the extracted operation information is measured, based on the information identifying the worker added to the extracted operation information.
  • the identifying unit 16 may identify the process indicated by the operation information extracted by the extraction unit 15, or may identify the element operation indicated by the operation information extracted by the extraction unit 15.
  • the estimation unit 17 estimates the position in the work area R in which the operation indicated by the operation information extracted by the extraction unit 15 is performed.
  • the estimation unit 17 may estimate the position in the work area R corresponding to the operation information extracted by the extraction unit 15 by referring to the correspondence table 14d stored in the storage unit 14.
  • the storage unit 14 may store a first correspondence table D3 indicating the correspondence between processes and imaging units, a second correspondence table D4 indicating the correspondence between the coordinate values of the worker's joints and imaging units, and a third correspondence table D6 indicating the correspondence between the coordinate values of the worker's joints and element operations, and imaging units.
  • the identifying unit 16 may identify a moving image captured at a position estimated by the estimating unit 17 among the plurality of moving images.
  • the estimation unit 17 may estimate the position in the work area R where the worker is executing the operation indicated by the extracted operation information based on the process indicated by the operation information extracted by the extraction unit 15. In this case, the estimation unit 17 may estimate that position by referring to the first correspondence table D3, in which the correspondence between processes and imaging units is indicated.
  • the estimation unit 17 may estimate the position in the work area R where the worker is executing the operation indicated by the extracted operation information based on the coordinate values included in the operation information extracted by the extraction unit 15.
  • the estimation unit 17 may estimate the position in the work area R where the worker is executing the operation indicated by the extracted operation information by referring to the second correspondence table D4, in which the correspondence between the coordinate values of the worker's joints and imaging units is indicated.
  • the estimation unit 17 may estimate the position in the work area R where the worker is executing the operation indicated by the extracted operation information based on the coordinate values included in the operation information extracted by the extraction unit 15 and the element operation indicated by the extracted operation information.
  • the estimation unit 17 may estimate the position in the work area R where the worker is executing the operation indicated by the extracted operation information by referring to the third correspondence table D6, in which the correspondence between the coordinate values of the worker's joints and element operations, and imaging units is indicated.
  • the motion analysis system 100 may evaluate the worker's fatigue and habituation based on the frequency at which the worker performs non-standard motions.
  • the evaluation of fatigue and habituation may be performed, for example, by evaluating how far the frequency of occurrence of non-standard motions detected for a certain worker deviates from the average frequency of occurrence for all workers.
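One possible realization of this frequency comparison is sketched below. The relative-deviation score and the input format (a mapping from worker ID to that worker's non-standard-motion count over some period) are assumptions for illustration:

```python
def deviation_from_average(non_standard_counts, worker_id):
    """Relative deviation of one worker's non-standard-motion frequency from
    the average over all workers.

    non_standard_counts: dict mapping worker ID to the number of non-standard
    motions detected for that worker over a common period.
    Returns 0.0 for an exactly average worker, positive values for workers
    producing non-standard motions more often than average.
    """
    counts = list(non_standard_counts.values())
    mean = sum(counts) / len(counts)
    return (non_standard_counts[worker_id] - mean) / mean
```

A persistently high score might then trigger the workload adjustment described below, such as slowing the production line.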
  • the motion analysis system 100 may adjust the workload of the worker based on the evaluated fatigue and habituation of the worker. More specifically, an adjustment may be made to slow down the flow speed of the production line as the evaluated fatigue of the worker increases, thereby reducing the workload of the worker and preventing the generation of defective products.
  • the operation analysis device 10 includes a central processing unit (CPU) 10a corresponding to an arithmetic device, a random access memory (RAM) 10b corresponding to the storage unit 14, a read only memory (ROM) 10c corresponding to the storage unit 14, a communication unit 10d, an input unit 10e, and a display unit 10f.
  • Each of these configurations is mutually connected so as to be able to transmit and receive data via a bus.
  • the motion analysis apparatus 10 may be implemented, for example, by a general-purpose computer.
  • the CPU 10a is a control unit that performs control related to the execution of a program stored in the RAM 10b or the ROM 10c, and performs calculation and processing of data.
  • the CPU 10a is an arithmetic unit that executes an operation analysis program for comparing the operation information with the reference operation information to extract the operation information satisfying the predetermined condition, and for identifying a portion of the moving image in which the operation indicated by the extracted operation information is being executed.
  • the CPU 10a receives various input data from the input unit 10e and the communication unit 10d, and displays the calculation result of the input data on the display unit 10f or stores it in the RAM 10b or the ROM 10c.
  • the RAM 10b is a part of the storage unit 14 in which data can be rewritten, and may be composed of, for example, a semiconductor storage element.
  • the RAM 10b stores the operation analysis program executed by the CPU 10a and data such as the moving image history 14a, the operation information history 14b, the reference operation information 14c, and the correspondence table 14d.
  • the ROM 10c is a part of the storage unit 14 from which data can be read, and may be composed of, for example, a semiconductor storage element.
  • the ROM 10c stores, for example, the operation analysis program and data that are not rewritten.
  • the communication unit 10d is an interface that connects the operation analysis device 10 to an external device.
  • the communication unit 10d is connected to the first imaging unit 20a, the second imaging unit 20b, the third imaging unit 20c, and the measurement unit 30, for example, by a LAN (Local Area Network), and may receive moving images from the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c, and receive a moving image and operation information from the measurement unit 30.
  • the communication unit 10d may be connected to the Internet to receive a moving image or receive operation information via the Internet.
  • the communication unit 10d may transmit, to the external device, a portion of the captured moving image that is performing the operation indicated by the operation information extracted by the extraction unit 15 through the Internet.
  • the input unit 10e receives input of data from the user, and may include, for example, a keyboard, a mouse, and a touch panel.
  • the display unit 10f visually displays the calculation result by the CPU 10a, and may be configured of, for example, an LCD (Liquid Crystal Display). An example of the screen displayed on the display unit 10f will be described in detail later.
  • the operation analysis program may be stored in a computer-readable storage medium such as the RAM 10b or the ROM 10c and provided, or may be provided via a communication network connected by the communication unit 10d.
  • the CPU 10a executes the operation analysis program to realize various operations described with reference to FIG. Note that these physical configurations are exemplifications and may not necessarily be independent configurations.
  • the operation analysis apparatus 10 may include an LSI (Large-Scale Integration) in which the CPU 10a, the RAM 10b, and the ROM 10c are integrated.
  • FIG. 4 is a view showing an example of operation information D1 measured by the operation analysis system 100 according to the present embodiment.
  • the figure shows an example in which the coordinate values of joints of a plurality of workers are measured at intervals of one second by the measuring unit 30.
  • the "time" indicates the date and time at which the operation information was measured by the measurement unit 30. For example, "2017/9/1 8:43:21" is shown in the second and sixth lines from the top, representing operation information measured at 8:43:21 on September 1, 2017. Similarly, the third line from the top shows "2017/9/1 8:43:22", representing operation information measured at 8:43:22 on September 1, 2017, and the fourth line from the top shows "2017/9/1 8:43:23", representing operation information measured at 8:43:23 on September 1, 2017. Note that the description "..." shown in the fifth and seventh lines is an ellipsis indicating that a plurality of data are included.
  • the “worker ID” is information for identifying a plurality of workers whose operation information has been measured by the measurement unit 30.
  • the worker ID is "A1" for the second to fourth lines from the top and "A2" for the sixth line from the top. That is, the operation information in the second to fourth lines from the top is operation information measured for the first worker A1, and the operation information in the sixth line from the top is operation information measured for the second worker A2.
  • the “right hand (X coordinate)” indicates the position on the X axis of the coordinate system with the measuring unit 30 as the origin, for the right hand joint of the worker.
  • the unit is arbitrary; in this example, it is mm (millimeters).
  • the X coordinate value of the right hand of the first worker A1 is shown to be "463" at 8:43:21, "533" at 8:43:22, and "483" at 8:43:23. From this, it can be read that the first worker moved the right hand by 7 cm in the X-axis direction. Also, for the second worker A2, the X coordinate value of the right hand is shown to be "416" at 8:43:21.
  • the “right hand (Y coordinate)” indicates the position on the Y axis of the coordinate system with the measuring unit 30 as the origin, for the right hand joint of the worker.
  • the unit is arbitrary; in this example, it is mm (millimeters).
  • the Y coordinate value of the right hand of the first worker A1 is shown to be "574" at 8:43:21, "977" at 8:43:22, and "830" at 8:43:23. From this, it can be read that the first worker moved the right hand by about 40 cm in the Y-axis direction and then returned it by about 15 cm.
  • also, for the second worker A2, the Y coordinate value of the right hand is shown to be "965" at 8:43:21.
  • the "right hand (Z coordinate)" indicates the position on the Z axis of the coordinate system with the measurement unit 30 as the origin, for the right hand joint of the worker.
  • the unit is arbitrary; in this example, it is mm (millimeters).
  • the Z coordinate value of the right hand of the first worker A1 is shown to be "531" at 8:43:21, "341" at 8:43:22, and "624" at 8:43:23. From this, it can be read that the first worker moved the right hand about 19 cm in the minus direction of the Z axis and then about 28 cm in the plus direction of the Z axis. Also, for the second worker A2, the Z coordinate value of the right hand is shown to be "408" at 8:43:21.
  • the “left hand (X coordinate)” indicates the position on the X axis of the coordinate system with the measuring unit 30 as the origin with respect to the left hand joint of the worker.
  • the unit is arbitrary; in this example, it is mm (millimeters).
  • the X coordinate value of the left hand of the first worker A1 is shown to be "327" at 8:43:21, "806" at 8:43:22, and "652" at 8:43:23. From this, it can be read that the first worker moved the left hand by about 48 cm in the X-axis direction and then returned it by about 15 cm.
  • also, for the second worker A2, the X coordinate value of the left hand is shown to be "374" at 8:43:21.
  • although the description of the coordinate values of the other joints included in the operation information D1 is omitted, the operation information D1 may include three-dimensional coordinate values of each joint, such as the wrist, elbow, shoulder, head, waist, knee, and ankle.
  • in this example, the operation information D1 includes the operation information of two workers, but it may include the operation information of three or more workers, and may include operation information of a plurality of workers measured by a plurality of measurement units 30.
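For illustration, the rows of FIG. 4 described above can be modelled as simple records. Only the right-hand coordinates are shown here, and the record class itself is an assumption, not part of the embodiment; the coordinate values are taken from the example:

```python
from dataclasses import dataclass

@dataclass
class OperationRecord:
    time: str          # measurement date and time, e.g. "2017/9/1 8:43:21"
    worker_id: str     # identifies the worker, e.g. "A1"
    right_hand: tuple  # (X, Y, Z) in mm, origin at the measurement unit 30

# Values from the example of FIG. 4:
d1 = [
    OperationRecord("2017/9/1 8:43:21", "A1", (463, 574, 531)),
    OperationRecord("2017/9/1 8:43:22", "A1", (533, 977, 341)),
    OperationRecord("2017/9/1 8:43:23", "A1", (483, 830, 624)),
    OperationRecord("2017/9/1 8:43:21", "A2", (416, 965, 408)),
]

# The 7 cm movement of the first worker's right hand along the X axis:
dx_mm = d1[1].right_hand[0] - d1[0].right_hand[0]  # 533 - 463 = 70 mm
```

A full record would carry one such coordinate triple per measured joint or representative body position.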
  • FIG. 5 is a diagram showing an example of the first reference operation information D2 stored by the operation analysis system 100 according to the present embodiment.
  • reference operation information defined for each of a plurality of steps is shown; the example records, at one-second intervals, the coordinate values of the joints when an operator performs the standard operation.
  • the "elapsed time from start” indicates the elapsed time from the start to the end of one process in seconds.
  • “00:00:00” is shown in the second line from the top, which indicates that the line indicates reference operation information at the start of the process.
  • the third line from the top indicates “00:00:01”, which indicates that it is reference operation information one second after the start of the process.
  • the fifth line from the top shows “00:00:00”, which indicates that it is a line indicating reference operation information at the start of a different process.
  • the sixth line from the top indicates “00:00:01”, which indicates that it is reference operation information one second after the start of the process. Note that the description “...” Shown in the fourth and seventh lines is an ellipsis indicating that a plurality of data are included.
  • the "step" is information indicating to which step the reference operation information relates. For example, the second and third lines from the top show "1. Assembly", indicating that the reference operation information of these lines relates to the assembly process. Similarly, the fifth and sixth lines from the top show "5. Packing", indicating that the reference operation information of these lines relates to the packing process.
  • the “right hand (X coordinate)” indicates the position on the X axis of the coordinate system with the measuring unit 30 as the origin, for the right hand joint of the worker.
  • the unit is arbitrary; in this example, it is mm (millimeters).
  • the standard right hand X coordinate value is shown to be “463" at the start and "533" after one second for the assembly process. Also, for the packaging process, it is shown that the standard right hand X coordinate value is “416” at the start and “796” after one second.
  • the “right hand (Y coordinate)” indicates the position on the Y axis of the coordinate system with the measuring unit 30 as the origin, for the right hand joint of the worker.
  • the unit is arbitrary; in this example, it is mm (millimeters).
  • the standard right hand Y coordinate value is shown to be "574" at the beginning and "977" after one second for the assembly process. Also, for the packaging process, it is shown that the standard right hand Y coordinate value is "965" at the beginning and "595" after one second.
  • the "right hand (Z coordinate)" indicates the position on the Z axis of the coordinate system with the measurement unit 30 as the origin, for the right hand joint of the worker.
  • the unit is arbitrary; in this example, it is mm (millimeters).
  • the standard right-hand Z coordinate value is shown to be "531" at the beginning and "341" after one second for the assembly process. Also, for the packaging process, the standard right-hand Z coordinate value is shown to be "408" at the start and "949" after one second.
  • the “left hand (X coordinate)” indicates the position on the X axis of the coordinate system with the measuring unit 30 as the origin with respect to the left hand joint of the worker.
  • the unit is arbitrary; in this example, it is mm (millimeters).
  • the standard left hand X coordinate value is shown to be "327" at the beginning and "806" after one second for the assembly process. Also, for the packaging process, it is shown that the standard left hand X coordinate value is "374" at the start and "549" after one second.
  • although the description of the coordinate values of the other joints included in the first reference motion information D2 is omitted, the first reference motion information D2 may include three-dimensional coordinate values of each joint, such as the wrist, elbow, shoulder, head, hip, knee, and ankle, and of representative positions of the body.
  • the first reference operation information D2 includes reference operation information of two steps, but may include reference operation information of three or more steps.
  • FIG. 6 is a diagram showing an example of the first correspondence table D3 stored by the motion analysis system 100 according to the present embodiment. The figure shows an example in which correspondences between a plurality of processes and a plurality of imaging units are recorded.
  • in the first correspondence table D3 of this example, items such as "process" and "imaging unit" are shown.
  • correspondences between a plurality of processes and a plurality of imaging units are shown in one-to-one correspondence.
  • for example, the imaging unit corresponding to the process "1. Assembly" is the "first imaging unit" (the first imaging unit 20a), and the imaging unit corresponding to the process "5. Packing" is the "third imaging unit" (the third imaging unit 20c).
  • the first correspondence table D3 may indicate the correspondence between a plurality of processes and a plurality of imaging units in a one-to-many manner.
  • two or more imaging units may correspond to the assembly process.
  • for example, two or more imaging units that capture the assembly process at different angles and different distances may be indicated.
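A one-to-many version of the first correspondence table D3 can be sketched as a simple mapping. The entries mirror the example of FIG. 6; the function name is illustrative:

```python
# First correspondence table D3 as a process -> imaging-unit mapping.
# A one-to-many correspondence simply lists several units per process.
d3 = {
    "1. Assembly": ["first imaging unit"],
    "5. Packing": ["third imaging unit"],
}

def imaging_units_for(process):
    """Imaging units filming the given process (empty list if unknown)."""
    return d3.get(process, [])
```

Adding a second unit to the "1. Assembly" list would represent the assembly process being filmed from two angles or distances, as described above.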
  • FIG. 7 is a flowchart of a first example of the process of extracting operation information performed by the operation analysis system 100 according to the present embodiment.
  • the first example of the process of extracting the operation information is a process of comparing the operation information D1 and the first reference operation information D2 and extracting the operation information satisfying the predetermined condition.
  • the motion analysis system 100 first calculates the difference between the coordinate values of each row included in the motion information D1 and the start coordinate values included in the reference motion information of a predetermined step of the first reference motion information D2 (S10).
  • the start coordinate value is a coordinate value of each joint when “the elapsed time from the start” is “00:00:00”.
  • the motion analysis system 100 specifies the row of the motion information D1 for which the difference between the coordinate values is equal to or less than the threshold as the start row of the predetermined process (S11).
  • whether or not the difference between coordinate values is equal to or less than the threshold may be determined based on whether the absolute value of the difference between the coordinate values is equal to or less than the threshold. Such processing is necessary because it is not clear in advance which process each row of the motion information D1 corresponds to.
  • the motion analysis system 100 then calculates the difference between the coordinate values of each row included in the motion information D1 and the end coordinate values included in the reference motion information of the predetermined process in the first reference motion information D2 (S12).
  • the end coordinate value is the coordinate value of each joint shown in the last line of the predetermined process.
  • the motion analysis system 100 specifies the row of the motion information for which the difference between the coordinate values is equal to or less than the threshold as the end row of the predetermined process (S13).
  • whether or not the difference between coordinate values is equal to or less than the threshold may be determined based on whether the absolute value of the difference between coordinate values is equal to or less than the threshold.
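Steps S10 to S13 above can be sketched as follows. The row layout (a dict mapping each joint name to an (x, y, z) tuple) and the per-coordinate absolute-difference test are assumptions for illustration:

```python
def find_row(rows, target, threshold):
    """Index of the first row all of whose joint coordinates lie within
    `threshold` mm (absolute difference) of the target frame, or None.

    rows: motion-information rows, each mapping a joint name to (x, y, z).
    target: start or end coordinate values from the reference motion
    information of a predetermined process.
    """
    for i, row in enumerate(rows):
        if all(abs(row[j][k] - target[j][k]) <= threshold
               for j in target for k in range(3)):
            return i
    return None
```

The start row would be found by passing the start coordinate values of the reference motion information as `target`, and the end row by searching from the start row onward with the end coordinate values.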
  • the motion analysis system 100 cuts out the rows from the identified start row to the identified end row among the plurality of rows included in the motion information D1 (S14). At this time, the number of cut-out rows does not necessarily match the number of rows of the reference motion information of the predetermined process in the first reference motion information D2. Therefore, the motion analysis system 100 performs interpolation or thinning so that the number of rows of the motion information cut out from the motion information D1 matches the number of rows of the reference motion information of the predetermined process in the first reference motion information D2 (S15).
  • when the number of rows of the motion information cut out from the motion information D1 is smaller than the number of rows of the reference motion information of the predetermined step in the first reference motion information D2, interpolation may be performed; when it is larger, the number of rows may be reduced by thinning.
  • the method of interpolation or thinning is arbitrary. When interpolating, for example, rows may be interpolated using the average value of the preceding and following rows; when thinning, rows whose coordinate values change relatively little may be thinned out preferentially.
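The row-count matching of step S15 can be illustrated with a simple linear resampling, applied here to one coordinate sequence. This is a stand-in for the averaging/thinning approach described above, chosen because one function covers both the interpolation and the thinning case:

```python
def resample(values, n):
    """Interpolate or thin a sequence of floats so its length becomes n.

    A linear-resampling sketch of step S15: when n exceeds len(values),
    intermediate values are interpolated; when n is smaller, rows are
    effectively thinned out.
    """
    if n == 1:
        return [values[0]]
    m = len(values)
    out = []
    for i in range(n):
        pos = i * (m - 1) / (n - 1)      # fractional index into `values`
        lo = int(pos)
        hi = min(lo + 1, m - 1)
        frac = pos - lo
        out.append(values[lo] * (1 - frac) + values[hi] * frac)
    return out
```

In practice this would be applied to each joint coordinate column of the cut-out motion information so that it lines up row-for-row with the reference motion information.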
  • the motion analysis system 100 calculates the degree of deviation between the coordinate value of the motion information interpolated or thinned out and the coordinate value of the reference motion information of the predetermined process (S16).
  • the degree of deviation is a numerical value indicating how much the operation of the operator deviates from the standard operation to be followed by the operator.
  • the degree of divergence may be calculated for each joint, or may be calculated in a unit (such as the right hand) in which a plurality of joints are combined.
  • the degree of deviation for each joint may be calculated, for example, as the distance between the coordinate value (x, y, z) of the motion information and the coordinate value (X, Y, Z) of the reference motion information of the predetermined process, for the three-dimensional coordinate value of that joint at a certain time.
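One way to compute a per-joint degree of divergence, assuming it is taken as the Euclidean distance between the measured coordinate (x, y, z) and the reference coordinate (X, Y, Z) at the same instant, is:

```python
import math

def joint_divergence(measured, reference):
    """Euclidean distance between a measured joint coordinate (x, y, z) and
    the corresponding reference coordinate (X, Y, Z) at the same instant."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(measured, reference)))
```

Summing or taking the maximum of this value over the joints of the right hand, say, would give a divergence for a combined unit as mentioned above.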
  • when the degree of divergence is equal to or greater than a threshold, the motion analysis system 100 extracts the motion information from the identified start row to the end row (S18).
  • the operation information to be extracted may be operation information in a state where interpolation or thinning is not performed.
  • the threshold may also be set for each joint. The motion information may be extracted when any one of the degrees of divergence calculated for the joints is equal to or greater than the threshold, or when the degrees of divergence calculated for two or more joints are equal to or greater than the threshold.
  • in the above example, the start row of the predetermined process is identified by calculating the difference between the coordinate values of each row included in the operation information D1 and the start coordinate values included in the reference operation information of the predetermined process of the first reference operation information D2, and the end row of the predetermined process is identified by calculating the difference between the coordinate values of each row included in the operation information D1 and the end coordinate values included in the reference operation information of the predetermined step in the first reference operation information D2; however, the motion analysis system 100 may cut out motion information by other methods.
  • for example, the motion analysis system 100 may cut out, as the section of motion information corresponding to a predetermined process, the section of motion information in which the coordinate value representing the center of the worker's body is contained in a predetermined range.
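The alternative cut-out criterion just described can be sketched as follows, using a one-dimensional body-centre coordinate for brevity (the range test and return convention are assumptions for illustration):

```python
def cut_by_center(centers, lo, hi):
    """(start, end) indices of the first contiguous run of rows whose
    body-centre coordinate lies within [lo, hi], or None if no row does.

    centers: body-centre coordinate (e.g. X, in mm) for each row of the
    motion information.
    """
    start = None
    for i, c in enumerate(centers):
        inside = lo <= c <= hi
        if inside and start is None:
            start = i                      # run begins
        elif not inside and start is not None:
            return (start, i - 1)          # run ends
    return (start, len(centers) - 1) if start is not None else None
```

The rows between the returned indices would then be cut out as the section of motion information corresponding to the process performed in that region of the work area.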
  • FIG. 8 is a flowchart of a first example of the moving image display process executed by the behavior analysis system 100 according to the present embodiment.
  • the first example of the moving image display process is an example of the process (S19) of displaying, after the operation information has been extracted in the first example of the operation information extraction process, the portion of the moving image in which the operation indicated by the extracted operation information is being executed.
  • the motion analysis system 100 refers to the first correspondence table D3 to estimate the position at which the process indicated by the extracted motion information is performed (S190). For example, when the process indicated by the extracted operation information is the packing process, "5. Packing" is identified in the "process" column of the first correspondence table D3, and the position at which the process is performed is estimated from the corresponding "third imaging unit" (the third imaging unit 20c).
  • Here, the position may be estimated based on, for example, the correspondence between the plurality of imaging units and their installation positions in the work area R, or based on the specifications of the imaging units.
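The table lookup in S190 amounts to mapping a process name to the imaging unit associated with it. The following is a minimal sketch under stated assumptions: the dictionary `first_correspondence` is a hypothetical miniature of the first correspondence table D3, and only two entries are shown:

```python
# Hypothetical miniature of the first correspondence table D3:
# process name -> imaging unit covering the position of that process.
first_correspondence = {
    "1. part pick": "imaging unit 20a",
    "5. packing": "imaging unit 20c",
}

def estimate_position(process):
    """Estimate where a process is performed by looking up the
    imaging unit associated with it in the correspondence table (S190)."""
    return first_correspondence[process]

unit = estimate_position("5. packing")
```

The moving image captured by the returned unit is then the second moving image to display for that process.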
  • Next, the motion analysis system 100 displays, for the first moving image, which is a moving image of the entire work area R captured by the measurement unit 30 (fourth imaging unit), and for the second moving image, which is captured at the estimated position and covers an area narrower than the work area R, the portion of the time zone in which the operation indicated by the extracted operation information is being executed (S191).
  • The portion of the time zone in which the operation indicated by the extracted operation information is being executed is specified by cutting out the first moving image and the second moving image based on the times of the start row and the end row of the extracted operation information.
  • That is, the motion analysis system 100 cuts out the first moving image and the second moving image based on the times of the start row and the end row of the extracted operation information, and displays them on the display unit 10f.
  • Thereafter, the motion analysis system 100 displays the process indicated by the extracted operation information (S192), and displays the identification information of the worker whose extracted operation information was measured (S193).
  • For example, when the process indicated by the extracted operation information is the packing process and the worker who executed the process is the first worker A1, the motion analysis system 100 displays on the display unit 10f, together with the cut-out first moving image and second moving image, that the content of the moving images is the packing process and that the worker who executed the process is the first worker A1.
  • The process may be displayed using any information that can distinguish it among the plurality of processes. For example, serial numbers may be assigned to the processes, and the number "5" may be displayed to represent the packing process.
  • Similarly, the worker may be displayed using any information that can distinguish the worker among the plurality of workers; for example, worker IDs may be assigned, and the ID "A1" may be displayed to represent the first worker A1.
  • With this, the first example of the moving image display process ends.
  • FIG. 9 shows an example of the screen DP displayed by the motion analysis system 100 according to the present embodiment.
  • The screen DP displays, for the first moving image and for the second moving image captured at the estimated position, the scene of the time zone in which the worker is executing the operation indicated by the extracted operation information (S191).
  • The screen DP includes a summary DP1, an entire moving image DP2, and a hand moving image DP3.
  • The summary DP1 is information indicating an outline of the extracted operation information.
  • In this example, the operation indicated by the extracted operation information is a "non-standard operation"; the start time is "2017/7/7 10:10:44.138", that is, 10:10:44.138 a.m. on July 7, 2017; the end time is "2017/7/7 10:12:44.435", that is, 10:12:44.435 a.m. on July 7, 2017; the process indicated by the extracted operation information is the "packaging process"; and the required time is "5.5 s" (5.5 seconds).
  • The summary DP1 may indicate, instead of the name of the process indicated by the extracted operation information, other information identifying that process. For example, an ID such as a serial number assigned to each process may be indicated.
  • The entire moving image DP2 is the first moving image obtained by photographing the entire work area R with the measurement unit 30 (fourth imaging unit); the start time "2017/7/7 10:10:44.138" is shown at the lower right of the moving image screen.
  • The entire moving image DP2 makes it possible to grasp at a glance how the plurality of workers are performing their operations. Further, in the entire moving image DP2, the positions of the joints of the plurality of workers detected by the measurement unit 30 are indicated by a skeleton model. This makes it possible to confirm that the joint coordinates of the plurality of workers are being measured at appropriate positions.
  • The hand moving image DP3 is the second moving image obtained by capturing an area narrower than the work area R with the third imaging unit 20c; the start time "2017/7/7 10:10:44.138" is shown at the lower right of the moving image screen.
  • In addition, "packaging process" and "worker ID: A1" are displayed on the moving image screen, that is, information identifying the process indicated by the extracted operation information and identification information of the worker whose extracted operation information was measured. The details of the operation actually performed by the worker can be confirmed in the hand moving image DP3, in which the worker's hand is shown in close-up.
  • According to the motion analysis system 100 of the present embodiment, operation information satisfying a predetermined condition in comparison with the reference operation information is extracted from the plurality of pieces of operation information quantitatively indicating the motions of the plurality of workers, and by displaying, for the first moving image of the entire work area R, the scene of the time zone in which the worker is executing the operation indicated by the extracted operation information, the entirety of the specific operation satisfying the predetermined condition can be confirmed.
  • Furthermore, by displaying, for the second moving image obtained by capturing an area narrower than the work area R, the scene of the time zone in which the worker is executing the operation indicated by the extracted operation information, the details of the specific operation satisfying the predetermined condition can be confirmed.
  • In addition, the position at which the specific operation satisfying the predetermined condition is performed can be estimated, and by displaying the second moving image captured at the estimated position among the plurality of second moving images captured at a plurality of positions in the work area R, the details of the operation performed at the estimated position can be confirmed.
  • Further, the operation information quantitatively indicating the worker's motion is extracted in comparison with the reference operation information defined for each of the plurality of processes, and by displaying information identifying the process indicated by the extracted operation information, it is possible to confirm in which process the specific operation satisfying the predetermined condition was performed. In the case of this example, it can be confirmed that the non-standard operation was performed in the packing process.
  • Moreover, the second moving image captured at the position at which the process indicated by the extracted operation information is performed can be displayed, so that the details of the operation performed in that process can be confirmed.
  • FIG. 10 is a diagram showing an example of the second correspondence table D4 stored by the motion analysis system 100 according to the present embodiment. The figure shows an example in which the correspondence between the coordinate ranges of a plurality of joints and the plurality of imaging units is recorded.
  • In the second correspondence table D4 of this example, items such as "right hand range (X coordinate)", "right hand range (Y coordinate)", and "imaging unit" are shown as an example.
  • The correspondence between the ranges of the three-dimensional coordinates of the plurality of joints and the plurality of imaging units is shown one-to-one; for example, it is shown that, for a certain range of right-hand coordinates, the corresponding imaging unit is the "first imaging unit" (first imaging unit 20a).
  • Although the description of the coordinate ranges of the other joints is omitted, one row of the second correspondence table D4 may include ranges of three-dimensional coordinate values for each joint, such as the wrist, elbow, shoulder, head, waist, knee, and ankle.
  • The second correspondence table D4 may also indicate the correspondence between the ranges of the three-dimensional coordinates of the plurality of joints and the plurality of imaging units one-to-many; that is, two or more imaging units may correspond to one range of three-dimensional coordinates of the plurality of joints. In that case, the motion may be shown from different angles and different distances.
  • FIG. 11 is a flowchart of a second example of the moving image display process executed by the motion analysis system 100 according to the present embodiment.
  • The second example of the moving image display process is another example of the process (S19) of displaying, after the operation information is extracted in the first example of the operation information extraction process, the portion of the moving image in which the operation indicated by the extracted operation information is being executed.
  • First, the motion analysis system 100 refers to the second correspondence table D4 and searches for the row of the second correspondence table D4 that includes the largest number of the coordinate values included in the extracted motion information (S195). For example, when the X coordinates of the right hand included in the extracted motion information fall in the range of 550 to 650, the "right hand range (X coordinate)" shown in the second row from the top of the second correspondence table D4 is identified as the row including the largest number of the coordinate values included in the extracted motion information. Similarly, for the Y and Z coordinates of the right hand and the three-dimensional coordinates of the other joints, the row with the greatest overlap with the coordinate ranges shown in the second correspondence table D4 is searched for.
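The search in S195 can be sketched as counting, for each table row, how many of the extracted coordinate samples fall within that row's ranges, and keeping the row with the highest count. This is an illustrative sketch, not the patent's implementation; the table contents and the names `search_best_row`, `table`, and `samples` are hypothetical:

```python
def search_best_row(table, samples):
    """Return the index of the table row whose coordinate ranges
    contain the largest number of the given coordinate samples (S195)."""
    best_idx, best_count = None, -1
    for idx, row in enumerate(table):
        count = 0
        for joint, value in samples:
            lo, hi = row["ranges"][joint]
            if lo <= value <= hi:
                count += 1
        if count > best_count:
            best_idx, best_count = idx, count
    return best_idx

# Hypothetical two-row correspondence table and right-hand X samples.
table = [
    {"unit": "first imaging unit 20a", "ranges": {"right_x": (600, 700)}},
    {"unit": "second imaging unit 20b", "ranges": {"right_x": (400, 550)}},
]
samples = [("right_x", 620), ("right_x", 650), ("right_x", 500)]
best = search_best_row(table, samples)  # row 0 contains two of the three samples
```

The imaging unit recorded in the winning row then gives the estimated position of the motion.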
  • Next, the motion analysis system 100 refers to the searched row to estimate the position at which the motion indicated by the extracted motion information was executed (S196).
  • For example, when the searched row is the second row from the top of the second correspondence table D4, the position at which the motion was executed is estimated from the corresponding imaging unit, the "first imaging unit 20a".
  • Here, the position may be estimated based on, for example, the correspondence between the plurality of imaging units and their installation positions in the work area R, or based on the specifications of the imaging units.
  • Thereafter, the motion analysis system 100 displays, for the first moving image, which is a moving image of the entire work area R captured by the measurement unit 30 (fourth imaging unit), and for the second moving image, which is captured at the estimated position and covers an area narrower than the work area R, the portion of the time zone in which the operation indicated by the extracted operation information is being executed (S197).
  • The portion of the time zone in which the operation indicated by the extracted operation information is being executed is specified by cutting out the first moving image and the second moving image based on the times of the start row and the end row of the extracted operation information.
  • Finally, the motion analysis system 100 displays the identification information of the worker whose extracted motion information was measured (S198). For example, when the worker who executed the operation indicated by the extracted operation information is the first worker A1, the motion analysis system 100 displays on the display unit 10f, together with the cut-out first moving image and second moving image, that the worker who executed the operation is the first worker A1.
  • With this, the second example of the moving image display process ends.
  • In this way, by estimating, based on the coordinate values of the joints included in the extracted motion information, the position in the work area at which the worker is executing the motion indicated by the motion information, the position at which the motion indicated by the extracted motion information was performed can be accurately estimated.
  • FIG. 12 is a diagram showing an example of the second reference motion information D5 stored by the motion analysis system 100 according to the present embodiment.
  • The figure shows reference motion information defined for each of a plurality of element motions, in an example in which the coordinate values of the joints when a worker performs a standard motion are recorded at one-second intervals.
  • the "elapsed time from start” indicates the elapsed time from the start to the end of one process in seconds.
  • the second line from the top is indicated as "00:00:00”, which represents a line indicating reference operation information at the start of an element operation.
  • the third line from the top indicates “00:00:01”, which indicates that it is reference operation information one second after the start of the element operation.
  • the fifth line from the top indicates “00:00:00”, which represents a line indicating reference operation information at the start of the different element operation.
  • the sixth line from the top indicates "00:00:01”, which indicates that it is reference operation information one second after the start of the element operation. Note that the description “...” Shown in the fourth and seventh lines is an ellipsis indicating that a plurality of data are included.
  • The "element motion" is information indicating to which element motion the reference motion information relates. For example, the second and third lines from the top indicate "part pick", showing that the reference motion information in these lines relates to the picking of parts. Similarly, the fifth and sixth lines from the top indicate "arrangement", showing that the reference motion information in these lines relates to the arrangement of parts.
  • The "right hand (X coordinate)" indicates the position of the worker's right-hand joint on the X axis of a coordinate system whose origin is the start point. The unit is arbitrary; in this example it is mm (millimeters).
  • For the element motion of the part pick, the standard right-hand X coordinate value is shown to be "0" at the start and "533" one second later. Also, for the element motion of the arrangement, the standard right-hand X coordinate value is "0" at the start and "796" one second later.
  • The "right hand (Y coordinate)" indicates the position of the worker's right-hand joint on the Y axis of a coordinate system whose origin is the start point. The unit is arbitrary; in this example it is mm (millimeters).
  • For the element motion of the part pick, the standard right-hand Y coordinate value is shown to be "0" at the start and "977" one second later. Also, for the element motion of the arrangement, the standard right-hand Y coordinate value is "0" at the start and "595" one second later.
  • The "right hand (Z coordinate)" indicates the position of the worker's right-hand joint on the Z axis of a coordinate system whose origin is the start point. The unit is arbitrary; in this example it is mm (millimeters).
  • For the element motion of the part pick, the standard right-hand Z coordinate value is shown to be "0" at the start and "341" one second later. Also, for the element motion of the arrangement, the standard right-hand Z coordinate value is "0" at the start and "949" one second later.
  • The "left hand (X coordinate)" indicates the position of the worker's left-hand joint on the X axis of a coordinate system whose origin is the start point. The unit is arbitrary; in this example it is mm (millimeters).
  • For the element motion of the part pick, the standard left-hand X coordinate value is shown to be "0" at the start and "806" one second later. Also, for the element motion of the arrangement, the standard left-hand X coordinate value is "0" at the start and "549" one second later.
  • Although the description of the coordinate values of the other joints included in the second reference motion information D5 is omitted, the second reference motion information D5 may include three-dimensional coordinate values for each joint, such as the wrist, elbow, shoulder, head, hip, knee, and ankle. Also, in the present example, the second reference motion information D5 includes reference motion information for two element motions, but it may include reference motion information for three or more element motions.
  • FIG. 13 is a flowchart of a second example of the operation information extraction process performed by the motion analysis system 100 according to the present embodiment.
  • The second example of the operation information extraction process compares the operation information D1 with the second reference operation information D5 and extracts operation information satisfying a predetermined condition.
  • First, the motion analysis system 100 calculates the velocity of each joint based on adjacent coordinate values included in the motion information D1 (S40). For example, in the case of coordinate values measured at one-second intervals, the joint velocity may be calculated by dividing the difference between the coordinate values of adjacent rows by one second.
  • Then, the motion analysis system 100 identifies rows in which the velocity of any joint is less than or equal to a threshold as rows at which the element motion changes (S41). For example, when transitioning from the element motion of picking a part to the element motion of placing the part, the velocity of the hand joint becomes almost zero at the moment the worker finishes picking the part and moves on to placing it; therefore, a row in which the velocity of any joint is at or below the threshold can be identified as a row at which the element motion changes.
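Steps S40 and S41 can be sketched together for a single joint coordinate. This is an illustrative sketch under stated assumptions (one-dimensional coordinates sampled at fixed intervals; the names `joint_speeds` and `change_rows` and the sample data are hypothetical):

```python
def joint_speeds(coords, dt=1.0):
    """Speed of one joint from coordinates sampled every dt seconds (S40):
    the difference between adjacent rows divided by the interval."""
    return [abs(b - a) / dt for a, b in zip(coords, coords[1:])]

def change_rows(coords, threshold, dt=1.0):
    """Indices of rows where the joint speed is at or below the
    threshold, i.e. candidate element-motion boundaries (S41)."""
    return [i + 1 for i, v in enumerate(joint_speeds(coords, dt))
            if v <= threshold]

# Hypothetical right-hand X coordinates (mm) at 1 s intervals: the hand
# moves, pauses near row 3 (a motion change), then moves again.
coords = [0, 533, 796, 798, 1100]
boundaries = change_rows(coords, threshold=10)  # -> [3]
```

In the full system, the velocities of all joints would be checked and a row would qualify as a boundary when any joint's velocity drops to the threshold or below.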
  • After identifying all the rows at which the element motion changes in the motion information D1, the motion analysis system 100 cuts out from the motion information D1 the rows from the start row to the end row of one element motion (S42). At this time, the number of rows cut out does not necessarily match the number of rows of the reference motion information of the predetermined element motion in the second reference motion information D5. Therefore, the motion analysis system 100 performs interpolation or thinning so that the number of rows of the motion information cut out from the motion information D1 matches the number of rows of the reference motion information of the predetermined element motion in the second reference motion information D5 (S43).
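The interpolation or thinning in S43 can be sketched as a linear resampling of one joint's coordinate sequence to the row count of the reference motion information. This is an illustrative sketch only; the patent does not specify the interpolation method, and the function name `resample` is hypothetical:

```python
def resample(values, target_len):
    """Linearly interpolate or thin a sequence of coordinate values so
    its length matches the reference motion information (S43)."""
    if target_len == 1:
        return [values[0]]
    n = len(values)
    out = []
    for i in range(target_len):
        pos = i * (n - 1) / (target_len - 1)  # fractional source index
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(values[lo] * (1 - frac) + values[hi] * frac)
    return out

# Thin 5 samples down to 3 rows, and stretch 2 samples up to 3 rows.
thinned = resample([0, 10, 20, 30, 40], 3)
stretched = resample([0, 10], 3)
```

After resampling, each row of the cut-out motion information lines up with one row of the reference motion information, so the two can be compared row by row.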
  • Thereafter, the motion analysis system 100 calculates the degree of divergence between the coordinate values of the interpolated or thinned motion information and the coordinate values of the reference motion information of the predetermined element motion (S44).
  • The degree of divergence may be calculated for each joint. For example, for the three-dimensional coordinate value of a joint at a certain time, the difference between the coordinate value (x, y, z) of the motion information and the coordinate value (X, Y, Z) of the reference motion information may be squared and the square root taken to obtain the value ((x - X)^2 + (y - Y)^2 + (z - Z)^2)^(1/2), and the degree of divergence may be calculated by summing this value over all times. However, the degree of divergence may be calculated using other values.
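The divergence formula above, the Euclidean distance ((x - X)^2 + (y - Y)^2 + (z - Z)^2)^(1/2) summed over all times, can be written directly. This is an illustrative sketch; the trajectories shown are hypothetical sample data:

```python
import math

def divergence(motion, reference):
    """Degree of divergence between a measured joint trajectory and the
    reference trajectory: the per-time Euclidean distance
    ((x-X)^2 + (y-Y)^2 + (z-Z)^2)^(1/2), summed over all times (S44)."""
    total = 0.0
    for (x, y, z), (X, Y, Z) in zip(motion, reference):
        total += math.sqrt((x - X) ** 2 + (y - Y) ** 2 + (z - Z) ** 2)
    return total

# Hypothetical right-hand trajectory vs. reference, two time steps (mm).
motion = [(0, 0, 0), (536, 981, 341)]
reference = [(0, 0, 0), (533, 977, 341)]
d = divergence(motion, reference)  # sqrt(3^2 + 4^2 + 0^2) = 5.0
```

A threshold would then be applied to this per-joint total to decide whether the cut-out motion information should be extracted.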
  • When the calculated degree of divergence satisfies the predetermined condition, the motion analysis system 100 extracts the motion information from the identified start row to the end row (S46).
  • Here, the motion information to be extracted may be the motion information in a state before interpolation or thinning is performed.
  • The threshold may also be set for each joint, and the motion information may be extracted when any one of the degrees of divergence calculated for each joint is equal to or greater than the threshold, or when the degrees of divergence calculated for two or more joints are equal to or greater than the threshold.
  • After the motion analysis system 100 extracts the motion information, it displays the portion of the moving image in which the motion indicated by the extracted motion information is being executed (S47). The moving image display process will be described in detail with reference to the following figures.
  • Finally, the motion analysis system 100 determines whether the analysis of all the element motions included in the second reference motion information D5 is completed (S48), and when the analysis of all the element motions is completed (S48: YES), the second example of the operation information extraction process ends.
  • FIG. 14 is a flowchart of a third example of the moving image display process executed by the motion analysis system 100 according to the present embodiment.
  • The third example of the moving image display process is one example of the process (S47) of displaying, after the operation information is extracted in the second example of the operation information extraction process, the portion of the moving image in which the operation indicated by the extracted operation information is being executed.
  • First, the motion analysis system 100 refers to the second correspondence table D4 and searches for the row of the second correspondence table D4 that includes the largest number of the coordinate values included in the extracted motion information (S470). For example, when the X coordinates of the right hand included in the extracted motion information fall in the range of 550 to 650, the "right hand range (X coordinate)" shown in the second row from the top of the second correspondence table D4 is identified as the row including the largest number of the coordinate values included in the extracted motion information. Similarly, for the Y and Z coordinates of the right hand and the three-dimensional coordinates of the other joints, the row with the greatest overlap with the coordinate ranges shown in the second correspondence table D4 is searched for.
  • Next, the motion analysis system 100 refers to the searched row to estimate the position at which the motion indicated by the extracted motion information was executed (S471). For example, when the searched row is the second row from the top of the second correspondence table D4, the position at which the motion was executed is estimated from the corresponding imaging unit, the "first imaging unit 20a".
  • Here, the position may be estimated based on, for example, the correspondence between the plurality of imaging units and their installation positions in the work area R, or based on the specifications of the imaging units.
  • Thereafter, the motion analysis system 100 displays, for the first moving image, which is a moving image of the entire work area R captured by the measurement unit 30 (fourth imaging unit), and for the second moving image, which is captured at the estimated position and covers an area narrower than the work area R, the portion of the time zone in which the operation indicated by the extracted operation information is being executed (S472).
  • The portion of the time zone in which the operation indicated by the extracted operation information is being executed is specified by cutting out the first moving image and the second moving image based on the times of the start row and the end row of the extracted operation information.
  • Thereafter, the motion analysis system 100 displays the element motion indicated by the extracted motion information (S473), and displays the identification information of the worker whose extracted motion information was measured (S474). For example, if the element motion indicated by the extracted motion information is a part pick and the worker who executed the element motion is the first worker A1, the motion analysis system 100 displays on the display unit 10f, together with the cut-out first moving image and second moving image, that the content of the moving images is a part pick and that the worker who executed the element motion is the first worker A1. Note that the element motion may be displayed using any information that can distinguish it among the plurality of element motions, and the worker may be displayed using any information that can distinguish the worker among the plurality of workers. With this, the third example of the moving image display process ends.
  • In this way, by estimating, based on the coordinate values of the joints included in the extracted motion information and the element motion indicated by the extracted motion information, the position in the work area at which the worker is executing the motion indicated by the extracted motion information, the second moving image captured at the position at which the element motion indicated by the extracted motion information is performed can be displayed among the plurality of second moving images captured at a plurality of positions in the work area, and the details of the element motion can be confirmed.
  • Further, by defining the reference motion information for each of the plurality of element motions, it is possible to easily check whether any element motion executed by the worker has been omitted, and the burden of checking whether the worker is performing the appropriate motions can be reduced.
  • FIG. 15 is a diagram showing an example of the third correspondence table D6 stored by the motion analysis system 100 according to the present embodiment.
  • The figure shows an example in which the correspondence among the coordinate ranges of a plurality of joints, the plurality of element motions, and the plurality of imaging units is recorded.
  • In the third correspondence table D6 of this example, items such as "right hand range (X coordinate)", "right hand range (Y coordinate)", "element motion", "imaging unit", and "remarks" are shown as an example.
  • The correspondence among the ranges of the three-dimensional coordinates of the plurality of joints, the plurality of element motions, and the plurality of imaging units is shown one-to-one. For example, it is shown that when the "right hand range (X coordinate)" is "600 to 700" and the "right hand range (Y coordinate)" is "200 to 300", the corresponding element motion is "part pick" and the corresponding imaging unit is the "first imaging unit" (first imaging unit 20a).
  • Although the description of the coordinate ranges of the other joints is omitted, one row of the third correspondence table D6 may include ranges of three-dimensional coordinate values for each joint, such as the wrist, elbow, shoulder, head, hip, knee, and ankle.
  • The third correspondence table D6 may also indicate the correspondence among the ranges of the three-dimensional coordinates of the plurality of joints, the plurality of element motions, and the plurality of imaging units one-to-many; that is, two or more imaging units may correspond to one combination of a range of three-dimensional coordinates and an element motion. In that case, the motion may be shown from different angles and different distances.
  • FIG. 16 is a flowchart of a fourth example of the moving image display process executed by the motion analysis system 100 according to the present embodiment.
  • The fourth example of the moving image display process is another example of the process (S47) of displaying, after the operation information is extracted in the second example of the operation information extraction process, the portion of the moving image in which the operation indicated by the extracted operation information is being executed.
  • First, the motion analysis system 100 refers to the third correspondence table D6 and searches for the rows of the third correspondence table D6 that include the largest number of the coordinate values included in the extracted motion information (S475).
  • For example, when the X coordinates of the right hand included in the extracted motion information fall in the range of 550 to 650, the "right hand range (X coordinate)" shown in the second and third rows from the top of the third correspondence table D6 is identified as including the largest number of the coordinate values included in the extracted motion information. Similarly, for the Y and Z coordinates of the right hand and the three-dimensional coordinates of the other joints, the rows with the greatest overlap with the coordinate ranges shown in the third correspondence table D6 are searched for.
  • Next, the motion analysis system 100 specifies, among the searched rows, the row corresponding to the element motion indicated by the extracted motion information (S476). For example, when the second and third rows from the top of the third correspondence table D6 are found as the rows including the largest number of the coordinate values included in the extracted motion information, and the element motion indicated by the extracted motion information is a part pick, the second row from the top of the third correspondence table D6 is specified as the row corresponding to the element motion indicated by the extracted motion information.
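The specifying step above amounts to filtering the rows found by the coordinate-range search down to the one whose element motion matches. This is an illustrative sketch, not the patent's implementation; the miniature table and the name `specify_row` are hypothetical:

```python
def specify_row(table, candidate_indices, element_motion):
    """Among the rows found by the coordinate-range search, pick the
    one whose element motion matches the extracted motion."""
    for idx in candidate_indices:
        if table[idx]["element"] == element_motion:
            return idx
    return None  # no candidate row matches the element motion

# Hypothetical miniature of the third correspondence table D6.
table = [
    {"element": "part pick", "unit": "first imaging unit 20a"},
    {"element": "arrangement", "unit": "second imaging unit 20b"},
]
row = specify_row(table, [0, 1], "part pick")
```

The imaging unit recorded in the specified row then gives the position estimate used in the following step.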
  • Then, the motion analysis system 100 refers to the specified row to estimate the position at which the motion indicated by the extracted motion information was executed (S477). For example, when the specified row is the second row from the top of the third correspondence table D6, the position at which the motion was executed is estimated from the corresponding imaging unit, the "first imaging unit 20a".
  • Here, the position may be estimated based on, for example, the correspondence between the plurality of imaging units and their installation positions in the work area R, or based on the specifications of the imaging units.
  • Thereafter, the motion analysis system 100 displays, for the first moving image, which is a moving image of the entire work area R captured by the measurement unit 30 (fourth imaging unit), and for the second moving image, which is captured at the estimated position and covers an area narrower than the work area R, the portion of the time zone in which the operation indicated by the extracted operation information is being executed (S478).
  • The portion of the time zone in which the operation indicated by the extracted operation information is being executed is specified by cutting out the first moving image and the second moving image based on the times of the start row and the end row of the extracted operation information.
  • Thereafter, the motion analysis system 100 displays the element motion indicated by the extracted motion information (S479), and displays the identification information of the worker whose extracted motion information was measured (S4710). For example, if the element motion indicated by the extracted motion information is a part pick and the worker who executed the element motion is the first worker A1, the motion analysis system 100 displays on the display unit 10f, together with the cut-out first moving image and second moving image, that the content of the moving images is a part pick and that the worker who executed the element motion is the first worker A1. Note that the element motion may be displayed using any information that can distinguish it among the plurality of element motions, and the worker may be displayed using any information that can distinguish the worker among the plurality of workers. With this, the fourth example of the moving image display process ends.
  • In this way, by estimating, based on the coordinate values of the joints included in the extracted motion information and the element motion indicated by the extracted motion information, the position in the work area at which the worker is executing the motion indicated by the extracted motion information, the second moving image captured at the position at which the element motion indicated by the extracted motion information is performed can be displayed among the plurality of second moving images captured at a plurality of positions in the work area, and the details of the element motion can be confirmed.
  • FIG. 17 is a flowchart of the process of selecting a display mode performed by the motion analysis system 100 according to a modified example of the present embodiment.
  • The motion analysis system 100 according to the present embodiment displays an outline of the extracted motion information, a first moving image (whole moving image) obtained by capturing the entire work area R, and a second moving image (hand moving image) obtained by capturing the hands of the worker.
  • The motion analysis system 100 according to the present modification can change the content displayed on the display unit 10f according to the user's selection.
  • The motion analysis system 100 determines whether to display the reference motion together (S50). Whether or not to display the reference motion together may be specified based on an input from the user via the input unit 10e.
  • When the reference motion is to be displayed together (S50: YES), the first moving image is displayed, and the portion of the second moving image in which the motion indicated by the extracted motion information is being executed and the moving image in which the reference motion indicated by the reference motion information is being executed are displayed (S51).
  • The moving image in which the reference motion indicated by the reference motion information is being executed may be one captured in advance and stored in the storage unit 14, or may be selected from the moving images in the moving image history 14a that were not extracted by the extraction unit 15.
  • The motion analysis system 100 determines whether to display a graph indicating the motion information together (S52). Whether or not to display the graph together may be specified based on an input from the user via the input unit 10e.
  • When the graph is to be displayed together (S52: YES), a graph showing the extracted motion information is displayed (S53).
  • The graph showing the extracted motion information may be a graph of any form; for example, it may be a graph plotting the coordinate values of each joint on the vertical axis against the elapsed time on the horizontal axis.
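Such a graph could be prepared, for example, by converting timestamped joint records into per-axis series with elapsed time as the horizontal axis. The record layout below is an assumption made for illustration; the actual storage format of the motion information is not specified here:

```python
def to_graph_series(records):
    """Convert joint-coordinate records into series for a graph with
    elapsed time on the horizontal axis and coordinate values on the
    vertical axis.

    records: list of (timestamp_seconds, x, y, z) tuples for one joint,
        e.g. the right hand, in measurement order
    Returns (elapsed, xs, ys, zs), each a list.
    """
    t0 = records[0][0]                      # time origin of the extracted segment
    elapsed = [t - t0 for t, _, _, _ in records]
    xs = [x for _, x, _, _ in records]
    ys = [y for _, _, y, _ in records]
    zs = [z for _, _, _, z in records]
    return elapsed, xs, ys, zs

records = [(100.0, 0.1, 0.2, 0.3), (100.5, 0.2, 0.2, 0.4), (101.0, 0.3, 0.1, 0.5)]
elapsed, xs, ys, zs = to_graph_series(records)
print(elapsed)  # → [0.0, 0.5, 1.0]
```

The resulting series could then be drawn with any plotting tool, using for instance a solid, broken, and dash-dot line per axis.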
  • The scene in which the specific motion satisfying the predetermined condition was executed can thus be confirmed from two different points of view: the moving image and the graph.
  • The motion analysis system 100 determines whether to display a superimposed image (S54). Whether or not to display the superimposed image may be specified based on an input from the user via the input unit 10e.
  • When the superimposed image is to be displayed (S54: YES), the first moving image is displayed, and a plurality of frames included in the scene of the second moving image in which the worker is executing the motion indicated by the extracted motion information are superimposed and displayed as a single image (S55).
  • In S55, the superimposed image may be generated, for example, by thinning out a plurality of frames included in the scene of the second moving image in which the worker is executing the motion indicated by the extracted motion information, applying transparency processing to them, and compositing them into one image.
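A minimal sketch of this generation step, assuming single-channel frames represented as nested lists and uniform transparency weights (a real implementation would operate on decoded multi-channel video frames; the function name and thinning interval are illustrative assumptions):

```python
def superimpose(frames, step=5):
    """Generate one superimposed image from the scene's frames: thin out
    the frames (keep every `step`-th one), give each kept frame equal
    transparency, and composite them by averaging pixel values.

    frames: list of equally sized 2-D lists of pixel values (one channel)
    """
    thinned = frames[::step]                 # thinning-out step
    n = len(thinned)
    h, w = len(thinned[0]), len(thinned[0][0])
    # equal-weight (1/n) blending == uniform transparency composite
    return [[round(sum(f[r][c] for f in thinned) / n) for c in range(w)]
            for r in range(h)]

# six dummy 2x2 frames with increasing brightness
frames = [[[v, v], [v, v]] for v in (0, 50, 100, 150, 200, 250)]
img = superimpose(frames, step=5)   # keeps frames 0 and 5
print(img)  # → [[125, 125], [125, 125]]
```

Averaging makes moving body parts appear as faint overlapping traces while the static background stays sharp, which is what lets one image summarize the whole motion.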
  • FIG. 18 is an example of a screen DP displayed by the motion analysis system 100 according to the present modification.
  • The screen DP is an example of the case where a graph indicating the motion information is displayed together (S53) in the display mode selection process.
  • The screen DP includes a graph DP4, a whole moving image DP2, and a hand moving image DP3.
  • The graph DP4 of this example is a graph showing the extracted motion information, plotting the X coordinate value ("right hand X"), the Y coordinate value ("right hand Y"), and the Z coordinate value ("right hand Z") of the worker's right hand.
  • The X coordinate value ("right hand X") of the right hand is indicated by a solid line, the Y coordinate value ("right hand Y") by a broken line, and the Z coordinate value ("right hand Z") by an alternate long and short dash line. The time-series change of the coordinate values of the right hand is represented for the period of the extracted motion information.
  • The whole moving image DP2 is the first moving image obtained by photographing the entire work area R with the measurement unit 30 (fourth imaging unit); "2017/7/7 10:10:44.138" is shown at the lower right of the moving image screen.
  • The whole moving image DP2 makes it possible to understand in general how the plurality of workers are performing their operations. Further, in the whole moving image DP2, the positions of the joints of the plurality of workers detected by the measurement unit 30 are indicated by skeleton models. This makes it possible to confirm that the coordinates of the joints of the plurality of workers are measured at appropriate positions.
  • The hand moving image DP3 is the second moving image obtained by capturing an area narrower than the work area R with the third imaging unit 20c; "2017/7/7 10:10:44.138" is shown as the start time at the lower right of the moving image screen.
  • "Packaging process" and "worker ID: A1" are displayed on the moving image screen; that is, information identifying the process indicated by the extracted motion information and identification information of the worker for whom the extracted motion information was measured are displayed. The details of the operation actually performed by the worker can be confirmed in the hand moving image DP3, in which the worker's hands are shown in close-up.
  • For the first moving image and the second moving image captured at the estimated position, the portion in which the motion indicated by the extracted motion information is being executed may be displayed, and the first moving image and the second moving image in which the reference motion is being executed may also be displayed.
  • In that case, the first moving images may be displayed adjacent to each other, and the second moving images adjacent to each other, to facilitate comparison.
  • Alternatively, for the first moving image, the portion of the time zone in which the motion indicated by the extracted motion information is executed may be displayed, and for the second moving image captured at the estimated position, a plurality of frames may be superimposed and displayed as a single image.
  • the motion analysis system 100 is not limited to one that measures motion information indicating a worker's motion performed in a certain work area of a manufacturing line.
  • For example, the motion analysis system 100 can also measure motion information indicating the motion of a person playing sports, such as the swing of a golf club, or the motion of a person giving a performance using the body, such as in theater. The motion analysis system 100 can be used whenever a motion serving as a model can be defined: by storing motion information indicating the model motion in the storage unit in advance as the reference motion information and comparing it with the measured motion information, it is possible to display the scene of the moving image in which a motion deviating from the model was performed.
  • The motion analysis system 100 includes a plurality of imaging units, and displays, using the moving images captured by the plurality of imaging units, the scene in which the worker is executing the motion indicated by the extracted motion information.
  • However, the motion analysis system may extract information other than moving images.
  • FIG. 19 is a diagram showing functional blocks of a motion analysis system 100g according to another embodiment of the present disclosure.
  • The motion analysis system 100g includes a measurement unit 30 that measures motion information indicating a worker's motion performed in a certain work area; a first sensor 40a, a second sensor 40b, and a third sensor 40c that sense the motion performed by the worker; an extraction unit 15 that extracts motion information satisfying a predetermined condition by comparing the motion information with the reference motion information; and an output unit 18 that outputs, using the sensing data sensed by the first sensor 40a, the second sensor 40b, and the third sensor 40c, a scene in which the worker is performing the motion indicated by the extracted motion information.
  • The motion analysis system 100g may have a configuration in which the imaging units (that is, the image sensors) of the motion analysis system 100 according to the present embodiment are replaced with other sensors.
  • The output unit 18 may output the scene in which the worker is executing the motion indicated by the extracted motion information to the display unit 10f, or may output the scene to another analysis device.
  • the second acquisition unit 12 of the motion analysis device 10g included in the motion analysis system 100g may obtain sensing data obtained by sensing a motion performed by the worker.
  • the storage unit 14 may store a sensing data history 14e which is a history of sensing data sensed by the first sensor 40a, the second sensor 40b, and the third sensor 40c for an operation performed by the worker.
  • the motion analysis device 10g may have the same configuration as the motion analysis device 10 according to the present embodiment.
  • As the number of installed sensors increases, the number of sensing data to be confirmed increases, and when the individual sensing data becomes long, even if a scene in which a specific motion was performed is recorded in some part of the recorded sensing data, it becomes difficult to understand when the specific motion was performed, and it becomes difficult to extract the scene in which the specific motion was performed.
  • According to the present embodiment, the motion performed by the worker is sensed by a sensor, and the portion of the data sensed and recorded by the sensor in which the motion indicated by the extracted motion information is being executed is specified.
  • Thereby, a motion analysis system, a motion analysis method, and a motion analysis program can be provided that can extract the scene in which a specific motion was performed, regardless of the number or length of the sensing data.
  • the scene in which the specific operation is performed may be a scene of a moving image sensed by an image sensor, or may be a scene of sensing data sensed by another sensor.
  • Embodiments of the present invention may also be described as the following appendices. However, embodiments of the present invention are not limited to the modes described in the following appendices. In addition, the embodiments of the present invention may be in the form of replacing or combining the descriptions of the supplementary notes.
  • the measurement unit (30) measures a plurality of the operation information respectively indicating the operations of the plurality of the workers;
  • the display unit (10f) displays information for identifying the worker for which the extracted operation information has been measured.
  • the motion analysis system according to appendix 1.
  • The imaging unit (20a, 20b, 20c, 30) includes a first imaging unit (30) that captures a first moving image obtained by imaging the work area, and a second imaging unit (20a, 20b, 20c) that captures a second moving image obtained by imaging a portion of the work area.
  • the display unit (10f) uses the first moving image and the second moving image to display a scene of a time zone in which the worker is executing the operation indicated by the extracted operation information, respectively.
  • the motion analysis system according to Appendix 1 or 2.
  • The second imaging unit (20a, 20b, 20c) captures a plurality of second moving images obtained by imaging a plurality of portions of the work area, and the system further comprises an estimation unit (17) for estimating the position in the work area where the operation indicated by the extracted operation information was executed.
  • The display unit (10f) displays, among the plurality of second moving images, the second moving image captured at the estimated position.
  • The reference operation information is determined for each of a plurality of processes, and the extraction unit (15) compares the operation information with the reference operation information for each of the plurality of processes to extract the operation information satisfying the predetermined condition. The display unit (10f) displays information identifying the process indicated by the extracted operation information.
  • the motion analysis system according to appendix 4.
  • the estimation unit (17) estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the process indicated by the extracted operation information.
  • the motion analysis system according to appendix 5.
  • the motion information and the reference motion information each include coordinate values of joints of the worker
  • The extraction unit (15) compares the coordinate values included in the operation information with the coordinate values included in the reference operation information to extract the operation information satisfying the predetermined condition.
  • the motion analysis system according to appendix 4.
  • the estimation unit (17) estimates a position in the work area where the worker is executing the operation indicated by the extracted operation information based on the coordinate value included in the extracted operation information.
  • the reference operation information is determined for each of a plurality of element operations,
  • The estimation unit (17) estimates, based on the coordinate values included in the extracted operation information and the element operation indicated by the extracted operation information, the position in the work area where the worker is executing the operation indicated by the extracted operation information. The motion analysis system according to appendix 7 or 8.
  • The display unit (10f) displays, using the moving image, a moving image including the scene in which the worker is executing the operation indicated by the extracted operation information and the scene in which the reference operation indicated by the reference operation information is being executed. The motion analysis system according to any one of appendices 1 to 9.
  • The display unit (10f) displays a graph indicating the extracted operation information together with the scene of the moving image in which the worker is executing the operation indicated by the extracted operation information. The motion analysis system according to any one of appendices 1 to 10.
  • The display unit (10f) superimposes a plurality of frames included in the scene of the moving image in which the worker is executing the operation indicated by the extracted operation information, and displays them as a single image.
  • the motion analysis system according to any one of appendices 1 to 11.
  • Operation analysis device comprising:
  • [Supplementary Note 14] An operation analysis method including: obtaining operation information indicating an operation of one or more workers performed in a work area; acquiring a moving image including a scene in which the worker is performing the operation; obtaining reference operation information indicating a reference operation serving as a reference for comparison of the operation of the worker; extracting the operation information satisfying a predetermined condition by comparing the operation information with the reference operation information; and specifying, using the moving image, the scene in which the worker is executing the operation indicated by the extracted operation information.
  • An operation analysis program that causes a computing device provided in the operation analysis device to operate as: a first acquisition unit (11) for acquiring operation information indicating an operation of one or more workers executed in a certain work area; a second acquisition unit (12) for acquiring a moving image including a scene in which the worker is executing the operation; a third acquisition unit (13) for acquiring reference operation information indicating a reference operation serving as a reference for comparison of the operation of the worker; an extraction unit (15) for extracting the operation information satisfying a predetermined condition by comparing the operation information with the reference operation information; and a specifying unit (16) for specifying, using the moving image, the scene in which the worker is executing the operation indicated by the extracted operation information.
  • An output unit (18) for outputting a scene in which the worker is performing an operation indicated by the extracted operation information using sensing data sensed by the sensor (40a, 40b, 40c, 30);
  • the measurement unit (30) measures a plurality of the operation information respectively indicating the operations of the plurality of the workers;
  • the output unit (18) outputs information for identifying the worker for which the extracted operation information has been measured.
  • the motion analysis system according to appendix 16.
  • The sensor (40a, 40b, 40c, 30) includes a first sensor (30) that senses first sensing data obtained by sensing the work area, and a second sensor (40a, 40b, 40c) that senses second sensing data obtained by sensing a portion of the work area.
  • the output unit (18) outputs, using the first sensing data and the second sensing data, a scene of a time zone in which the worker is executing the operation indicated by the extracted operation information, respectively.
  • the motion analysis system according to appendix 16 or 17.
  • The second sensor (40a, 40b, 40c) senses a plurality of second sensing data obtained by sensing a plurality of portions of the work area, and the system further comprises an estimation unit (17) for estimating the position in the work area where the operation indicated by the extracted operation information was executed. The output unit (18) outputs, among the plurality of second sensing data, the second sensing data sensed at the estimated position. The motion analysis system according to appendix 18.
  • the reference operation information is determined for each of a plurality of steps
  • the extraction unit (15) compares the operation information with the reference operation information for each of a plurality of steps, and extracts the operation information satisfying the predetermined condition
  • the output unit (18) outputs information identifying a process indicated by the extracted operation information.
  • the motion analysis system according to appendix 19.
  • the estimation unit (17) estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the process indicated by the extracted operation information.
  • the motion information and the reference motion information each include coordinate values of joints of the worker
  • The extraction unit (15) compares the coordinate values included in the operation information with the coordinate values included in the reference operation information to extract the operation information satisfying the predetermined condition.
  • the motion analysis system according to appendix 19.
  • the estimation unit (17) estimates a position in the work area where the worker is executing the operation indicated by the extracted operation information based on the coordinate value included in the extracted operation information.
  • the reference operation information is determined for each of a plurality of element operations,
  • The estimation unit (17) estimates, based on the coordinate values included in the extracted operation information and the element operation indicated by the extracted operation information, the position in the work area where the worker is executing the operation indicated by the extracted operation information. The motion analysis system according to appendix 22 or 23.
  • The output unit (18) outputs sensing data including the scene in which the worker is executing the operation indicated by the extracted operation information and the scene in which the reference operation indicated by the reference operation information is being executed. The motion analysis system according to any one of appendices 16 to 24.
  • The output unit (18) outputs a graph indicating the extracted operation information together with the scene in which the worker is executing the operation indicated by the extracted operation information in the sensing data.
  • the motion analysis system according to any one of appendices 16 to 25.
  • Operation analysis device comprising:
  • [Supplementary Note 28] An operation analysis method including: obtaining operation information indicating an operation of one or more workers performed in a work area; obtaining sensing data obtained by sensing the operation performed by the worker; obtaining reference operation information indicating a reference operation serving as a reference for comparison of the operation of the worker; extracting the operation information satisfying a predetermined condition by comparing the operation information with the reference operation information; and outputting, using the sensing data, the scene in which the worker is executing the operation indicated by the extracted operation information.
  • An operation analysis program that causes a computing device provided in the operation analysis device to operate as: a first acquisition unit (11) for acquiring operation information indicating an operation of one or more workers executed in a certain work area; a second acquisition unit (12) for acquiring sensing data obtained by sensing the operation performed by the worker; a third acquisition unit (13) for acquiring reference operation information indicating a reference operation serving as a reference for comparison of the operation of the worker; an extraction unit (15) for extracting the operation information satisfying a predetermined condition by comparing the operation information with the reference operation information; and an output unit (18) for outputting, using the sensing data, the scene in which the worker is executing the operation indicated by the extracted operation information.

Abstract

Provided is a motion-analyzing system with which it is possible to easily extract a scene in which a specific motion has been executed. The motion-analyzing system is provided with a measurement unit for measuring motion information that indicates the motion of one or more workers which has been executed in some work area; an imaging unit for capturing a moving-image that includes a scene in which the workers are executing a motion; a storage unit for storing reference motion information that indicates a reference motion that serves as a reference for comparison with regard to the motion of the workers; an extraction unit for comparing the motion information with the reference motion information and extracting motion information that satisfies a prescribed condition; and a display unit for displaying, using the moving-image, a scene in which the workers are executing the motion indicated by the extracted motion information.

Description

Motion analysis system, motion analysis device, motion analysis method, and motion analysis program
 The present invention relates to a motion analysis system, a motion analysis device, a motion analysis method, and a motion analysis program.
 Conventionally, one or more cameras may be installed on a product manufacturing line to record workers' operations as images or moving images. In recent years, motion capture that can measure the three-dimensional positions and movement speeds of joints without attaching trackers has begun to spread, and in addition to images and moving images, motion information that quantitatively indicates the motion of the worker may be acquired.
 Patent Document 1 below describes a work motion analysis system that acquires a moving image of a worker and a motion signal representing changes in feature amounts of the worker's motion, compares the motion signal obtained when the work result was good with the motion signal obtained when the work result was poor to extract differences, and displays a moving image corresponding to the differences.
JP 2009-110239 A
 When various operations are performed on a manufacturing line, it may be desirable to confirm the scene in which a worker performed a specific motion. To meet such a demand, one or more cameras may be installed on the manufacturing line to continuously record the workers' motions as moving images. Here, according to the technology described in Patent Document 1, for example, in order to identify the problem when the result of the work was poor, the moving image corresponding to the differences between the case where the work result was good and the case where it was poor is displayed. However, when the work result is not necessarily clear, the technology described in Patent Document 1 cannot be applied, and the scene in which the worker performed a specific motion cannot be extracted.
 Moreover, even if one or more cameras are installed on the manufacturing line and the workers' motions are continuously recorded as moving images, when the number of moving images requiring confirmation increases with the number of installed cameras, or when the individual moving images become long, it becomes difficult to understand when a specific motion was performed, even if the scene in which it was performed is recorded in some part of the captured moving images, and it becomes difficult to extract the scene in which the specific motion was performed.
 Therefore, the present invention provides a motion analysis system, a motion analysis device, a motion analysis method, and a motion analysis program that can easily extract the scene in which a specific motion was performed.
 A motion analysis system according to one aspect of the present disclosure includes: a measurement unit that measures motion information indicating the motion of one or more workers performed in a certain work area; an imaging unit that captures a moving image including a scene in which a worker is performing a motion; a storage unit that stores reference motion information indicating a reference motion serving as a reference for comparison with the worker's motion; an extraction unit that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition; and a display unit that displays, using the moving image, the scene in which the worker is executing the motion indicated by the extracted motion information. Here, the motion information may be information indicating the movement of the worker's body, for example information indicating the displacement of representative positions of the worker's body. A representative position of the worker's body may be a single position, but typically there are several. The motion information can be calculated, for example, by projecting pattern light onto the worker and extracting feature points from the captured moving image. The reference motion may be a standard motion that the worker should follow, or a non-standard motion of the worker such as a mistake case. When the reference motion is a standard motion, the predetermined condition may be that the divergence between the motion information and the reference motion information is equal to or greater than a threshold. When the reference motion is a non-standard motion, the predetermined condition may be that the divergence between the motion information and the reference motion information is equal to or less than a threshold. The reference motion information only needs to be comparable with the motion information and need not have the same data format as the motion information. The reference motion information and the motion information may be time-series data. Also, "the scene in which the worker is executing the motion indicated by the extracted motion information" may be the scene of the time zone in one moving image during which the worker is executing that motion, or the scenes of some of a plurality of moving images in which the worker is executing it. The motion indicated by the motion information may be the motion corresponding to the quantitative information on the worker's motion indicated by the motion information.
 According to this aspect, among the motion information quantitatively indicating the workers' motions, the motion information satisfying the predetermined condition in comparison with the reference motion information is extracted, and the scene in which the worker is executing the motion indicated by the extracted motion information is displayed using the moving image, so that the scene in which a specific motion satisfying the predetermined condition was performed can be easily extracted. By extracting such a scene, the scene in which the worker performed the specific motion can be extracted regardless of whether the result of the worker's work was good or bad. Further, even when the number of moving images requiring confirmation increases or the individual moving images become long, the scene in which the specific motion was performed can be easily extracted, reducing the burden of the extraction work.
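The threshold conditions described above (divergence at or above a threshold for a standard reference motion, at or below it for a non-standard one such as a mistake case) might be sketched as follows; the use of mean absolute deviation as the divergence measure and the function names are illustrative assumptions, not definitions taken from the disclosure:

```python
def extract(motion, reference, threshold, reference_is_standard=True):
    """Decide whether measured motion information satisfies the
    predetermined condition relative to the reference motion information.

    motion, reference: equal-length time series of a joint coordinate value
    """
    # divergence measure (assumed): mean absolute deviation over the series
    deviation = sum(abs(m - r) for m, r in zip(motion, reference)) / len(motion)
    if reference_is_standard:
        return deviation >= threshold   # deviates from the model motion
    return deviation <= threshold       # resembles the mistake case

motion = [0.0, 0.2, 0.5, 0.9]
reference = [0.0, 0.1, 0.4, 0.6]
print(extract(motion, reference, threshold=0.1))  # → True
```

In a fuller implementation this test would run per joint and per time window, and the windows that satisfy the condition would index the moving-image scenes to display.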
In the above aspect, the measurement unit may measure a plurality of pieces of motion information respectively indicating the motions of a plurality of workers, and the display unit may display information identifying the worker for whom the extracted motion information was measured.
According to this aspect, motion information satisfying the predetermined condition in comparison with the reference motion information is extracted from the plurality of pieces of motion information respectively indicating the motions of the plurality of workers, and information identifying the worker for whom the extracted motion information was measured is displayed. This makes it possible to identify which of the plurality of workers performed the specific motion satisfying the predetermined condition, reducing, for example, the burden of confirming who performed the specific motion.
In the above aspect, the imaging unit may include a first imaging unit that captures a first moving image of the work area and a second imaging unit that captures a second moving image of a part of the work area, and the display unit may use the first moving image and the second moving image to display the respective scenes of the time period during which the worker is executing the motion indicated by the extracted motion information.
According to this aspect, displaying, from the first moving image of the work area, the scene of the time period during which the worker is executing the motion indicated by the extracted motion information allows the specific motion satisfying the predetermined condition to be checked as a whole, while displaying the corresponding scene from the second moving image of a part of the work area allows the details of that specific motion to be checked. Therefore, even when the moving images become long, the scenes of the time period in which the specific motion was performed can be extracted easily, reducing the burden of the extraction work.
In the above aspect, the second imaging unit may capture a plurality of second moving images respectively covering a plurality of parts of the work area, the system may further include an estimation unit that estimates the position in the work area where the motion indicated by the extracted motion information was performed, and the display unit may display, among the plurality of second moving images, the second moving image captured at the estimated position.
According to this aspect, estimating the position in the work area where the motion indicated by the extracted motion information was performed makes it possible to estimate where the specific motion satisfying the predetermined condition was performed, and displaying, among the plurality of second moving images captured at a plurality of positions in the work area, the second moving image captured at the estimated position makes it possible to check the details of the motion performed at that position. Therefore, for example, the burden of confirming where the specific motion was performed can be reduced.
In the above aspect, the reference motion information may be defined for each of a plurality of processes, the extraction unit may compare the motion information with the reference motion information for each of the plurality of processes and extract motion information satisfying the predetermined condition, and the display unit may display information identifying the process indicated by the extracted motion information.
According to this aspect, motion information satisfying the predetermined condition in comparison with the reference motion information defined for each of the plurality of processes is extracted from the motion information indicating the worker's motion, and information identifying the process indicated by the extracted motion information is displayed. This makes it possible to confirm in which process the specific motion satisfying the predetermined condition was performed, reducing, for example, the burden of confirming in which process the specific motion was performed.
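The per-process comparison described above can be sketched roughly as follows. This is an illustration only, not the disclosed implementation: the process names, the reduction of each motion to a single scalar (e.g. a duration in seconds), and the threshold value are all hypothetical simplifications.

```python
# Hypothetical: reference motion information defined per process, here
# reduced to one scalar per process (e.g. a standard duration in seconds).
REFERENCE_BY_PROCESS = {"picking": 1.2, "placing": 0.8}

def extract_processes(records, threshold=0.3):
    """Compare each (process, measured) record against the reference for
    its own process; return the processes whose deviation meets or exceeds
    the threshold (the condition used when the reference is a standard motion)."""
    flagged = []
    for process, measured in records:
        deviation = abs(measured - REFERENCE_BY_PROCESS[process])
        if deviation >= threshold:
            flagged.append(process)
    return flagged

print(extract_processes([("picking", 1.3), ("placing", 1.5)]))  # ['placing']
```

Keeping the reference information keyed by process is what lets the display side report not just that a deviating motion occurred, but in which process it occurred.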
In the above aspect, the estimation unit may estimate the position in the work area where the worker is executing the motion indicated by the extracted motion information, based on the process indicated by the extracted motion information.
According to this aspect, the position in the work area where the worker is executing the motion indicated by the extracted motion information is estimated based on the process indicated by that motion information. This makes it possible to display, among the plurality of second moving images captured at a plurality of positions in the work area, the second moving image captured at the position where the process indicated by the extracted motion information was performed, and to check the details of the motion performed in that process. Therefore, for example, the burden of confirming where, and for which process, the specific motion was performed can be reduced.
In the above aspect, the motion information and the reference motion information may each include coordinate values of the worker's joints, and the extraction unit may compare the coordinate values included in the motion information with the coordinate values included in the reference motion information to extract motion information satisfying the predetermined condition.
According to this aspect, comparing the joint coordinate values included in the motion information with the joint coordinate values included in the reference motion information makes it possible to accurately evaluate how far the motion information deviates from the reference motion information, and thus to appropriately extract motion information satisfying the predetermined condition.
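As an illustration of one way such a joint-coordinate comparison might be computed (the patent does not specify the metric), the deviation between a measured sample and the corresponding reference sample can be taken as the mean Euclidean distance over the joints. The joint names and coordinate values below are hypothetical:

```python
import math

def joint_deviation(sample, reference):
    """Mean Euclidean distance between corresponding joints.
    Each argument maps a joint name to an (x, y, z) coordinate in meters."""
    dists = [
        math.dist(sample[j], reference[j])  # per-joint Euclidean distance
        for j in reference
    ]
    return sum(dists) / len(dists)

# Hypothetical measured vs. reference (standard-motion) samples.
measured = {"right_wrist": (0.42, 1.10, 0.30), "right_elbow": (0.40, 1.30, 0.25)}
standard = {"right_wrist": (0.40, 1.10, 0.30), "right_elbow": (0.40, 1.30, 0.25)}
print(round(joint_deviation(measured, standard), 3))  # 0.01
```

A scalar like this can then be tested against the threshold of the predetermined condition; other aggregations (maximum over joints, per-joint thresholds) would serve equally well depending on which deviations matter.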
In the above aspect, the estimation unit may estimate the position in the work area where the worker is executing the motion indicated by the extracted motion information, based on the coordinate values included in the extracted motion information.
According to this aspect, estimating the position in the work area where the worker is executing the motion indicated by the extracted motion information, based on the joint coordinate values included in that motion information, makes it possible to accurately estimate where the motion indicated by the extracted motion information was performed.
In the above aspect, the reference motion information may be defined for each of a plurality of element motions, and the estimation unit may estimate the position in the work area where the worker is executing the motion indicated by the extracted motion information, based on the coordinate values included in the extracted motion information and the element motion indicated by the extracted motion information. Here, an element motion is a unit of motion performed by the worker and includes, for example, motions such as picking a part, placing a part, fixing a part, and packing a product. An element motion may be a building block of the worker's motion, and a series of the worker's motions may be composed of a combination of one or more element motions.
According to this aspect, the position in the work area where the worker is executing the motion indicated by the extracted motion information is estimated based on the joint coordinate values included in the extracted motion information and the element motion indicated by it. This makes it possible to display, among the plurality of second moving images captured at a plurality of positions in the work area, the second moving image captured at the position where the element motion indicated by the extracted motion information was performed, and to check the details of that element motion. Therefore, for example, the burden of confirming where, and for which element motion, the specific motion was performed can be reduced.
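A rough sketch of how an estimated position might drive the choice of second moving image follows. Everything here is an assumption for illustration: the camera names, the one-dimensional layout of the work area, and the use of a mean joint x-coordinate as the estimated work position.

```python
# Hypothetical: x-coordinates (meters) of the sub-area centers covered by
# each second imaging unit along a one-dimensional manufacturing line.
CAMERA_POSITIONS = {"camera_20b": 1.0, "camera_20c": 3.0}

def select_camera(joint_xs, cameras=CAMERA_POSITIONS):
    """Estimate the work position as the mean of the joint x-coordinates in
    the extracted motion information, then pick the camera whose covered
    sub-area center is nearest to that position."""
    estimated_x = sum(joint_xs) / len(joint_xs)
    return min(cameras, key=lambda name: abs(cameras[name] - estimated_x))

print(select_camera([2.8, 3.1, 2.9]))  # nearest to x = 3.0 -> camera_20c
```

When the element motion is also known, the candidate set could first be narrowed to the cameras covering the station where that element motion is normally performed, before applying the nearest-position rule.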
In the above aspect, the display unit may display, using the moving image, a scene in which the worker is executing the motion indicated by the extracted motion information, together with a moving image in which the worker is executing the reference motion indicated by the reference motion information. For example, the display unit may display the two side by side on one screen at the same time, superimpose them on one screen at the same time, or display them alternately in succession.
According to this aspect, displaying a scene in which the worker is executing the motion indicated by the extracted motion information, together with a moving image in which the worker is executing the reference motion, makes it easy to compare the specific motion satisfying the predetermined condition, as performed by the worker, with the reference motion.
In the above aspect, the display unit may display a graph showing the extracted motion information and the scene of the moving image in which the worker is executing the motion indicated by the extracted motion information.
According to this aspect, displaying a graph showing the extracted motion information together with the scene of the moving image in which the worker is executing the motion indicated by that information makes it possible to examine the scene in which the specific motion satisfying the predetermined condition was performed from two different viewpoints, the moving image and the graph. Therefore, for example, the burden of checking the specific motion both qualitatively and quantitatively can be reduced.
In the above aspect, the display unit may superimpose a plurality of frames included in the scene of the moving image in which the worker is executing the motion indicated by the extracted motion information, and display them as a single image.
According to this aspect, superimposing the plurality of frames included in the scene of the moving image in which the worker is executing the motion indicated by the extracted motion information, and displaying them as a single image, allows the entire scene in which the specific motion satisfying the predetermined condition was performed to be grasped at a glance. Therefore, for example, the whole of the specific motion can be checked in a short time, reducing the burden of confirmation.
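One simple way to superimpose the frames of a scene into a single image is a per-pixel maximum (a "lighten" blend), which keeps the brightest value each pixel took over the scene so the worker's trajectory stays visible. This grayscale sketch is illustrative only and is not the method specified by the disclosure:

```python
def superimpose(frames):
    """Blend equal-sized grayscale frames (2-D lists of pixel intensities,
    0-255) into one image by taking the per-pixel maximum over all frames."""
    h, w = len(frames[0]), len(frames[0][0])
    return [
        [max(frame[y][x] for frame in frames) for x in range(w)]
        for y in range(h)
    ]

# Two tiny hypothetical frames: the bright pixel marks the worker's hand.
frame_a = [[0, 255], [0, 0]]   # hand at top-right
frame_b = [[0, 0], [255, 0]]   # hand at bottom-left
print(superimpose([frame_a, frame_b]))  # [[0, 255], [255, 0]]
```

A weighted average instead of a maximum would produce a motion-blur style composite; the maximum is preferable when the moving body is brighter than the background.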
A motion analysis apparatus according to another aspect of the present disclosure includes: a first acquisition unit that acquires motion information indicating motions of one or more workers performed in a work area; a second acquisition unit that acquires a moving image including a scene in which the worker is executing a motion; a third acquisition unit that acquires reference motion information indicating a reference motion serving as a basis of comparison for the worker's motion; an extraction unit that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition; and an identification unit that uses the moving image to identify a scene in which the worker is executing the motion indicated by the extracted motion information.
According to this aspect, motion information satisfying a predetermined condition in comparison with the reference motion information is extracted from the motion information indicating the worker's motion, and the moving image is used to identify a scene in which the worker is executing the motion indicated by the extracted motion information. This makes it possible to easily extract scenes in which a specific motion satisfying the predetermined condition was performed. Therefore, for example, scenes in which the worker performed the specific motion can be extracted regardless of whether the result of the worker's work was good or defective. Moreover, even when the number of moving images to be checked grows or the individual moving images become long, scenes in which the specific motion was performed can be extracted easily, reducing the burden of the extraction work.
A motion analysis method according to another aspect of the present disclosure includes: acquiring motion information indicating motions of one or more workers performed in a work area; acquiring a moving image including a scene in which the worker is executing a motion; acquiring reference motion information indicating a reference motion serving as a basis of comparison for the worker's motion; comparing the motion information with the reference motion information and extracting motion information satisfying a predetermined condition; and using the moving image to identify a scene in which the worker is executing the motion indicated by the extracted motion information.
According to this aspect, motion information satisfying a predetermined condition in comparison with the reference motion information is extracted from the motion information indicating the worker's motion, and the moving image is used to identify a scene in which the worker is executing the motion indicated by the extracted motion information. This makes it possible to easily extract scenes in which a specific motion satisfying the predetermined condition was performed. Therefore, for example, scenes in which the worker performed the specific motion can be extracted regardless of whether the result of the worker's work was good or defective. Moreover, even when the number of moving images to be checked grows or the individual moving images become long, scenes in which the specific motion was performed can be extracted easily, reducing the burden of the extraction work.
A motion analysis program according to another aspect of the present disclosure causes an arithmetic device provided in a motion analysis apparatus to operate as: a first acquisition unit that acquires motion information indicating motions of one or more workers performed in a work area; a second acquisition unit that acquires a moving image including a scene in which the worker is executing a motion; a third acquisition unit that acquires reference motion information indicating a reference motion serving as a basis of comparison for the worker's motion; an extraction unit that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition; and an identification unit that uses the moving image to identify a scene in which the worker is executing the motion indicated by the extracted motion information.
According to this aspect, motion information satisfying a predetermined condition in comparison with the reference motion information is extracted from the motion information indicating the worker's motion, and the moving image is used to identify a scene in which the worker is executing the motion indicated by the extracted motion information. This makes it possible to extract scenes in which a specific motion satisfying the predetermined condition was performed. Therefore, for example, scenes in which the worker performed the specific motion can be extracted regardless of whether the result of the worker's work was good or defective. Moreover, even when the number of moving images to be checked grows or the individual moving images become long, scenes in which the specific motion was performed can be extracted easily, reducing the burden of the extraction work.
According to the present invention, it is possible to provide a motion analysis system, a motion analysis apparatus, a motion analysis method, and a motion analysis program capable of easily extracting a scene in which a specific motion was performed.
A diagram showing an overview of a motion analysis system according to an embodiment of the present invention.
A diagram showing the functional blocks of the motion analysis system according to the embodiment.
A diagram showing the physical configuration of a motion analysis apparatus according to the embodiment.
A diagram showing an example of motion information measured by the motion analysis system according to the embodiment.
A diagram showing an example of first reference motion information stored by the motion analysis system according to the embodiment.
A diagram showing an example of a first correspondence table stored by the motion analysis system according to the embodiment.
A flowchart of a first example of motion information extraction processing executed by the motion analysis system according to the embodiment.
A flowchart of a first example of moving-image display processing executed by the motion analysis system according to the embodiment.
An example of a screen displayed by the motion analysis system according to the embodiment.
A diagram showing an example of a second correspondence table stored by the motion analysis system according to the embodiment.
A flowchart of a second example of moving-image display processing executed by the motion analysis system according to the embodiment.
A diagram showing an example of second reference motion information stored by the motion analysis system according to the embodiment.
A flowchart of a second example of motion information extraction processing executed by the motion analysis system according to the embodiment.
A flowchart of a third example of moving-image display processing executed by the motion analysis system according to the embodiment.
A diagram showing an example of a third correspondence table stored by the motion analysis system according to the embodiment.
A flowchart of a fourth example of moving-image display processing executed by the motion analysis system according to the embodiment.
A flowchart of display-mode selection processing executed by a motion analysis system according to a modification.
An example of a screen displayed by the motion analysis system according to the modification.
A diagram showing the functional blocks of a motion analysis system according to another embodiment of the present disclosure.
Hereinafter, an embodiment according to one aspect of the present invention (hereinafter referred to as "the present embodiment") will be described with reference to the drawings. In the drawings, elements denoted by the same reference signs have the same or similar configurations.
§1 Application Example
First, an example of a scene to which the present invention is applied will be described with reference to FIG. 1. The motion analysis system 100 according to the present embodiment includes a measurement unit 30 that measures motion information quantitatively indicating a worker's motion performed in a certain work area R, and a first imaging unit 20a, a second imaging unit 20b, and a third imaging unit 20c that capture moving images of the worker executing motions. The work area R in this example is an area including the entire manufacturing line, but the work area R may be any area, for example an area in which a predetermined process is performed or an area in which a predetermined element motion is performed. Here, an element motion is a unit of motion performed by the worker and includes, for example, motions such as picking a part, placing a part, fixing a part, and packing a product.
In this example, a case where a first worker A1 and a second worker A2 perform predetermined motions in the work area R will be described. The first worker A1 may, for example, perform motions such as picking, placing, and fixing a first part, and the second worker A2 may, for example, perform motions such as picking, placing, and fixing a second part.
The motion analysis system 100 includes a motion analysis apparatus 10. The motion analysis apparatus 10 includes a storage unit that stores reference motion information quantitatively indicating a reference motion serving as a basis of comparison for the worker's motion, an extraction unit that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition, and a display unit 10f that uses the moving images to display a scene in which the worker is executing the motion indicated by the extracted motion information. Here, the motion indicated by the motion information is the motion corresponding to the quantitative information about the worker's motion represented by the motion information. The storage unit may be separate from the motion analysis apparatus 10, as long as it can communicate with the motion analysis apparatus 10.
Here, the reference motion may be a standard motion that the worker should follow, or it may be a mistaken or otherwise non-standard motion of the worker. When the reference motion is a standard motion, the predetermined condition may be that the deviation between the motion information and the reference motion information is equal to or greater than a threshold; in this case, the extraction unit extracts motion information indicating non-standard motions. Conversely, when the reference motion is a mistaken or non-standard motion, the predetermined condition may be that the deviation between the motion information and the reference motion information is equal to or less than a threshold; in this case, the extraction unit extracts mistaken or non-standard motions. In the following, the case where the reference motion is a standard motion and the predetermined condition is that the deviation between the motion information and the reference motion information is equal to or greater than a threshold will be described. Non-standard motions may include motions that can affect product quality, such as the worker coughing or rubbing his or her nose.
The display unit 10f may display, from a single moving image, the scene of the time period during which the worker is executing the motion indicated by the extracted motion information, or may display, from a plurality of moving images, the scenes of the subset of moving images in which the worker is executing that motion.
The motion analysis system 100 uses the measurement unit 30 to measure motion information indicating displacements of representative positions of the worker's body, such as displacements of the coordinate values of a plurality of the worker's joints. Motion information measured while the standard motion that the worker should follow is being performed is then stored in advance in the storage unit as reference motion information.
Thereafter, the measurement unit 30 measures the worker's motion information, and in parallel with this measurement, the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c capture moving images of the worker executing the motions, which are stored in the storage unit. Subsequently, the extraction unit compares the measured motion information with the reference motion information stored in the storage unit, and extracts motion information whose deviation from the reference motion information is equal to or greater than the threshold. Here, the comparison between the motion information and the reference motion information may be performed, for example, by identifying the start and end points of the motion information indicating a certain motion and computing the deviation between the motion information and the reference motion information recorded at corresponding timings.
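The extraction step just described, comparing each motion segment against the reference at corresponding timings and keeping those whose deviation meets the threshold, might be sketched as follows. The scalar time series, the frame-by-frame alignment, and the threshold value are hypothetical simplifications; a real system would compare multi-joint coordinate series and might align them more robustly.

```python
def extract_segments(segments, reference, threshold):
    """Each segment has a start time, an end time, and a per-frame series of
    measured values. The deviation is the mean absolute difference against
    the reference series at corresponding timings; with a standard-motion
    reference, segments at or above the threshold are extracted."""
    extracted = []
    for seg in segments:
        series = seg["series"]
        n = min(len(series), len(reference))  # align on corresponding frames
        deviation = sum(abs(series[i] - reference[i]) for i in range(n)) / n
        if deviation >= threshold:
            extracted.append((seg["start"], seg["end"], deviation))
    return extracted

# Hypothetical data: reference series for the standard motion, and two
# measured segments delimited by their start/end times (seconds).
reference = [0.0, 0.1, 0.2, 0.1, 0.0]
segments = [
    {"start": 0.0, "end": 2.5, "series": [0.0, 0.1, 0.2, 0.1, 0.0]},  # matches
    {"start": 2.5, "end": 5.0, "series": [0.3, 0.5, 0.6, 0.4, 0.3]},  # deviates
]
print(extract_segments(segments, reference, threshold=0.1))
```

The (start, end) pair carried through extraction is exactly what the display unit 10f needs in order to seek each stored moving image to the scene of the deviating motion.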
　Then, the display unit 10f displays the scenes of the moving images in which the worker is performing the motion indicated by the extracted motion information, that is, the motion information whose deviation from the reference motion information, obtained when the standard motion that the worker should follow was performed, is equal to or greater than the threshold.
　In this way, by extracting, from the motion information quantitatively indicating the worker's motions, the motion information that satisfies a predetermined condition in comparison with the reference motion information, and by using the moving images to display the scenes in which the worker is performing the motion indicated by the extracted motion information, scenes in which a specific motion satisfying the predetermined condition was performed can be extracted easily, regardless of the number or length of the captured moving images. For example, by displaying the scene of the time period in a single moving image during which the motion indicated by the extracted motion information is performed, a scene suitable for checking a specific motion can be extracted from that moving image. Likewise, by displaying the scenes of the subset of moving images, among a plurality of moving images captured by a plurality of imaging units, in which the motion indicated by the extracted motion information is performed, the moving image captured by the imaging unit best suited to checking the specific motion can be extracted. Furthermore, extracting the motion information that satisfies the predetermined condition and then using the moving images to identify the scenes in which the worker performs the indicated motion greatly reduces the computational load compared with searching for such scenes by comparing a moving image of the worker's motion against a moving image of the reference motion.
　For example, when the reference motion is the standard motion and the predetermined condition is that the deviation between the motion information and the reference motion information is equal to or greater than a threshold, the motion analysis system 100 according to the present embodiment can extract the scenes of moving images in which a non-standard motion was performed. In the short term, this makes it possible to detect early that a non-standard motion was performed and to reduce, for example, the risk of defective products being produced on a production line. Even when a non-standard motion was performed but a non-defective product was produced, this is technically significant in that signs of an emerging problem can be captured and the subsequent verification work can be made more efficient. These are benefits obtained in real time when the motion analysis system 100 is used.
　In the long term, the system can also be used to support worker training and to provide material for considering improvements to the standard motion. For example, it is conceivable to identify the parts of the work where many workers perform non-standard motions and to improve the standard motion there. These are benefits obtained when the motion analysis system 100 is used continuously, that is, benefits obtained in non-real time.
　§2 Configuration example
　[Functional configuration]
　Next, an example of the functional configuration of the motion analysis system 100 according to the present embodiment will be described with reference to FIG. 2. The motion analysis system 100 includes the first imaging unit 20a, the second imaging unit 20b, the third imaging unit 20c, the measurement unit 30, and the motion analysis device 10. The motion analysis device 10 includes a first acquisition unit 11, a second acquisition unit 12, a third acquisition unit 13, a storage unit 14, an extraction unit 15, an identification unit 16, an estimation unit 17, and a display unit 10f.
　<Configuration of the imaging units>
　The first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c may each be configured with a general-purpose camera, and may capture moving images including scenes of the first worker A1 and the second worker A2 performing motions in the work area R. Each of the three imaging units may capture a part of the work area R, that is, a moving image of an area narrower than the work area R. Specifically, they may capture close-up moving images of the motions performed by the first worker A1 and the second worker A2, for example close-ups of the workers' hands.
　The first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c may also capture a plurality of moving images, each covering a different portion of the work area R. For example, the first imaging unit 20a may mainly capture a moving image of the first worker A1 performing a motion, the third imaging unit 20c may mainly capture a moving image of the second worker A2 performing a motion, and the second imaging unit 20b may capture moving images of both the first worker A1 and the second worker A2 performing motions. The three imaging units may also capture moving images of different processes being performed at a plurality of positions in the work area R.
　<Configuration of the measurement unit>
　The measurement unit 30 may be configured as a motion-capture system and may measure a plurality of pieces of motion information, each quantitatively indicating a motion of the first worker A1 or the second worker A2 performed in a certain work area R. The configuration of the measurement unit 30 is arbitrary; for example, it may project patterned light onto the first worker A1 and the second worker A2, capture moving images of the workers with the patterned light projected on them, and measure the coordinate values of a plurality of joints of each worker based on the captured moving images. When measuring a plurality of pieces of motion information for a plurality of workers, the measurement unit 30 may add information identifying the worker to each piece of motion information. The measurement unit 30 may also measure positions other than the joints of the first worker A1 and the second worker A2; for example, it may measure the coordinate values of representative positions of the body that are not necessarily joints, such as the fingertips or the head, or it may measure the coordinate values of the joints together with the coordinate values of other representative positions of the body. The measurement unit 30 may also measure the coordinate values of the positions of trackers worn by the first worker A1 and the second worker A2, in which case the representative positions of the body may be the tracker positions. The representative positions of the body may also be described as the positions of feature points of the body.
　The motion analysis system 100 may include a plurality of measurement units 30. When the motion information of a plurality of workers is measured by a plurality of measurement units 30, the motion information of the same worker may be measured redundantly; in that case, information identifying the worker may be added to the motion information so that duplicates can be removed or motion information measured by different measurement units 30 can be merged.
　The measurement unit 30 may also serve as a fourth imaging unit that captures moving images of the first worker A1 and the second worker A2 performing motions. The fourth imaging unit may capture a moving image of the entire work area R; that is, it may capture the first worker A1 and the second worker A2 performing motions such that both workers are included in the frame. By contrast, the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c may capture moving images of the workers performing motions such that only one of the first worker A1 and the second worker A2 is included.
　The moving image of the work area R captured by the measurement unit 30 (fourth imaging unit) corresponds to the "first moving image" of the present invention, and the moving images of parts of the work area R captured by the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c correspond to the "second moving image" of the present invention. Likewise, the measurement unit 30 (fourth imaging unit) of the present embodiment corresponds to the "first imaging unit" of the present invention, and the first imaging unit 20a, the second imaging unit 20b, and the third imaging unit 20c of the present embodiment correspond to the "second imaging unit" of the present invention.
　<Configuration of the first acquisition unit>
　The first acquisition unit 11 acquires, from the measurement unit 30, motion information quantitatively indicating the motions of the first worker A1 and the second worker A2 performed in the work area R. The motion information acquired by the first acquisition unit 11 is transmitted to the storage unit 14 and stored as the motion information history 14b. When the motion analysis system 100 includes a plurality of measurement units 30, the first acquisition unit 11 may acquire motion information from each of the measurement units 30, and may add information identifying which measurement unit 30 each piece of motion information was acquired from before transmitting it to the storage unit 14. A specific example of the motion information will be described in detail later with reference to FIG. 4. The motion information may be, for example, the coordinate values of representative positions of the worker's body measured at one-second intervals.
　<Configuration of the second acquisition unit>
　The second acquisition unit 12 acquires moving images of the first worker A1 and the second worker A2 performing motions from the first imaging unit 20a, the second imaging unit 20b, the third imaging unit 20c, and the measurement unit 30 (fourth imaging unit). The moving images acquired by the second acquisition unit 12 are transmitted to the storage unit 14 and stored as the moving image history 14a. The second acquisition unit 12 may add information identifying which of the imaging units each moving image was acquired from before transmitting it to the storage unit 14.
　<Configuration of the third acquisition unit>
　The third acquisition unit 13 acquires, from the measurement unit 30, reference motion information quantitatively indicating a reference motion that serves as a basis of comparison for the worker's motions. The reference motion information acquired by the third acquisition unit 13 is transmitted to the storage unit 14 and stored as the reference motion information 14c. The third acquisition unit 13 may also acquire, as reference motion information, motion information in the already stored motion information history 14b that was not extracted by the extraction unit 15 described later, add information indicating that it is a reference motion, and store it in the storage unit 14. Alternatively, the third acquisition unit 13 may acquire, as reference motion information, motion information in the already stored motion information history 14b that was designated by the user, add information indicating that it is a reference motion, and store it in the storage unit 14. A specific example of the reference motion information will be described in detail later with reference to FIG. 5 and other figures. The reference motion information may be, for example, the coordinate values of representative positions of the worker's body, measured at one-second intervals, when the standard motion that the worker should follow is performed.
　<Configuration of the storage unit>
　The storage unit 14 stores at least the reference motion information 14c quantitatively indicating the reference motion that serves as a basis of comparison for the worker's motions. In the motion analysis system 100 according to the present embodiment, the storage unit 14 stores the moving image history 14a, the motion information history 14b, the reference motion information 14c, and the correspondence table 14d. Of these, the correspondence table 14d is used to estimate the position in the work area R where the motion indicated by the motion information extracted by the extraction unit 15 was performed. The correspondence table 14d will be described later with a specific example.
　<Configuration of the extraction unit>
　The extraction unit 15 compares the motion information measured by the measurement unit 30 with the reference motion information 14c and extracts motion information that satisfies a predetermined condition. Here, the motion information and the reference motion information 14c may each include the coordinate values of the worker's joints, and the extraction unit 15 may compare the coordinate values included in the motion information with the coordinate values included in the reference motion information 14c to extract motion information that satisfies the predetermined condition. In this case, the predetermined condition may be that the degree of deviation between the coordinate values included in the motion information and the coordinate values included in the reference motion information 14c is equal to or greater than a threshold.
　The extraction unit 15 may also compare the motion information measured by the measurement unit 30 with reference motion information 14c defined for each of a plurality of processes, and extract motion information that satisfies the predetermined condition; in this case, each piece of extracted motion information corresponds to one process. Similarly, the extraction unit 15 may compare the measured motion information with reference motion information 14c defined for each of a plurality of element motions, in which case each piece of extracted motion information corresponds to one element motion.
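A per-process comparison of this kind can be sketched as follows. The process names, reference series, thresholds, and the use of the maximum pointwise Euclidean distance as the deviation measure are all illustrative assumptions, not values taken from the specification.

```python
import math

# Hypothetical per-process reference motion information: each process
# has its own reference coordinate series and its own threshold.
PROCESS_REFERENCES = {
    "fastening": {"reference": [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)],
                  "threshold": 0.5},
    "soldering": {"reference": [(1.0, 0.0, 0.0), (1.0, 0.1, 0.0)],
                  "threshold": 0.3},
}

def deviates(process, measured):
    """Return True if the measured series for the given process
    deviates from that process's reference by at least the process's
    threshold (maximum pointwise Euclidean distance)."""
    entry = PROCESS_REFERENCES[process]
    worst = max(math.dist(m, r)
                for m, r in zip(measured, entry["reference"]))
    return worst >= entry["threshold"]
```

Defining the reference and threshold per process lets the same extraction logic flag a deviating fastening motion and a deviating soldering motion under different tolerances.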
　<Configuration of the identification unit>
　The identification unit 16 uses the moving image history 14a to identify the scenes in which the worker is performing the motion indicated by the extracted motion information. The identification unit 16 may compare the time at which the motion information was measured with the time at which the moving images were captured, and identify, in the moving image history 14a, the scene of the time period during which the motion indicated by the extracted motion information was being performed. The identification unit 16 may also identify the worker whose motion information was extracted, based on the worker-identifying information added to the extracted motion information. Furthermore, the identification unit 16 may identify the process indicated by the motion information extracted by the extraction unit 15, or the element motion indicated by that motion information.
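The timestamp comparison performed by the identification unit can be sketched as follows, assuming that each moving image is stored together with the wall-clock times at which its recording started and ended; the function and its interface are hypothetical.

```python
from datetime import datetime  # timestamps are wall-clock datetimes

def locate_scene(video_start, video_end, motion_start, motion_end):
    """Return the in-video offsets (seconds from the start of the
    recording) of the period during which the extracted motion was
    measured, or None if that period falls outside the recording."""
    overlap_start = max(video_start, motion_start)
    overlap_end = min(video_end, motion_end)
    if overlap_start >= overlap_end:
        return None
    return ((overlap_start - video_start).total_seconds(),
            (overlap_end - video_start).total_seconds())
```

Running this against every stored moving image yields, for each one, either the scene to display or None, which also tells the system which imaging units did not record the motion at all.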
　<Configuration of the estimation unit>
　The estimation unit 17 estimates the position in the work area R where the motion indicated by the motion information extracted by the extraction unit 15 was performed. The estimation unit 17 may estimate the position in the work area R corresponding to the extracted motion information by referring to the correspondence table 14d stored in the storage unit 14. In the motion analysis system 100 according to the present embodiment, the storage unit 14 may store a first correspondence table D3 indicating the correspondence between processes and imaging units, a second correspondence table D4 indicating the correspondence between the coordinate values of the worker's joints and the imaging units, and a third correspondence table D6 indicating the correspondence between the coordinate values of the worker's joints together with element motions and the imaging units. The identification unit 16 may then identify, among the plurality of moving images, the moving image captured at the position estimated by the estimation unit 17.
　The estimation unit 17 may estimate the position in the work area R where the worker is performing the motion indicated by the extracted motion information based on the process indicated by the motion information extracted by the extraction unit 15. In this case, the estimation unit 17 may make the estimate by referring to the first correspondence table D3, which indicates the correspondence between processes and imaging units.
　The estimation unit 17 may instead estimate the position in the work area R where the worker is performing the motion indicated by the extracted motion information based on the coordinate values included in the extracted motion information. In this case, the estimation unit 17 may make the estimate by referring to the second correspondence table D4, which indicates the correspondence between the coordinate values of the worker's joints and the imaging units.
　The estimation unit 17 may also estimate the position in the work area R where the worker is performing the motion indicated by the extracted motion information based on both the coordinate values included in the extracted motion information and the element motion indicated by it. In this case, the estimation unit 17 may make the estimate by referring to the third correspondence table D6, which indicates the correspondence between the coordinate values of the worker's joints together with element motions and the imaging units.
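As an illustration of how such correspondence tables might be consulted, the following sketch assumes simple shapes for D3 and D4: a process-to-camera mapping, and X-coordinate ranges covered by each camera. The table contents, camera names, and region bounds are invented for the example; the specification does not fix the table format.

```python
# Hypothetical contents of the correspondence tables.
D3 = {"assembly": "camera_20a", "inspection": "camera_20c"}
D4 = [
    ((0.0, 2.0), "camera_20a"),   # X range covered by each camera
    ((2.0, 4.0), "camera_20b"),
    ((4.0, 6.0), "camera_20c"),
]

def camera_for_process(process):
    """First correspondence table (D3): estimate the imaging unit
    from the process indicated by the extracted motion information."""
    return D3.get(process)

def camera_for_coordinates(x):
    """Second correspondence table (D4): estimate the imaging unit
    from an X coordinate of the worker's joints."""
    for (lo, hi), camera in D4:
        if lo <= x < hi:
            return camera
    return None
```

A third lookup keyed on both a coordinate region and an element motion would extend `camera_for_coordinates` in the same way, matching the role of table D6.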
　The motion analysis system 100 may also evaluate a worker's fatigue or haste based on how frequently the worker performs non-standard motions. Fatigue and haste may be evaluated, for example, by assessing how far the frequency of non-standard motions detected for a given worker deviates from the average frequency across all workers. The motion analysis system 100 may then adjust the worker's workload based on the evaluated fatigue or haste. More specifically, the flow speed of the production line may be slowed as the evaluated fatigue or haste increases, thereby reducing the worker's workload and preventing the production of defective products.
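The deviation-from-average evaluation described here can be sketched as follows; the score definition (the worker's count minus the mean count across all workers) is an illustrative assumption, since the specification does not fix a formula.

```python
def evaluate_fatigue(counts, worker_id):
    """Hypothetical fatigue/haste score: how far this worker's count
    of detected non-standard motions deviates from the all-worker
    average.

    counts: {worker_id: number of non-standard motions detected}.
    Returns the worker's count minus the mean count; larger values
    suggest greater fatigue or haste.
    """
    mean = sum(counts.values()) / len(counts)
    return counts[worker_id] - mean
```

A production-line controller could then, for example, slow the line when the score for any worker exceeds some tolerance.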
　[Hardware configuration]
　Next, an example of the hardware configuration of the motion analysis device 10 according to the present embodiment will be described with reference to FIG. 3. The motion analysis device 10 includes a CPU (Central Processing Unit) 10a corresponding to an arithmetic device, a RAM (Random Access Memory) 10b corresponding to the storage unit 14, a ROM (Read Only Memory) 10c corresponding to the storage unit 14, a communication unit 10d, an input unit 10e, and the display unit 10f. These components are connected to one another via a bus so that data can be exchanged among them. Although this example describes a case in which the motion analysis device 10 is configured as a single computer, the motion analysis device 10 may also be realized using a plurality of computers.
　The CPU 10a is a control unit that controls the execution of the programs stored in the RAM 10b or the ROM 10c and performs calculation and processing of data. The CPU 10a is the arithmetic device that executes the program (the motion analysis program) that compares the motion information with the reference motion information, extracts motion information satisfying the predetermined condition, and identifies the parts of the moving images in which the motion indicated by the extracted motion information is being performed. The CPU 10a receives various input data from the input unit 10e and the communication unit 10d, and displays the calculation results on the display unit 10f or stores them in the RAM 10b or the ROM 10c.
　The RAM 10b is the rewritable part of the storage unit 14 and may be configured with, for example, semiconductor memory elements. The RAM 10b stores the motion analysis program executed by the CPU 10a and data such as the moving image history 14a, the motion information history 14b, the reference motion information 14c, and the correspondence table 14d.
　The ROM 10c is the readable part of the storage unit 14 and may be configured with, for example, semiconductor memory elements. The ROM 10c stores, for example, the motion analysis program and data that is not rewritten.
　The communication unit 10d is an interface that connects the motion analysis device 10 to external devices. The communication unit 10d may be connected to the first imaging unit 20a, the second imaging unit 20b, the third imaging unit 20c, and the measurement unit 30 via, for example, a LAN (Local Area Network), receiving moving images from the three imaging units and moving images and motion information from the measurement unit 30. The communication unit 10d may also be connected to the Internet and receive moving images or motion information via the Internet. Furthermore, the communication unit 10d may transmit, via the Internet to an external device, the parts of the captured moving images in which the motion indicated by the motion information extracted by the extraction unit 15 is being performed.
　The input unit 10e accepts data input from the user and may include, for example, a keyboard, a mouse, and a touch panel.
　The display unit 10f visually displays the calculation results of the CPU 10a and may be configured with, for example, an LCD (Liquid Crystal Display). Examples of screens displayed on the display unit 10f will be described in detail later.
　The motion analysis program may be provided stored on a computer-readable storage medium such as the RAM 10b or the ROM 10c, or may be provided via a communication network connected through the communication unit 10d. In the motion analysis device 10, the various operations described with reference to FIG. 2 are realized by the CPU 10a executing the motion analysis program. These physical configurations are examples and need not be independent components; for example, the motion analysis device 10 may include an LSI (Large-Scale Integration) chip in which the CPU 10a, the RAM 10b, and the ROM 10c are integrated.
 §3 Operation Example
 FIG. 4 is a diagram showing an example of the motion information D1 measured by the motion analysis system 100 according to the present embodiment. The figure shows an example in which the measurement unit 30 has measured the coordinate values of the joints of a plurality of workers at one-second intervals.
 The motion information D1 of this example shows, as examples, items such as "Time", "Worker ID", "Right hand (X coordinate)", "Right hand (Y coordinate)", "Right hand (Z coordinate)", and "Left hand (X coordinate)". Here, "Time" indicates the date and time at which the motion information was measured by the measurement unit 30. For example, the second and sixth rows from the top show "2017/9/1 8:43:21", indicating motion information measured at 8:43:21 a.m. on September 1, 2017. Similarly, the third row from the top shows "2017/9/1 8:43:22", indicating motion information measured at 8:43:22 a.m. on September 1, 2017, and the fourth row from the top shows "2017/9/1 8:43:23", indicating motion information measured at 8:43:23 a.m. on September 1, 2017. The "..." entries in the fifth and seventh rows are ellipses indicating that a plurality of data rows are included.
 "Worker ID" is information identifying the plurality of workers whose motion information was measured by the measurement unit 30. In this example, the worker ID is "A1" for the second through fourth rows from the top and "A2" for the sixth row from the top. That is, the motion information in the second through fourth rows was measured for the first worker A1, and the motion information in the sixth row was measured for the second worker A2.
 "Right hand (X coordinate)" indicates the position of the worker's right hand on the X axis of a coordinate system whose origin is the measurement unit 30. The unit is arbitrary; in this example it is mm (millimeters). In this example, the X coordinate value of the right hand of the first worker A1 is "463" at 8:43:21, "533" at 8:43:22, and "483" at 8:43:23. From this, it can be read that the first worker moved the right hand 7 cm in the X-axis direction. For the second worker A2, the X coordinate value of the right hand is "416" at 8:43:21.
 "Right hand (Y coordinate)" indicates the position of the worker's right hand on the Y axis of the coordinate system whose origin is the measurement unit 30. The unit is arbitrary; in this example it is mm (millimeters). In this example, the Y coordinate value of the right hand of the first worker A1 is "574" at 8:43:21, "977" at 8:43:22, and "830" at 8:43:23. From this, it can be read that the first worker moved the right hand a large distance of about 40 cm in the Y-axis direction and then moved it back about 15 cm. For the second worker A2, the Y coordinate value of the right hand is "965" at 8:43:21.
 "Right hand (Z coordinate)" indicates the position of the worker's right hand on the Z axis of the coordinate system whose origin is the measurement unit 30. The unit is arbitrary; in this example it is mm (millimeters). In this example, the Z coordinate value of the right hand of the first worker A1 is "531" at 8:43:21, "341" at 8:43:22, and "624" at 8:43:23. From this, it can be read that the first worker moved the right hand about 19 cm in the negative Z direction and then about 28 cm in the positive Z direction. For the second worker A2, the Z coordinate value of the right hand is "408" at 8:43:21.
 "Left hand (X coordinate)" indicates the position of the worker's left hand on the X axis of the coordinate system whose origin is the measurement unit 30. The unit is arbitrary; in this example it is mm (millimeters). In this example, the X coordinate value of the left hand of the first worker A1 is "327" at 8:43:21, "806" at 8:43:22, and "652" at 8:43:23. From this, it can be read that the first worker moved the left hand about 48 cm in the X-axis direction and then moved it back about 15 cm. For the second worker A2, the X coordinate value of the left hand is "374" at 8:43:21.
 In this example, the coordinate values of the other joints included in the motion information D1 are omitted, but the motion information D1 may include three-dimensional coordinate values of joints such as the wrists, elbows, shoulders, head, waist, knees, and ankles. Also, while the motion information D1 of this example includes motion information for two workers, it may include motion information for three or more workers, or motion information for each of a plurality of workers measured by a plurality of measurement units 30.
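 As a rough illustration, the kind of table shown in FIG. 4 can be modeled in memory as follows. This is a minimal sketch, not part of the patented system; the field names and the helper `rows_for_worker` are illustrative assumptions, and only the coordinate values are taken from FIG. 4.

```python
# A minimal in-memory model of the motion information D1 shown in FIG. 4.
# Each row holds the measurement time, the worker ID, and joint coordinates
# in mm, in a coordinate system whose origin is the measurement unit 30.
motion_info_d1 = [
    {"time": "2017/9/1 8:43:21", "worker_id": "A1",
     "right_hand": (463, 574, 531), "left_hand_x": 327},
    {"time": "2017/9/1 8:43:22", "worker_id": "A1",
     "right_hand": (533, 977, 341), "left_hand_x": 806},
    {"time": "2017/9/1 8:43:23", "worker_id": "A1",
     "right_hand": (483, 830, 624), "left_hand_x": 652},
    {"time": "2017/9/1 8:43:21", "worker_id": "A2",
     "right_hand": (416, 965, 408), "left_hand_x": 374},
]

def rows_for_worker(rows, worker_id):
    """Select the motion-information rows measured for a single worker."""
    return [r for r in rows if r["worker_id"] == worker_id]
```

 A real system would of course stream such rows from the measurement unit 30 rather than hold literal values.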
 FIG. 5 is a diagram showing an example of the first reference motion information D2 stored by the motion analysis system 100 according to the present embodiment. The figure shows reference motion information defined for each of a plurality of processes, in which the joint coordinate values of a worker performing the standard motion are recorded at one-second intervals.
 The first reference motion information D2 of this example shows, as examples, items such as "Elapsed time from start", "Process", "Right hand (X coordinate)", "Right hand (Y coordinate)", "Right hand (Z coordinate)", and "Left hand (X coordinate)". Here, "Elapsed time from start" indicates, in seconds, the time elapsed from the start to the end of a single process. For example, the second row from the top shows "00:00:00", indicating that the row contains the reference motion information at the start of a process, and the third row from the top shows "00:00:01", indicating the reference motion information one second after the start of that process. The fifth row from the top shows "00:00:00", indicating the reference motion information at the start of a different process, and the sixth row from the top shows "00:00:01", indicating the reference motion information one second after the start of that process. The "..." entries in the fourth and seventh rows are ellipses indicating that a plurality of data rows are included.
 "Process" is information indicating which process the reference motion information relates to. For example, the second and third rows from the top show "1. Assembly", indicating that the reference motion information in those rows relates to the assembly process. Similarly, the fifth and sixth rows from the top show "5. Packing", indicating that the reference motion information in those rows relates to the packing process.
 "Right hand (X coordinate)" indicates the position of the worker's right hand on the X axis of the coordinate system whose origin is the measurement unit 30. The unit is arbitrary; in this example it is mm (millimeters). In this example, the standard X coordinate value of the right hand for the assembly process is "463" at the start and "533" one second later. For the packing process, the standard X coordinate value of the right hand is "416" at the start and "796" one second later.
 "Right hand (Y coordinate)" indicates the position of the worker's right hand on the Y axis of the coordinate system whose origin is the measurement unit 30. The unit is arbitrary; in this example it is mm (millimeters). In this example, the standard Y coordinate value of the right hand for the assembly process is "574" at the start and "977" one second later. For the packing process, the standard Y coordinate value of the right hand is "965" at the start and "595" one second later.
 "Right hand (Z coordinate)" indicates the position of the worker's right hand on the Z axis of the coordinate system whose origin is the measurement unit 30. The unit is arbitrary; in this example it is mm (millimeters). In this example, the standard Z coordinate value of the right hand for the assembly process is "531" at the start and "341" one second later. For the packing process, the standard Z coordinate value of the right hand is "408" at the start and "949" one second later.
 "Left hand (X coordinate)" indicates the position of the worker's left hand on the X axis of the coordinate system whose origin is the measurement unit 30. The unit is arbitrary; in this example it is mm (millimeters). In this example, the standard X coordinate value of the left hand for the assembly process is "327" at the start and "806" one second later. For the packing process, the standard X coordinate value of the left hand is "374" at the start and "549" one second later.
 In this example, the coordinate values of the other joints included in the first reference motion information D2 are omitted, but the first reference motion information D2 may include three-dimensional coordinate values of joints such as the wrists, elbows, shoulders, head, waist, knees, and ankles, as well as representative positions on the body. Also, while the first reference motion information D2 of this example includes reference motion information for two processes, it may include reference motion information for three or more processes.
 FIG. 6 is a diagram showing an example of the first correspondence table D3 stored by the motion analysis system 100 according to the present embodiment. The figure shows an example in which the correspondences between a plurality of processes and a plurality of imaging units are recorded.
 The first correspondence table D3 of this example shows items such as "Process" and "Imaging unit" as examples. In the first correspondence table D3, the correspondences between the plurality of processes and the plurality of imaging units are shown one-to-one. For example, it is shown that the imaging unit corresponding to the process "1. Assembly" is the "first imaging unit" (first imaging unit 20a), and that the imaging unit corresponding to the process "5. Packing" is the "third imaging unit" (third imaging unit 20c).
 Note that the first correspondence table D3 may instead show the correspondences between the plurality of processes and the plurality of imaging units one-to-many. For example, two or more imaging units may correspond to the assembly process; in that case, two or more imaging units that capture the assembly process from different angles and different distances may be listed.
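 The correspondence table D3 can be pictured as a simple process-to-cameras mapping. The following is a hedged sketch only; the dictionary keys and the helper `imaging_units_for` are assumptions for illustration, and a list of units per process accommodates the one-to-many case just described.

```python
# A minimal model of the first correspondence table D3 shown in FIG. 6.
# Each process maps to a list of imaging units, so both one-to-one and
# one-to-many correspondences can be represented.
correspondence_d3 = {
    "1. Assembly": ["first imaging unit 20a"],
    "5. Packing": ["third imaging unit 20c"],
}

def imaging_units_for(process):
    """Look up which imaging unit(s) capture the given process."""
    return correspondence_d3.get(process, [])
```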
 FIG. 7 is a flowchart of a first example of the motion-information extraction process executed by the motion analysis system 100 according to the present embodiment. The first example of the extraction process compares the motion information D1 with the first reference motion information D2 and extracts motion information that satisfies a predetermined condition.
 The motion analysis system 100 first calculates the difference between the coordinate values of each row included in the motion information D1 and the start coordinate values included in the reference motion information of a predetermined process in the first reference motion information D2 (S10). Here, the start coordinate values are the coordinate values of each joint for which the "Elapsed time from start" is "00:00:00".
 Next, the motion analysis system 100 identifies a row of the motion information D1 for which the difference in coordinate values is less than or equal to a threshold as the start row of the predetermined process (S11). Whether the difference in coordinate values is less than or equal to the threshold may be judged by whether the absolute value of the difference is less than or equal to the threshold. This processing is necessary because it is not known in advance which process the coordinate values of each row of the motion information D1 correspond to.
 The motion analysis system 100 then calculates the difference between the coordinate values of each row included in the motion information D1 and the end coordinate values included in the reference motion information of the predetermined process in the first reference motion information D2 (S12). Here, the end coordinate values are the coordinate values of each joint shown in the last row for the predetermined process. The motion analysis system 100 identifies a row of the motion information for which the difference in coordinate values is less than or equal to a threshold as the end row of the predetermined process (S13). Again, whether the difference in coordinate values is less than or equal to the threshold may be judged by whether the absolute value of the difference is less than or equal to the threshold.
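 Steps S10 through S13 amount to scanning the motion rows for the first row that lies within a threshold of the reference start (or end) coordinates. The following is an illustrative sketch under simplifying assumptions, not the patented implementation: each row is reduced to a single (x, y, z) triple, and the threshold test uses the per-axis absolute difference as described above.

```python
def find_matching_row(rows, reference, threshold):
    """Return the index of the first row whose coordinates all lie within
    `threshold` (absolute difference per axis) of the reference
    coordinates, or None if no row matches."""
    for i, coords in enumerate(rows):
        if all(abs(c - r) <= threshold for c, r in zip(coords, reference)):
            return i
    return None

# Simplified motion rows: one (x, y, z) triple per time step.
rows = [(100, 100, 100), (460, 570, 530), (530, 980, 340), (480, 830, 620)]
start_row = find_matching_row(rows, (463, 574, 531), threshold=10)  # S10/S11
end_row = find_matching_row(rows, (483, 830, 624), threshold=10)    # S12/S13
```

 In practice the comparison would run over every tracked joint, not a single triple.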
 The motion analysis system 100 cuts out the rows from the identified start row to the identified end row among the plurality of rows included in the motion information D1 (S14). The number of rows cut out does not necessarily match the number of rows of the reference motion information of the predetermined process in the first reference motion information D2. The motion analysis system 100 therefore interpolates or thins the motion information cut out from the motion information D1 so that its number of rows matches the number of rows of the reference motion information of the predetermined process in the first reference motion information D2 (S15). Naturally, interpolation may be performed when the number of rows cut out from the motion information D1 is smaller than the number of rows of the reference motion information of the predetermined process, and thinning may be performed when it is larger. The method of interpolation or thinning is arbitrary; for example, interpolation may use the average of the preceding and following rows, and thinning may remove rows in which the change in coordinate values is comparatively small.
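 Since the interpolation/thinning method is left open above, one simple way to realize step S15 is to resample the cut-out rows to the reference row count with linear interpolation between neighboring rows. The function below is a sketch under that assumption, with each row again simplified to one coordinate tuple; it is one of many admissible schemes.

```python
def resample(rows, target_len):
    """Interpolate or thin a sequence of coordinate rows so that its
    length matches the row count of the reference motion information
    (step S15), using linear interpolation between neighboring rows."""
    if target_len == 1:
        return [rows[0]]
    out = []
    step = (len(rows) - 1) / (target_len - 1)
    for i in range(target_len):
        pos = i * step          # fractional position in the source rows
        lo = int(pos)
        hi = min(lo + 1, len(rows) - 1)
        frac = pos - lo
        out.append(tuple(a + (b - a) * frac
                         for a, b in zip(rows[lo], rows[hi])))
    return out
```

 The same call handles both cases: a short sequence is stretched (interpolation) and a long one is compressed (thinning).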
 Next, the motion analysis system 100 calculates the degree of divergence between the coordinate values of the interpolated or thinned motion information and the coordinate values of the reference motion information of the predetermined process (S16). The degree of divergence is a numerical value indicating how far the worker's motion deviates from the standard motion the worker should follow. It may be calculated per joint, or per unit combining a plurality of joints (such as the right hand). For example, the per-joint degree of divergence may be calculated by taking, for the three-dimensional coordinate values of a joint at a given time, the coordinate values (x, y, z) of the motion information and the coordinate values (X, Y, Z) of the reference motion information of the predetermined process, computing the value ((x-X)^2 + (y-Y)^2 + (z-Z)^2)^(1/2), and summing this value over all times. The degree of divergence may, of course, be calculated using other measures.
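 The per-joint divergence of S16, the Euclidean distance ((x-X)^2 + (y-Y)^2 + (z-Z)^2)^(1/2) summed over all time steps, can be written directly; the function name is an illustrative assumption.

```python
import math

def joint_divergence(motion_rows, reference_rows):
    """Degree of divergence for one joint (step S16): the Euclidean
    distance between the measured coordinates (x, y, z) and the
    reference coordinates (X, Y, Z), summed over all time steps."""
    return sum(
        math.dist(m, r)  # ((x-X)**2 + (y-Y)**2 + (z-Z)**2) ** 0.5
        for m, r in zip(motion_rows, reference_rows)
    )
```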
 If the degree of divergence is greater than or equal to a threshold (S17: YES), the motion analysis system 100 extracts the motion information from the identified start row to the identified end row (S18). The extracted motion information may be the motion information in its state before interpolation or thinning. When the degree of divergence is calculated per joint, the threshold may also be set per joint; the motion information may then be extracted when any one of the per-joint degrees of divergence is greater than or equal to its threshold, or when the degrees of divergence calculated for two or more joints are greater than or equal to their thresholds.
 After extracting the motion information, the system displays the portion of the moving image in which the motion indicated by the extracted motion information is being performed (S19). The moving-image display process is described in detail with reference to the next figure. Finally, the system judges whether analysis has been completed for all processes included in the first reference motion information D2 (S20); when analysis has been completed for all processes (S20: YES), the first example of the motion-information extraction process ends.
 In this example, the start row of the predetermined process is identified by calculating the difference between the coordinate values of each row included in the motion information D1 and the start coordinate values included in the reference motion information of the predetermined process in the first reference motion information D2, and the end row is identified by calculating the difference between the coordinate values of each row and the corresponding end coordinate values, but the motion analysis system 100 may cut out motion information by other methods. For example, when the movement range of a worker is limited for each process, such as when a workbench is assigned to each process, the motion analysis system 100 may cut out a section of the motion information in which the coordinate values representing the center of the worker's body fall within a predetermined range as the section of motion information corresponding to the predetermined process.
 FIG. 8 is a flowchart of a first example of the moving-image display process executed by the motion analysis system 100 according to the present embodiment. The first example of the moving-image display process is an example of the process (S19) of displaying, after the motion information has been extracted in the first example of the motion-information extraction process, the portion of the moving image in which the motion indicated by the extracted motion information is being performed.
 First, the motion analysis system 100 refers to the first correspondence table D3 and estimates the position at which the process indicated by the extracted motion information is being performed (S190). For example, when the process indicated by the extracted motion information is the packing process, the system identifies "5. Packing" in the "Process" column of the first correspondence table D3 and estimates the position at which the process was performed from the corresponding imaging unit, the "third imaging unit 20c". The position may be estimated, for example, on the basis of the correspondence between the plurality of imaging units and their positions in the work area R where they are installed, or identifying the imaging unit may itself serve as the position estimate.
 Next, for the first moving image, which is a moving image of the entire work area R captured by the measurement unit 30 (fourth imaging unit), and the second moving image, which was captured at the estimated position and covers an area narrower than the work area R, the motion analysis system 100 displays the portions corresponding to the time period during which the motion indicated by the extracted motion information was being performed (S191). These portions of the first and second moving images are identified by cutting out the first and second moving images on the basis of the times of the start row and end row of the extracted motion information. For example, when the process indicated by the extracted motion information is the packing process, the motion analysis system 100 cuts out, on the basis of the times of the start row and end row of the extracted motion information, the first moving image of the entire work area R captured by the measurement unit 30 (fourth imaging unit) and the second moving image of an area narrower than the work area R captured by the third imaging unit 20c identified using the first correspondence table D3, and displays them on the display unit 10f.
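 The cut-out in S191 reduces to selecting the frames of a moving image whose timestamps fall between the start-row time and the end-row time of the extracted motion information. The sketch below assumes plain numeric timestamps in seconds for simplicity; a real system would work with actual video timecodes.

```python
def clip_indices(frame_times, start_time, end_time):
    """Return the indices of video frames falling within the time window
    in which the extracted motion information indicates the motion was
    being performed (the basis of the cut-out in S191)."""
    return [i for i, t in enumerate(frame_times)
            if start_time <= t <= end_time]

# Frame timestamps in seconds; the extracted motion spans t = 2..4.
frame_times = [0, 1, 2, 3, 4, 5, 6]
```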
 Furthermore, the motion analysis system 100 displays the process indicated by the extracted motion information (S192) and displays the identification information of the worker for whom the extracted motion information was measured (S193). For example, when the process indicated by the extracted motion information is the packing process and the worker who performed the process is the first worker A1, the motion analysis system 100 displays on the display unit 10f, together with the cut-out first and second moving images, the fact that the content of the moving images is the packing process and that the worker performing the process is the first worker A1. The process may be displayed using information that can identify it among the plurality of processes; for example, in this example a serial number is assigned to each process, and the number "5" may be displayed to represent the packing process. Likewise, the worker may be displayed using information that can identify him or her among the plurality of workers; for example, in this example worker IDs are assigned to the plurality of workers, and the ID "A1" may be displayed to represent the first worker A1. This concludes the first example of the moving-image display process.
 FIG. 9 is an example of a screen DP displayed by the motion analysis system 100 according to the present embodiment. The screen DP is an example of displaying (S191), in the first example of the moving-image display process, the scenes of the time period during which the worker was performing the motion indicated by the extracted motion information, for the first moving image and the second moving image captured at the estimated position.
 The screen DP includes a summary DP1, an overall moving image DP2, and a close-up moving image DP3. The summary DP1 is information showing an overview of the extracted motion information. The summary DP1 shows that the motion indicated by the extracted motion information is a "non-standard motion", that the start time is "2017/7/7 10:10:44.138" (10:10:44.138 a.m. on July 7, 2017), that the end time is "2017/7/7 10:12:44.435" (10:12:44.435 a.m. on July 7, 2017), that the process indicated by the extracted motion information is the "packing process", and that the required time is "5.5 s" (5.5 seconds). The summary DP1 may show, instead of the name of the process indicated by the extracted motion information, information identifying that process, such as an ID like a serial number assigned to each process.
 全体動画DP2は、測定部30(第4撮影部)により作業領域R全体を撮影した第1動画であり、動画画面の右下に開始時刻として「2017/7/7 10:10:44.138」が示されている。全体動画DP2により、複数の作業者が動作を実行している様子を全体的に把握することができる。また、全体動画DP2には、測定部30により検出された複数の作業者の関節の位置を骨格モデルによって示している。これにより、複数の作業者の関節の座標が妥当な位置で測定されていることが確認できる。 The whole moving image DP2 is a first moving image obtained by photographing the entire work area R by the measuring unit 30 (fourth image pickup unit), and it is "2017/7/7 10: 10: 44.138" at the lower right of the moving image screen. "It is shown. The entire moving image DP2 makes it possible to generally understand how a plurality of workers are performing an operation. Further, in the general moving image DP2, the positions of joints of a plurality of workers detected by the measurement unit 30 are indicated by a skeletal model. This makes it possible to confirm that the coordinates of joints of a plurality of workers are measured at appropriate positions.
 手元動画DP3は、本例の場合、第3撮影部20cにより作業領域Rより狭い領域を撮影した第2動画であり、動画画面の右下に開始時刻として「2017/7/7 10:10:44.138」が示されている。また、動画画面の上に、「梱包工程」及び「作業者ID:A1」と表示され、抽出された動作情報が示す工程を識別する情報と、抽出された動作情報が測定された作業者の識別情報とが表示されている。作業者の手元をクローズアップした手元動画DP3により、作業者が実際に行った動作の詳細を確認することができる。 In the case of this example, the hand moving image DP3 is a second moving image obtained by capturing an area narrower than the work area R by the third imaging unit 20c, and as the start time at the lower right of the moving image screen, “2017/7/7 10:10: 44.138 "is shown. In addition, “packaging process” and “worker ID: A1” are displayed on the moving image screen, and information identifying the process indicated by the extracted operation information and the worker whose extracted operation information was measured Identification information is displayed. The details of the operation actually performed by the worker can be confirmed by the hand moving image DP3 in which the worker's hand is closed up.
 According to the motion analysis system 100 of the present embodiment, by extracting, from among the plurality of pieces of motion information each quantitatively representing the motion of one of the plurality of workers, the motion information that satisfies a predetermined condition in comparison with the reference motion information, and by displaying information identifying the worker whose extracted motion information was measured, it is possible to identify which of the plurality of workers performed a specific motion satisfying the predetermined condition. In this example, it can be confirmed that the first worker A1 performed a non-standard motion.
 In addition, by displaying, from the first video capturing the work area R, the scenes of the time period during which the worker performs the motion indicated by the extracted motion information, the specific motion satisfying the predetermined condition can be confirmed as a whole; and by displaying, from the second video capturing an area narrower than the work area R, the scenes of that same time period, the details of the specific motion satisfying the predetermined condition can be confirmed.
 By having the estimation unit 17 estimate the position in the work area R where the motion indicated by the extracted motion information was performed, the position where the specific motion satisfying the predetermined condition was performed can be estimated; and by displaying, from among the plurality of second videos captured at the plurality of positions in the work area, the second video captured at the estimated position, the details of the motion performed at that position can be confirmed.
 Furthermore, by extracting, from the motion information quantitatively representing the worker's motion, the motion information that satisfies a predetermined condition in comparison with the reference motion information defined for each of the plurality of processes, and by displaying information identifying the process indicated by the extracted motion information, it is possible to confirm in which process the specific motion satisfying the predetermined condition was performed. In this example, it can be confirmed that a non-standard motion was performed in the packing process.
 By using the first correspondence table D3 to estimate, on the basis of the process indicated by the extracted motion information, the position in the work area R where the motion indicated by the extracted motion information is being performed, it becomes possible to display, from among the plurality of second videos captured at the plurality of positions in the work area R, the second video captured at the position where the process indicated by the extracted motion information was performed, and thus to confirm the details of the motion performed in that process.
 Moreover, by comparing the joint coordinate values contained in the motion information with the joint coordinate values contained in the reference motion information, it is possible to evaluate accurately how far the motion information deviates from the reference motion information, and thus to extract appropriately the motion information satisfying the predetermined condition.
 FIG. 10 is a diagram showing an example of the second correspondence table D4 stored by the motion analysis system 100 according to the present embodiment. The figure shows an example in which correspondences between coordinate ranges of a plurality of joints and a plurality of imaging units are recorded.
 The second correspondence table D4 of this example shows, by way of illustration, the items "right-hand range (X coordinate)", "right-hand range (Y coordinate)", and "imaging unit". In the second correspondence table D4, the correspondence between ranges of three-dimensional joint coordinates and imaging units is one-to-one. For example, when the "right-hand range (X coordinate)" is "600 to 700" and the "right-hand range (Y coordinate)" is "200 to 300", the corresponding imaging unit is shown to be the "first imaging unit" (first imaging unit 20a). Although this example shows only the X- and Y-coordinate portions of the right-hand coordinate range, each row of the second correspondence table D4 may contain ranges of three-dimensional coordinate values for each joint, such as the wrists, elbows, shoulders, head, hips, knees, and ankles.
 Note that the second correspondence table D4 may instead represent the correspondence between ranges of three-dimensional joint coordinates and imaging units as one-to-many. For example, two or more imaging units may correspond to one range of three-dimensional joint coordinates; in that case, the two or more imaging units may be ones that capture the worker's motion from different angles and different distances.
 FIG. 11 is a flowchart of a second example of the video display process executed by the motion analysis system 100 according to the present embodiment. The second example of the video display process is another example of the step (S19) in the first example of the motion information extraction process in which, after the motion information has been extracted, the portion of the video showing the motion indicated by the extracted motion information is displayed.
 First, the motion analysis system 100 refers to the second correspondence table D4 and searches it for the row that contains the largest number of the coordinate values included in the extracted motion information (S195). For example, when the right-hand X coordinates included in the extracted motion information lie in the range 550 to 650, the "right-hand range (X coordinate)" shown in the second row from the top of the second correspondence table D4 is identified as the row containing the largest number of those coordinate values. The right-hand Y and Z coordinates and the three-dimensional coordinates of the other joints are likewise compared with the coordinate ranges in the second correspondence table D4, and the row with the greatest overlap is found.
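 The row search of S195 can be sketched roughly as follows. This is a hypothetical illustration, not taken from the specification: the table layout (a list of rows, each holding per-joint coordinate ranges and an imaging-unit label) and the helper names `count_matches` and `find_best_row` are assumptions made for the example.

```python
# Hypothetical sketch of the row search (S195). The data layout mimics the
# second correspondence table D4; all names here are illustrative.
def count_matches(row, samples):
    """Count how many sampled coordinate values fall inside the row's ranges."""
    hits = 0
    for sample in samples:
        for joint, value in sample.items():
            if joint in row["ranges"]:
                lo, hi = row["ranges"][joint]
                if lo <= value <= hi:
                    hits += 1
    return hits

def find_best_row(table, samples):
    """Return the row containing the largest number of the coordinate values."""
    return max(table, key=lambda row: count_matches(row, samples))

table = [
    {"unit": "first imaging unit 20a",
     "ranges": {"right_hand_x": (600, 700), "right_hand_y": (200, 300)}},
    {"unit": "second imaging unit 20b",
     "ranges": {"right_hand_x": (100, 200), "right_hand_y": (200, 300)}},
]
# Right-hand coordinates near 600-650 overlap the 600-700 range the most.
samples = [{"right_hand_x": 620, "right_hand_y": 250},
           {"right_hand_x": 640, "right_hand_y": 260}]
best = find_best_row(table, samples)
```

 In a real implementation the comparison would cover the Y and Z coordinates and the remaining joints in the same way; the sketch only shows the counting principle.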
 Next, the motion analysis system 100 refers to the row found by the search and estimates the position at which the motion indicated by the extracted motion information is being performed (S196). For example, when the row found is the second row from the top of the second correspondence table D4, the position where the motion was performed is estimated from the corresponding imaging unit, the "first imaging unit 20a". The position may be estimated, for example, on the basis of correspondences between the imaging units and the positions in the work area R at which they are installed, or identifying the imaging unit may itself be treated as estimating the position.
 The motion analysis system 100 then displays, for the first video, which captures the entire work area R via the measurement unit 30 (fourth imaging unit), and for the second video, which captures an area narrower than the work area R and was taken at the estimated position, the portions of the time period during which the motion indicated by the extracted motion information is being performed (S197). Those portions of the first and second videos are identified by clipping the videos on the basis of the times of the start row and end row of the extracted motion information.
 Furthermore, the motion analysis system 100 displays the identification information of the worker whose extracted motion information was measured (S198). For example, when the worker who performed the motion indicated by the extracted motion information is the first worker A1, the motion analysis system 100 displays on the display unit 10f, together with the clipped first and second videos, an indication that the worker performing the motion is the first worker A1. This concludes the second example of the video display process.
 According to the motion analysis system 100 of the present embodiment, by estimating, on the basis of the joint coordinate values contained in the extracted motion information, the position in the work area at which the worker is performing the motion indicated by the extracted motion information, the position where that motion was performed can be estimated accurately.
 FIG. 12 is a diagram showing an example of the second reference motion information D5 stored by the motion analysis system 100 according to the present embodiment. The figure shows reference motion information defined for each of a plurality of element motions, in which the joint coordinate values obtained when a worker performs the standard motion are recorded at one-second intervals.
 The second reference motion information D5 of this example shows, by way of illustration, the items "elapsed time from start", "element motion", "right hand (X coordinate)", "right hand (Y coordinate)", "right hand (Z coordinate)", and "left hand (X coordinate)". Here, the "elapsed time from start" indicates, in seconds, the time elapsed from the start to the end of one process. For example, the second row from the top shows "00:00:00", indicating that it is the row of reference motion information at the start of an element motion, and the third row from the top shows "00:00:01", indicating reference motion information one second after the start of that element motion. Similarly, the fifth row from the top again shows "00:00:00", indicating the row of reference motion information at the start of a different element motion, and the sixth row from the top shows "00:00:01", indicating reference motion information one second after the start of that element motion. The entries "..." shown in the fourth and seventh rows are ellipses indicating that multiple data rows are included.
 The "element motion" is information indicating which element motion the reference motion information relates to. For example, the second and third rows from the top show "part pick", indicating that the reference motion information in those rows relates to the picking of parts. Similarly, the fifth and sixth rows from the top show "placement", indicating that the reference motion information in those rows relates to the placement of parts.
 The "right hand (X coordinate)" indicates the position of the worker's right-hand joint on the X axis of a coordinate system whose origin is the start point. The unit is arbitrary; in this example it is mm (millimeters). In this example, the standard right-hand X coordinate value is shown to be "0" at the start and "533" one second later for the part-pick element motion, and "0" at the start and "796" one second later for the placement element motion.
 The "right hand (Y coordinate)" indicates the position of the worker's right-hand joint on the Y axis of the coordinate system whose origin is the start point. The unit is arbitrary; in this example it is mm (millimeters). In this example, the standard right-hand Y coordinate value is shown to be "0" at the start and "977" one second later for the part-pick element motion, and "0" at the start and "595" one second later for the placement element motion.
 The "right hand (Z coordinate)" indicates the position of the worker's right-hand joint on the Z axis of the coordinate system whose origin is the start point. The unit is arbitrary; in this example it is mm (millimeters). In this example, the standard right-hand Z coordinate value is shown to be "0" at the start and "341" one second later for the part-pick element motion, and "0" at the start and "949" one second later for the placement element motion.
 The "left hand (X coordinate)" indicates the position of the worker's left-hand joint on the X axis of the coordinate system whose origin is the start point. The unit is arbitrary; in this example it is mm (millimeters). In this example, the standard left-hand X coordinate value is shown to be "0" at the start and "806" one second later for the part-pick element motion, and "0" at the start and "549" one second later for the placement element motion.
 Although this example omits the coordinate values of the other joints contained in the second reference motion information D5, the second reference motion information D5 may contain three-dimensional coordinate values for each joint, such as the wrists, elbows, shoulders, head, hips, knees, and ankles. Furthermore, although in this example the second reference motion information D5 contains reference motion information for two element motions, it may contain reference motion information for three or more element motions.
 FIG. 13 is a flowchart of a second example of the motion information extraction process executed by the motion analysis system 100 according to the present embodiment. The second example of the extraction process compares the motion information D1 with the second reference motion information D5 and extracts the motion information satisfying a predetermined condition.
 First, the motion analysis system 100 calculates the speed of each joint from adjacent coordinate values contained in the motion information D1 (S40). For coordinate values measured at one-second intervals, for example, a joint's speed may be calculated by dividing the difference between the coordinate values of adjacent rows by one second.
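 A minimal sketch of the speed calculation in S40, assuming the motion information rows are three-dimensional joint coordinates sampled at one-second intervals; the function name and data layout are illustrative, not from the specification:

```python
import math

def joint_speeds(coords, dt=1.0):
    """Speed between adjacent samples: Euclidean distance / sampling interval.
    `coords` is a time-ordered list of (x, y, z) values for one joint."""
    speeds = []
    for (x0, y0, z0), (x1, y1, z1) in zip(coords, coords[1:]):
        dist = math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2)
        speeds.append(dist / dt)
    return speeds

# A joint that moves 5 mm in the first second and then stops.
speeds = joint_speeds([(0, 0, 0), (3, 4, 0), (3, 4, 0)])  # [5.0, 0.0]
```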
 Next, the motion analysis system 100 identifies any row in which the speed of some joint falls to or below a threshold as a row at which one element motion transitions to another (S41). For example, when the part-pick element motion transitions to the element motion of placing the part, the speed of the hand joint becomes nearly zero as the worker finishes picking the part and moves on to placing it, so a row in which some joint's speed is at or below the threshold can be identified as a row at which the element motion transitions.
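 The boundary detection of S41 can be sketched as follows; this is a hypothetical illustration (the dictionary layout, the helper name, and the threshold value are assumptions for the example):

```python
def transition_rows(speeds_by_joint, threshold):
    """Indices of rows at which any joint's speed is at or below the
    threshold, treated as element-motion transition rows (S41)."""
    n_rows = min(len(s) for s in speeds_by_joint.values())
    return [i for i in range(n_rows)
            if any(s[i] <= threshold for s in speeds_by_joint.values())]

# The right hand slows to near zero at row 2, e.g. between pick and placement.
speeds = {"right_hand": [5.0, 4.2, 0.1, 3.8],
          "left_hand": [2.0, 2.1, 1.9, 2.2]}
boundaries = transition_rows(speeds, threshold=0.5)  # [2]
```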
 After identifying all rows of the motion information D1 at which the element motion transitions, the motion analysis system 100 clips one element motion, from its start row to its end row, out of the motion information D1 (S42). The number of rows clipped does not necessarily match the number of rows of the reference motion information for the corresponding element motion in the second reference motion information D5. The motion analysis system 100 therefore interpolates or thins the clipped motion information so that its number of rows matches the number of rows of the reference motion information for that element motion in the second reference motion information D5 (S43).
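 One way the interpolation or thinning of S43 could be realized is simple linear resampling of each coordinate column to the reference row count. The specification does not fix a method, so the following is only an assumed sketch:

```python
def resample(values, target_len):
    """Linearly interpolate or thin one coordinate column to target_len rows,
    an assumed realization of the row-count matching in S43."""
    if target_len == 1:
        return [values[0]]
    step = (len(values) - 1) / (target_len - 1)
    out = []
    for i in range(target_len):
        pos = i * step
        lo = int(pos)
        frac = pos - lo
        if lo + 1 < len(values):
            out.append(values[lo] * (1 - frac) + values[lo + 1] * frac)
        else:
            out.append(values[-1])  # clamp at the final sample
    return out

resample([0.0, 10.0, 20.0], 5)        # interpolation to 5 rows
resample([0.0, 5.0, 10.0, 15.0, 20.0], 3)  # thinning to 3 rows
```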
 Next, the motion analysis system 100 calculates the degree of divergence between the coordinate values of the interpolated or thinned motion information and the coordinate values of the reference motion information for the element motion in question (S44). The degree of divergence may be calculated for each joint. For the three-dimensional joint coordinates at a given time, it may be calculated by taking the differences between the coordinate values (x, y, z) of the motion information and the coordinate values (X, Y, Z) of the reference motion information, squaring them and taking the square root to obtain ((x−X)² + (y−Y)² + (z−Z)²)^(1/2), and summing this value over all times. The degree of divergence may, however, be calculated from other quantities.
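 The divergence calculation of S44, per joint, translates directly into code; the function name and data layout below are illustrative:

```python
import math

def divergence(motion, reference):
    """Per-joint degree of divergence (S44): the per-sample Euclidean distance
    ((x - X)**2 + (y - Y)**2 + (z - Z)**2) ** 0.5, summed over all times."""
    total = 0.0
    for (x, y, z), (X, Y, Z) in zip(motion, reference):
        total += math.sqrt((x - X) ** 2 + (y - Y) ** 2 + (z - Z) ** 2)
    return total

# A motion that matches the standard exactly diverges by zero.
standard = [(0, 0, 0), (533, 977, 341)]
divergence(standard, standard)  # 0.0
```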
 When the degree of divergence is equal to or greater than a threshold (S45: YES), the motion analysis system 100 extracts the motion information from the identified start row to the identified end row (S46). The extracted motion information may be the motion information in its state before interpolation or thinning. When the degree of divergence is calculated for each joint, a threshold may likewise be set for each joint; the motion information may then be extracted when any one of the per-joint divergence degrees is at or above its threshold, or when the divergence degrees calculated for two or more joints are at or above their thresholds.
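 The per-joint threshold decision of S45 can be sketched as a small predicate; the names and the `min_joints` parameter (covering both the "any one joint" and the "two or more joints" variants described above) are assumptions for the example:

```python
def should_extract(divergences, thresholds, min_joints=1):
    """Extraction decision (S45): extract when at least min_joints joints
    have a divergence at or above their per-joint threshold."""
    over = sum(1 for joint, d in divergences.items() if d >= thresholds[joint])
    return over >= min_joints

divs = {"right_hand": 120.0, "left_hand": 8.0}
ths = {"right_hand": 100.0, "left_hand": 50.0}
should_extract(divs, ths)                # True: one joint is over threshold
should_extract(divs, ths, min_joints=2)  # False: only one joint is over
```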
 After extracting the motion information, the motion analysis system 100 displays the portion of the video showing the motion indicated by the extracted motion information (S47). This video display process is described in detail with reference to the next figure. Finally, the motion analysis system 100 determines whether the analysis has been completed for all element motions contained in the second reference motion information D5 (S48); when it has (S48: YES), the second example of the motion information extraction process ends.
 FIG. 14 is a flowchart of a third example of the video display process executed by the motion analysis system 100 according to the present embodiment. The third example of the video display process is an example of the step (S47) in the second example of the motion information extraction process in which, after the motion information has been extracted, the portion of the video showing the motion indicated by the extracted motion information is displayed.
 First, the motion analysis system 100 refers to the second correspondence table D4 and searches it for the row that contains the largest number of the coordinate values included in the extracted motion information (S470). For example, when the right-hand X coordinates included in the extracted motion information lie in the range 550 to 650, the "right-hand range (X coordinate)" shown in the second row from the top of the second correspondence table D4 is identified as the row containing the largest number of those coordinate values. The right-hand Y and Z coordinates and the three-dimensional coordinates of the other joints are likewise compared with the coordinate ranges in the second correspondence table D4, and the row with the greatest overlap is found.
 Next, the motion analysis system 100 refers to the row found by the search and estimates the position at which the motion indicated by the extracted motion information is being performed (S471). For example, when the row found is the second row from the top of the second correspondence table D4, the position where the motion was performed is estimated from the corresponding imaging unit, the "first imaging unit 20a". The position may be estimated, for example, on the basis of correspondences between the imaging units and the positions in the work area R at which they are installed, or identifying the imaging unit may itself be treated as estimating the position.
 The motion analysis system 100 then displays, for the first video, which captures the entire work area R via the measurement unit 30 (fourth imaging unit), and for the second video, which captures an area narrower than the work area R and was taken at the estimated position, the portions of the time period during which the motion indicated by the extracted motion information is being performed (S472). Those portions of the first and second videos are identified by clipping the videos on the basis of the times of the start row and end row of the extracted motion information.
 Furthermore, the motion analysis system 100 displays the element motion indicated by the extracted motion information (S473), and displays the identification information of the worker whose extracted motion information was measured (S474). For example, when the element motion indicated by the extracted motion information is the part pick and the worker who performed that element motion is the first worker A1, the motion analysis system 100 displays on the display unit 10f, together with the clipped first and second videos, an indication that the content of the videos is the part pick and that the worker performing the element motion is the first worker A1. Note that the element motion may be displayed using any information capable of distinguishing the plurality of element motions, and the worker may be displayed using any information capable of distinguishing the plurality of workers. This concludes the third example of the video display process.
 本実施形態に係る動作分析システム100によれば、抽出された動作情報に含まれる関節の座標値及び抽出された動作情報が示す要素動作に基づいて、抽出された動作情報が示す動作を作業者が実行している作業領域における位置を推定することで、作業領域における複数の位置で撮影された複数の第2動画のうち、抽出された動作情報が示す要素動作が行われた位置で撮影された第2動画を表示することができ、当該要素動作の詳細を確認することができる。また、基準動作情報を複数の要素動作毎に定めることで、作業者が実行している要素動作に抜け、漏れが生じていないかを容易に確認することができ、作業者が適切な動作を行っているか確認する負担を軽減することができる。 According to the motion analysis system 100 according to the present embodiment, the worker is caused to exhibit the motion indicated by the extracted motion information based on the coordinate value of the joint included in the extracted motion information and the element motion indicated by the extracted motion information. Is estimated at the position where the element motion indicated by the extracted motion information is performed among the plurality of second moving images shot at the plurality of positions in the work area by estimating the position in the work area being executed by The second moving image can be displayed, and details of the element operation can be confirmed. In addition, by defining the reference operation information for each of a plurality of element operations, it is possible to easily check whether there is a leak occurring in the element operation being executed by the operator, and the operator can perform an appropriate operation. It is possible to reduce the burden of checking whether you are going.
FIG. 15 shows an example of the third correspondence table D6 stored by the motion analysis system 100 according to the present embodiment. The figure shows an example in which the correspondences among coordinate ranges of a plurality of joints, a plurality of element motions, and a plurality of imaging units are recorded.
The third correspondence table D6 of this example shows, as an example, the items "right hand range (X coordinate)", "right hand range (Y coordinate)", "element motion", "imaging unit", and "remarks". The third correspondence table D6 records one-to-one correspondences between combinations of three-dimensional joint coordinate ranges and element motions on one side and imaging units on the other. For example, when the "right hand range (X coordinate)" is "600 to 700", the "right hand range (Y coordinate)" is "200 to 300", and the element motion is "component pick", the corresponding imaging unit is shown to be the "first imaging unit" (first imaging unit 20a), and the remarks column reads "shot from diagonally above". When the same coordinate ranges are combined with the element motion "placement", the corresponding imaging unit is shown to be the "second imaging unit" (second imaging unit 20b), and the remarks column reads "shot from directly above".
Thus, even when the three-dimensional joint coordinate range is the same, if the appropriate shooting direction or shooting distance differs depending on the element motion, classifying further by element motion makes it possible to identify the moving image in which the details of the worker's motion are easiest to examine. Although this example shows only the X and Y coordinate ranges of the right hand, each row of the third correspondence table D6 may contain the three-dimensional coordinate range of each joint, such as the wrist, elbow, shoulder, head, hip, knee, and ankle.
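The lookup described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the table contents, value ranges, and function names are assumptions, and only the right hand's X and Y coordinates are matched.

```python
# Hypothetical sketch of consulting the third correspondence table D6:
# a joint coordinate range plus an element motion selects an imaging unit.
# Row values mirror the example in the text but are otherwise illustrative.

D6 = [
    # (x_range, y_range, element_motion, imaging_unit, remarks)
    ((600, 700), (200, 300), "component pick", "first imaging unit",
     "shot from diagonally above"),
    ((600, 700), (200, 300), "placement", "second imaging unit",
     "shot from directly above"),
]

def select_imaging_unit(x, y, element_motion):
    """Return the imaging unit of the row matching the right-hand
    coordinates and the element motion, or None if no row matches."""
    for (x_lo, x_hi), (y_lo, y_hi), motion, unit, _remarks in D6:
        if x_lo <= x <= x_hi and y_lo <= y <= y_hi and motion == element_motion:
            return unit
    return None

print(select_imaging_unit(650, 250, "component pick"))  # first imaging unit
print(select_imaging_unit(650, 250, "placement"))       # second imaging unit
```

Because the coordinate ranges of the two rows are identical, the element motion is what disambiguates them, which is exactly the point made above.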
The third correspondence table D6 may also record the correspondences between combinations of joint coordinate ranges and element motions on one side and imaging units on the other in a one-to-many manner. For example, two or more imaging units may correspond to a given combination of joint coordinate ranges and an element motion; in that case, the two or more imaging units may be ones that shot the worker's motion from different angles and different distances.
FIG. 16 is a flowchart of a fourth example of the moving-image display process executed by the motion analysis system 100 according to the present embodiment. The fourth example is another example of the process (S47) of displaying, after motion information has been extracted in the second example of the motion information extraction process, the portion of the moving image in which the motion indicated by the extracted motion information is being performed.
First, the motion analysis system 100 refers to the third correspondence table D6 and searches for the row containing the largest number of the coordinate values included in the extracted motion information (S475). For example, if the X coordinate of the right hand included in the extracted motion information lies in the range 550 to 650, the "right hand range (X coordinate)" shown in the second and third rows of the third correspondence table D6 is identified as containing the largest number of those coordinate values. The Y and Z coordinates of the right hand and the three-dimensional coordinates of the other joints are likewise compared with the coordinate ranges in the third correspondence table D6 to find the row with the greatest overlap.
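Step S475 can be sketched as a counting problem over the table rows. The data shapes below are assumptions made for illustration; the patent does not specify how the overlap is computed.

```python
# Sketch of S475: among candidate coordinate-range rows, find the ones
# containing the most of the observed coordinate values. Shapes assumed.

def best_rows(rows, xs):
    """rows: list of (lo, hi) coordinate ranges, one per table row.
    xs: the observed coordinate values from the extracted motion info.
    Returns the indices of the rows containing the most values."""
    counts = [sum(lo <= x <= hi for x in xs) for (lo, hi) in rows]
    top = max(counts)
    return [i for i, c in enumerate(counts) if c == top]

rows = [(0, 100), (600, 700), (550, 650)]
xs = [550, 575, 600, 625, 650]        # right-hand X samples in 550-650
print(best_rows(rows, xs))  # [2] -- the 550-650 row contains all five
```

When several rows tie, all are returned, which matches the text above: the element motion is then used to pick one of them.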
Next, the motion analysis system 100 identifies, among the rows found, the row corresponding to the element motion indicated by the extracted motion information (S476). For example, if the second and third rows of the third correspondence table D6 were found as the rows containing the largest number of the coordinate values included in the extracted motion information, and the element motion indicated by the extracted motion information is a component pick, the second row of the third correspondence table D6 is identified as the row corresponding to that element motion.
The motion analysis system 100 then refers to the identified row and estimates the position at which the motion indicated by the extracted motion information was performed (S477). For example, if the identified row is the second row of the third correspondence table D6, the position at which the motion was performed is estimated from the corresponding imaging unit, the "first imaging unit 20a". The position may be estimated, for example, based on the correspondence between the plural imaging units and their installation positions in the work area R, or identifying the imaging unit itself may serve as the position estimate.
The motion analysis system 100 displays, for the first moving image, which is a moving image of the entire work area R shot by the measurement unit 30 (fourth imaging unit), and for the second moving image shot at the estimated position of an area narrower than the work area R, the portions of the time period during which the motion indicated by the extracted motion information was performed (S478). Those portions of the first and second moving images are identified by cutting out the moving images based on the times of the start row and end row of the extracted motion information.
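The cutting-out step can be sketched as a filter over timestamped frames. This is a minimal sketch under assumed data shapes (frames as timestamp/frame pairs); real video would be sliced by seeking on container timestamps.

```python
# Sketch of S478: keep only the frames of a moving image whose timestamps
# fall within the start/end times taken from the extracted motion info.

def cut_segment(frames, start_time, end_time):
    """frames: list of (timestamp_seconds, frame) pairs, assumed sorted.
    Returns the sublist overlapping [start_time, end_time]."""
    return [(t, f) for (t, f) in frames if start_time <= t <= end_time]

frames = [(0.0, "f0"), (0.5, "f1"), (1.0, "f2"), (1.5, "f3"), (2.0, "f4")]
segment = cut_segment(frames, start_time=0.5, end_time=1.5)
print([t for t, _ in segment])  # [0.5, 1.0, 1.5]
```

The same start and end times are applied to both the first and the second moving image, so the two cut-out segments cover the same time period.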
Furthermore, the motion analysis system 100 displays the element motion indicated by the extracted motion information (S479) and displays the identification information of the worker from whom the extracted motion information was measured (S4710). For example, if the element motion indicated by the extracted motion information is a component pick and the worker who performed that element motion is the first worker A1, the motion analysis system 100 displays on the display unit 10f, together with the cut-out first and second moving images, an indication that the content of the moving images is a component pick and that the worker performing the element motion is the first worker A1. The element motion may be displayed using information capable of distinguishing among the plural element motions, and the worker may likewise be displayed using information capable of distinguishing among the plural workers. This concludes the fourth example of the moving-image display process.
According to the motion analysis system 100 of the present embodiment, the position in the work area at which the worker performed the motion indicated by the extracted motion information is estimated from the joint coordinate values included in the extracted motion information and the element motion it indicates. Among the plural second moving images shot at plural positions in the work area, the second moving image shot at the position where the indicated element motion was performed can therefore be displayed, and the details of that element motion can be confirmed.
§4 Modified Example
FIG. 17 is a flowchart of a display-mode selection process executed by the motion analysis system 100 according to a modified example of the present embodiment. As shown in FIG. 9, the motion analysis system 100 according to the present embodiment displays a summary of the extracted motion information, a first moving image (overall moving image) of the entire work area R, and a second moving image (close-up moving image) of the worker's hands. The motion analysis system 100 according to this modified example can change the content displayed on the display unit 10f according to the user's selection.
The motion analysis system 100 determines whether to display the reference motion as well (S50). Whether to do so may be specified by the user's input via the input unit 10e. When the reference motion is also displayed (S50: YES), the system displays the first moving image together with the portion of the second moving image in which the motion indicated by the extracted motion information is being performed and a moving image in which the reference motion indicated by the reference motion information is being performed (S51). The moving image of the reference motion may be one shot in advance and stored in the storage unit 14, or may be selected from the moving images in the moving-image history 14a that were not extracted by the extraction unit 15.
Displaying the portion of the moving image in which the motion indicated by the extracted motion information is being performed alongside a moving image in which the reference motion is being performed makes it easy to compare the specific motion satisfying the predetermined condition with the reference motion.
The motion analysis system 100 also determines whether to display a graph of the motion information as well (S52). Whether to do so may be specified by the user's input via the input unit 10e. When the graph is also displayed (S52: YES), the system displays the first moving image together with the scene of the second moving image in which the worker is performing the motion indicated by the extracted motion information and a graph of the extracted motion information (S53). The graph may take any form; for example, it may plot the coordinate values of each joint on the vertical axis against elapsed time on the horizontal axis.
Displaying the graph of the extracted motion information together with the scene of the moving image in which the worker performs the indicated motion allows the scene in which the specific motion satisfying the predetermined condition was performed to be examined from two different viewpoints: the moving image and the graph.
The motion analysis system 100 also determines whether to display a superimposed image (S54). Whether to do so may be specified by the user's input via the input unit 10e. When the superimposed image is displayed (S54: YES), the system displays the first moving image and superimposes, as a single image, a plurality of frames included in the scene of the second moving image in which the worker is performing the motion indicated by the extracted motion information (S55). The superimposed image may be generated, for example, by thinning out the frames of that scene of the second moving image, applying transparency processing, and compositing them into a single image.
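The thin-out/transparency/composite pipeline of S55 can be sketched with NumPy. This is an assumption-laden sketch, not the patent's implementation: frames are modeled as grayscale arrays, and the transparency processing is approximated as an equal-weight blend.

```python
import numpy as np

# Illustrative sketch of the superimposed image of S55: thin out the
# frames of the scene, then blend the surviving frames with equal
# transparency into one composite image.

def superimpose(frames, step=2):
    """frames: list of equally sized uint8 image arrays from the scene.
    Keeps every `step`-th frame and averages them into one image."""
    kept = frames[::step]                          # thinning
    stack = np.stack([f.astype(np.float32) for f in kept])
    return stack.mean(axis=0).astype(np.uint8)     # equal-weight blend

frames = [np.full((2, 2), v, dtype=np.uint8) for v in (0, 100, 200)]
composite = superimpose(frames, step=2)  # blends the 0 and 200 frames
print(composite[0, 0])  # 100
```

Equal weighting is the simplest choice; weighting later frames more heavily would instead emphasize the end of the motion in the composite.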
Superimposing a plurality of frames from the scene in which the worker performs the motion indicated by the extracted motion information and displaying them as a single image makes it possible to grasp at a glance the entire scene in which the specific motion satisfying the predetermined condition was performed.
When the reference motion is not also displayed (S50: NO), the graph of the motion information is not also displayed (S52: NO), and the superimposed image is not displayed (S54: NO), the system displays the first moving image together with the portion of the second moving image in which the motion indicated by the extracted motion information is being performed. An example of the screen in this case is the screen DP shown in FIG. 9. This concludes the display-mode selection process.
FIG. 18 is an example of a screen DP displayed by the motion analysis system 100 according to this modified example. The screen DP illustrates the case where a graph of the motion information is also displayed (S53) in the display-mode selection process.
The screen DP includes a graph DP4, an overall moving image DP2, and a close-up moving image DP3. The graph DP4 of this example is a motion-signal graph of the extracted motion information: the X coordinate value ("right hand X"), Y coordinate value ("right hand Y"), and Z coordinate value ("right hand Z") of the right hand are plotted on the vertical axis against elapsed time on the horizontal axis. In the graph DP4, the X coordinate value is drawn as a solid line, the Y coordinate value as a broken line, and the Z coordinate value as a dash-dot line. The graph DP4 thus represents the time-series change of the right-hand coordinate values over the period of the extracted motion information.
The overall moving image DP2 is the first moving image of the entire work area R shot by the measurement unit 30 (fourth imaging unit); the start time "2017/7/7 10:10:44.138" is shown at the lower right of the video pane. The overall moving image DP2 gives an overall view of the plural workers performing their motions. In addition, the positions of the workers' joints detected by the measurement unit 30 are overlaid on the overall moving image DP2 as skeletal models, which makes it possible to confirm that the joint coordinates of the plural workers are being measured at plausible positions.
The close-up moving image DP3 is, in this example, the second moving image of an area narrower than the work area R shot by the third imaging unit 20c; the start time "2017/7/7 10:10:44.138" is shown at the lower right of the video pane. Above the video pane, "packing process" and "worker ID: A1" are displayed, namely information identifying the process indicated by the extracted motion information and the identification information of the worker from whom the extracted motion information was measured. The close-up moving image DP3 of the worker's hands allows the details of the motion actually performed by the worker to be confirmed.
In the display-mode selection process, when the reference motion is also displayed (S51), the portions of the first moving image and of the second moving image shot at the estimated position corresponding to the time period in which the motion indicated by the extracted motion information was performed may be displayed together with first and second moving images in which the reference motion is being performed. In that case, the first moving images may be displayed adjacent to each other and the second moving images adjacent to each other to facilitate comparison.
When the superimposed image is displayed (S55), the portion of the first moving image corresponding to the time period in which the motion indicated by the extracted motion information was performed may be displayed, and for the second moving image shot at the estimated position, a plurality of frames may be superimposed and displayed as a single image.
The embodiments described above are intended to facilitate understanding of the present invention, not to limit its interpretation. The elements of the embodiments and their arrangements, materials, conditions, shapes, sizes, and the like are not limited to those illustrated and may be changed as appropriate. The configurations shown in different embodiments may also be partially substituted for one another or combined.
The motion analysis system 100 is not limited to measuring motion information indicating workers' motions performed in a work area of a manufacturing line. For example, the motion analysis system 100 may measure motion information indicating the motion of a person playing a sport, such as a golf swing, or the motion of a person giving a physical performance such as acting, while simultaneously shooting a moving image; compare the motion information with reference motion information indicating the motion the person should follow; extract motion information satisfying a predetermined condition, for example that its deviation from the reference motion information is at or above a threshold; and display the scene of the moving image in which the person performs the motion indicated by the extracted motion information. The motion analysis system 100 can be used whenever a model motion can be defined: by storing motion information indicating the model motion in the storage unit in advance as reference motion information and comparing it with the measured motion information, scenes of moving images in which a motion deviating from the model is performed can be displayed.
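The threshold condition mentioned above can be sketched as follows. The patent does not fix a particular deviation metric, so the mean absolute deviation used here, the trajectory shapes, and the threshold value are all assumptions for illustration.

```python
# Hedged sketch of the extraction condition: motion information is
# extracted when its deviation from the reference motion information
# is at or above a threshold. Metric and data shapes are assumed.

def deviates(measured, reference, threshold):
    """measured/reference: equal-length sequences of joint coordinate
    values sampled over time. Returns True when the mean absolute
    deviation reaches the threshold (i.e. the motion should be extracted)."""
    dev = sum(abs(m - r) for m, r in zip(measured, reference)) / len(measured)
    return dev >= threshold

reference = [0.0, 1.0, 2.0, 3.0]       # model motion (e.g. an ideal swing)
good_swing = [0.1, 1.0, 2.1, 3.0]      # mean |dev| = 0.05 -> not extracted
bad_swing = [1.0, 2.5, 0.0, 3.0]       # mean |dev| = 1.125 -> extracted
print(deviates(good_swing, reference, threshold=0.5))  # False
print(deviates(bad_swing, reference, threshold=0.5))   # True
```

In practice the two trajectories would first need time alignment (for example, matching the start of the motion) before a pointwise deviation is meaningful.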
Although the present embodiment and the modified example describe the case where the motion analysis system 100 includes a plurality of imaging units and uses the moving images shot by them to display scenes in which the worker performs the motion indicated by the extracted motion information, the motion analysis system may extract information other than moving images.
FIG. 19 shows functional blocks of a motion analysis system 100g according to another embodiment of the present disclosure. The motion analysis system 100g according to the other embodiment may include: a measurement unit 30 that measures motion information indicating a worker's motions performed in a work area; a first sensor 40a, a second sensor 40b, and a third sensor 40c that sense the motions performed by the worker; an extraction unit 15 that compares the motion information with reference motion information and extracts motion information satisfying a predetermined condition; and an output unit 18 that uses the sensing data sensed by the first sensor 40a, the second sensor 40b, and the third sensor 40c to output scenes in which the worker performs the motion indicated by the extracted motion information. In other words, the motion analysis system 100g according to the other embodiment may have a configuration in which the imaging units (that is, image sensors) of the motion analysis system 100 according to the present embodiment are replaced with other sensors. Any sensor capable of sensing the worker's motion may be used; for example, a proximity sensor or a ranging sensor may be used.
The output unit 18 may output the scene in which the worker performs the motion indicated by the extracted motion information to the display unit 10f, or may output the scene to another analysis device.
The second acquisition unit 12 of the motion analysis device 10g included in the motion analysis system 100g may acquire sensing data obtained by sensing the motions performed by the worker. The storage unit 14 may store a sensing-data history 14e, which is a history of the sensing data obtained by the first sensor 40a, the second sensor 40b, and the third sensor 40c sensing the motions performed by the worker. In other respects, the motion analysis device 10g may have the same configuration as the motion analysis device 10 according to the present embodiment.
When various tasks are performed on a manufacturing line, it may be desirable to detect that a worker has performed a non-standard motion. To this end, a plurality of sensors may be installed on the manufacturing line to sense the workers' motions.
However, as the number of installed sensors grows, the number of sensing data streams that must be checked increases, and the individual streams become longer. Even if a scene in which a specific motion was performed is recorded somewhere in the sensing data, it becomes hard to tell when that motion was performed, and extracting the scene becomes difficult.
Therefore, by sensing the motions performed by the worker with sensors and identifying, within the sensed data, the portions in which the worker performs the motion indicated by the extracted motion information, it is possible to provide a motion analysis system, a motion analysis method, and a motion analysis program capable of extracting scenes in which a specific motion was performed regardless of the number or length of the recorded sensing data streams. Here, the scene in which the specific motion was performed may be a scene of a moving image sensed by an image sensor or a scene of sensing data sensed by another sensor.
Embodiments of the present invention may also be described as in the following supplementary notes. However, embodiments of the present invention are not limited to the forms described in the supplementary notes, and the descriptions of the supplementary notes may be substituted for one another or combined.
[Supplementary Note 1]
A motion analysis system comprising:
a measurement unit (30) that measures motion information indicating motions of one or more workers performed in a work area;
an imaging unit (20a, 20b, 20c, 30) that shoots a moving image including a scene in which the worker performs the motion;
a storage unit (14) that stores reference motion information indicating a reference motion serving as a basis of comparison for the worker's motions;
an extraction unit (15) that compares the motion information with the reference motion information and extracts motion information satisfying a predetermined condition; and
a display unit (10f) that uses the moving image to display a scene in which the worker performs the motion indicated by the extracted motion information.
[Supplementary Note 2]
The motion analysis system according to Supplementary Note 1, wherein
the measurement unit (30) measures a plurality of pieces of motion information respectively indicating motions of a plurality of workers, and
the display unit (10f) displays information identifying the worker from whom the extracted motion information was measured.
[Supplementary Note 3]
The motion analysis system according to Supplementary Note 1 or 2, wherein
the imaging unit (20a, 20b, 20c, 30) includes a first imaging unit (30) that shoots a first moving image of the work area and a second imaging unit (20a, 20b, 20c) that shoots a second moving image of a part of the work area, and
the display unit (10f) uses the first moving image and the second moving image to display scenes of the time period in which the worker performs the motion indicated by the extracted motion information.
[Supplementary Note 4]
The motion analysis system according to Supplementary Note 3, wherein
the second imaging unit (20a, 20b, 20c) shoots a plurality of second moving images respectively capturing a plurality of parts of the work area,
the system further comprises an estimation unit (17) that estimates the position in the work area at which the motion indicated by the extracted motion information was performed, and
the display unit (10f) displays, among the plurality of second moving images, the second moving image shot at the estimated position.
[Supplementary Note 5]
The reference operation information is defined for each of a plurality of processes, and the extraction unit (15) compares, for each process, the operation information with the reference operation information to extract operation information satisfying the predetermined condition;
The display unit (10f) displays information identifying the process indicated by the extracted operation information;
The motion analysis system according to appendix 4.
[Supplementary Note 6]
The estimation unit (17) estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the process indicated by the extracted operation information.
The motion analysis system according to appendix 5.
[Supplementary Note 7]
The operation information and the reference operation information each include coordinate values of the worker's joints;
The extraction unit (15) compares the coordinate values included in the operation information with the coordinate values included in the reference operation information to extract operation information satisfying the predetermined condition;
The motion analysis system according to appendix 4.
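As a rough illustration of the comparison described in Supplementary Note 7 — the document does not specify a distance metric or a concrete condition, so the Euclidean measure and the fixed threshold below are assumptions — a minimal sketch might look like:

```python
import math

def extract_deviating_operations(operation_info, reference_info, threshold=0.15):
    """Compare per-frame joint coordinates against the reference operation and
    return the indices of frames whose mean joint deviation exceeds the
    threshold (one possible 'predetermined condition').

    operation_info / reference_info: lists of frames, each frame a dict
    mapping a joint name to its (x, y, z) coordinate value.
    """
    extracted = []
    for i, (frame, ref) in enumerate(zip(operation_info, reference_info)):
        # Euclidean distance per joint, averaged over the joints both share
        deviations = [
            math.dist(frame[joint], ref[joint])
            for joint in frame if joint in ref
        ]
        if deviations and sum(deviations) / len(deviations) > threshold:
            extracted.append(i)
    return extracted
```

For example, if the worker's wrist strays 0.5 units from the reference trajectory in one frame, only that frame index is returned.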
[Supplementary Note 8]
The estimation unit (17) estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the coordinate values included in the extracted operation information;
The motion analysis system according to appendix 7.
[Supplementary Note 9]
The reference operation information is defined for each of a plurality of element operations;
The estimation unit (17) estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the coordinate values included in the extracted operation information and the element operation indicated by that information;
The motion analysis system according to appendix 7 or 8.
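Supplementary Notes 8 and 9 leave open how the position is derived from joint coordinates. One assumed concrete realization — using the hand joints and a nearest-station rule, neither of which the document prescribes — could be:

```python
import math

def estimate_position(joint_coords, stations):
    """Estimate where in the work area an extracted operation took place by
    finding the station closest to the worker's mean hand position.

    joint_coords: dict mapping a joint name to its (x, y) coordinate value.
    stations: dict mapping a station name to its (x, y) location.
    """
    hands = [joint_coords[j] for j in ("left_wrist", "right_wrist")
             if j in joint_coords]
    if not hands:
        raise ValueError("no hand joints in the operation information")
    # Centroid of the available hand joints
    cx = sum(p[0] for p in hands) / len(hands)
    cy = sum(p[1] for p in hands) / len(hands)
    # Nearest station to the centroid
    return min(stations, key=lambda s: math.dist(stations[s], (cx, cy)))
```

The station names and the joint names used here are hypothetical; a real system would draw them from its correspondence tables.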
[Supplementary Note 10]
The display unit (10f) displays, using the moving image, a scene in which the worker is executing the operation indicated by the extracted operation information, together with a moving image including a scene in which the worker is executing the reference operation indicated by the reference operation information;
The motion analysis system according to any one of appendices 1 to 9.
[Supplementary Note 11]
The display unit (10f) displays a graph of the extracted operation information and the scene of the moving image in which the worker is executing the operation indicated by the extracted operation information;
The motion analysis system according to any one of appendices 1 to 10.
[Supplementary Note 12]
The display unit (10f) superimposes a plurality of frames included in the scene of the moving image in which the worker is executing the operation indicated by the extracted operation information, and displays them as a single image;
The motion analysis system according to any one of appendices 1 to 11.
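The superposition in Supplementary Note 12 can be realized in several ways; per-pixel averaging is one simple possibility (the document leaves the blending method open). A sketch over plain nested lists, assuming grayscale frames of equal size:

```python
def superimpose_frames(frames):
    """Blend several equally sized grayscale video frames into one still
    image by per-pixel integer averaging.

    frames: list of frames, each a list of rows of pixel intensities (0-255).
    """
    if not frames:
        raise ValueError("at least one frame is required")
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    # Average the intensity of each pixel position across all frames
    return [
        [sum(frame[y][x] for frame in frames) // n for x in range(width)]
        for y in range(height)
    ]
```

Averaging leaves a motion trail of the worker's posture across the scene, which is the effect the note appears to target; weighted or max-blending would be alternatives.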
[Supplementary Note 13]
A first acquisition unit (11) for acquiring operation information indicating an operation of one or more workers executed in a certain work area;
A second acquisition unit (12) for acquiring a moving image including a scene in which the worker is executing the operation;
A third acquisition unit (13) that acquires reference operation information indicating a reference operation that is a reference of comparison for the operation of the worker;
An extraction unit (15) for comparing the operation information and the reference operation information to extract operation information satisfying a predetermined condition;
A specifying unit (16) for specifying a scene in which the operator is executing the operation indicated by the extracted operation information using the moving image;
A motion analysis device comprising the above.
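The device of Supplementary Note 13 chains acquisition, extraction, and scene specification. A minimal sketch of that pipeline, using operation duration as the compared quantity and a fixed tolerance — both assumptions, since the claim only requires some predetermined condition:

```python
from dataclasses import dataclass

@dataclass
class OperationRecord:
    worker_id: str
    start_s: float      # start time of the operation within the video
    end_s: float        # end time of the operation within the video
    duration_s: float   # measured duration of the operation

def specify_scenes(records, reference_duration_s, tolerance_s=2.0):
    """Extract operations whose duration deviates from the reference by more
    than the tolerance, and return the (worker, start, end) video intervals
    — the 'scenes' — in which those operations were performed."""
    return [
        (r.worker_id, r.start_s, r.end_s)
        for r in records
        if abs(r.duration_s - reference_duration_s) > tolerance_s
    ]
```

The returned intervals would then be used to seek the recorded moving image to exactly the scenes worth reviewing.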
[Supplementary Note 14]
Obtaining operation information indicating an operation of one or more workers performed in a work area;
Acquiring a moving image including a scene in which the worker is performing the operation;
Obtaining reference operation information indicating a reference operation which is a reference of comparison for the operation of the worker;
Extracting the operation information satisfying a predetermined condition by comparing the operation information and the reference operation information;
Specifying the scene in which the operator is executing the operation indicated by the extracted operation information using the moving image;
A motion analysis method comprising the steps above.
[Supplementary Note 15]
A motion analysis program causing a computing device provided in the motion analysis device to operate as:
A first acquisition unit (11) for acquiring operation information indicating an operation of one or more workers executed in a certain work area;
A second acquisition unit (12) for acquiring a moving image including a scene in which the worker is executing the operation;
A third acquisition unit (13) for acquiring reference operation information indicating a reference operation serving as a basis of comparison for the worker's operation;
An extraction unit (15) for comparing the operation information with the reference operation information to extract operation information satisfying a predetermined condition; and
A specifying unit (16) for specifying, using the moving image, a scene in which the worker is executing the operation indicated by the extracted operation information.
[Supplementary Note 16]
A measurement unit (30) for measuring operation information indicating an operation of one or more workers executed in a certain work area;
A sensor (40a, 40b, 40c, 30) that senses the operation performed by the worker;
A storage unit (14) for storing reference operation information indicating a reference operation as a reference for comparison of the operation of the worker;
An extraction unit (15) for comparing the operation information and the reference operation information to extract operation information satisfying a predetermined condition;
An output unit (18) for outputting a scene in which the worker is performing an operation indicated by the extracted operation information using sensing data sensed by the sensor (40a, 40b, 40c, 30);
A motion analysis system comprising the above.
[Supplementary Note 17]
The measurement unit (30) measures a plurality of pieces of the operation information, each indicating the operation of one of a plurality of the workers;
The output unit (18) outputs information identifying the worker from whom the extracted operation information was measured;
The motion analysis system according to appendix 16.
[Supplementary Note 18]
The sensors (40a, 40b, 40c, 30) include a first sensor (30) for sensing first sensing data covering the work area and second sensors (40a, 40b, 40c) for sensing second sensing data covering a portion of the work area;
The output unit (18) outputs, using the first sensing data and the second sensing data, scenes of the time period during which the worker is executing the operation indicated by the extracted operation information;
The motion analysis system according to appendix 16 or 17.
[Supplementary Note 19]
The second sensors (40a, 40b, 40c) sense a plurality of the second sensing data, each covering a different portion of the work area;
The system further comprises an estimation unit (17) for estimating the position in the work area where the operation indicated by the extracted operation information was executed;
The output unit (18) outputs, among the plurality of second sensing data, the second sensing data sensed at the estimated position;
The motion analysis system according to appendix 18.
[Supplementary Note 20]
The reference operation information is determined for each of a plurality of steps,
The extraction unit (15) compares the operation information with the reference operation information for each of a plurality of steps, and extracts the operation information satisfying the predetermined condition,
The output unit (18) outputs information identifying a process indicated by the extracted operation information.
The motion analysis system according to appendix 19.
[Supplementary Note 21]
The estimation unit (17) estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the process indicated by the extracted operation information.
The motion analysis system according to appendix 20.
[Supplementary Note 22]
The operation information and the reference operation information each include coordinate values of the worker's joints;
The extraction unit (15) compares the coordinate values included in the operation information with the coordinate values included in the reference operation information to extract operation information satisfying the predetermined condition;
The motion analysis system according to appendix 19.
[Supplementary Note 23]
The estimation unit (17) estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the coordinate values included in the extracted operation information;
The motion analysis system according to appendix 22.
[Supplementary Note 24]
The reference operation information is defined for each of a plurality of element operations;
The estimation unit (17) estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the coordinate values included in the extracted operation information and the element operation indicated by that information;
The motion analysis system according to appendix 22 or 23.
[Supplementary Note 25]
The output unit (18) outputs a scene of the sensing data in which the worker is executing the operation indicated by the extracted operation information, together with sensing data including a scene in which the worker is executing the reference operation indicated by the reference operation information;
The motion analysis system according to any one of appendices 16 to 24.
[Supplementary Note 26]
The output unit (18) outputs a graph indicating the extracted operation information and a scene in which the operator is executing the operation indicated by the operation information extracted from the sensing data.
The motion analysis system according to any one of appendices 16 to 25.
[Supplementary Note 27]
A first acquisition unit (11) for acquiring operation information indicating an operation of one or more workers executed in a certain work area;
A second acquisition unit (12) for acquiring sensing data obtained by sensing the operation performed by the worker;
A third acquisition unit (13) for acquiring reference operation information quantitatively indicating a reference operation serving as a reference for comparison of the operation of the worker;
An extraction unit (15) for comparing the operation information and the reference operation information to extract operation information satisfying a predetermined condition;
An output unit (18) for outputting a scene in which the worker is executing the operation indicated by the extracted operation information using the sensing data;
A motion analysis device comprising the above.
[Supplementary Note 28]
Obtaining operation information indicating an operation of one or more workers performed in a work area;
Obtaining sensing data obtained by sensing the operation performed by the worker;
Obtaining reference operation information indicating a reference operation which is a reference of comparison for the operation of the worker;
Extracting the operation information satisfying a predetermined condition by comparing the operation information and the reference operation information;
Outputting a scene in which the operator is executing an operation indicated by the extracted operation information using the sensing data;
A motion analysis method comprising the steps above.
[Supplementary Note 29]
A motion analysis program causing a computing device provided in the motion analysis device to operate as:
A first acquisition unit (11) for acquiring operation information indicating an operation of one or more workers executed in a certain work area;
A second acquisition unit (12) for acquiring sensing data obtained by sensing the operation performed by the worker;
A third acquisition unit (13) for acquiring reference operation information indicating a reference operation serving as a basis of comparison for the worker's operation;
An extraction unit (15) for comparing the operation information with the reference operation information to extract operation information satisfying a predetermined condition; and
An output unit (18) for outputting, using the sensing data, a scene in which the worker is executing the operation indicated by the extracted operation information.
[Supplementary Note 30]
Measuring operation information indicating an operation of one or more workers executed in a certain work area;
Capturing a moving image including a scene in which the worker is executing the operation;
Comparing the operation information with reference operation information indicating a reference operation serving as a basis of comparison for the worker's operation, and displaying the scene of the moving image in which the worker is executing the operation indicated by operation information satisfying a predetermined condition;
A motion analysis method comprising the steps above.
DESCRIPTION OF SYMBOLS: 10 motion analysis device; 10a CPU; 10b RAM; 10c ROM; 10d communication unit; 10e input unit; 10f display unit; 10g motion analysis device; 11 first acquisition unit; 12 second acquisition unit; 13 third acquisition unit; 14 storage unit; 14a moving-image history; 14b operation information history; 14c reference operation information; 14d correspondence table; 14e sensing data history; 15 extraction unit; 16 specifying unit; 17 estimation unit; 18 output unit; 20a first imaging unit; 20b second imaging unit; 20c third imaging unit; 30 measurement unit; 40a first sensor; 40b second sensor; 40c third sensor; 100 motion analysis system; 100g motion analysis system; D1 operation information; D2 first reference operation information; D3 first correspondence table; D4 second correspondence table; D5 second reference operation information; D6 third correspondence table; DP screen; DP1 overview; DP2 first moving image; DP3 second moving image; DP4 graph; R work area

Claims (15)

  1.  A motion analysis system comprising:
      a measurement unit that measures operation information indicating an operation of one or more workers executed in a work area;
      an imaging unit that captures a moving image including a scene in which the worker is executing the operation;
      a storage unit that stores reference operation information indicating a reference operation serving as a basis of comparison for the worker's operation;
      an extraction unit that compares the operation information with the reference operation information to extract operation information satisfying a predetermined condition; and
      a display unit that displays, using the moving image, a scene in which the worker is executing the operation indicated by the extracted operation information.
  2.  The motion analysis system according to claim 1, wherein
      the measurement unit measures a plurality of pieces of operation information, each indicating the operation of one of a plurality of workers, and
      the display unit displays information identifying the worker from whom the extracted operation information was measured.
  3.  The motion analysis system according to claim 1 or 2, wherein
      the imaging unit includes a first imaging unit that captures a first moving image of the work area and a second imaging unit that captures a second moving image of a portion of the work area, and
      the display unit displays, using the first moving image and the second moving image, scenes of the time period during which the worker is executing the operation indicated by the extracted operation information.
  4.  The motion analysis system according to claim 3, further comprising an estimation unit that estimates the position in the work area where the operation indicated by the extracted operation information was executed, wherein
      the second imaging unit captures a plurality of second moving images, each of a different portion of the work area, and
      the display unit displays, among the plurality of second moving images, the second moving image captured at the estimated position.
  5.  The motion analysis system according to claim 4, wherein
      the reference operation information is defined for each of a plurality of processes,
      the extraction unit compares, for each process, the operation information with the reference operation information to extract operation information satisfying the predetermined condition, and
      the display unit displays information identifying the process indicated by the extracted operation information.
  6.  The motion analysis system according to claim 5, wherein the estimation unit estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the process indicated by the extracted operation information.
  7.  The motion analysis system according to claim 4, wherein
      the operation information and the reference operation information each include coordinate values of the worker's joints, and
      the extraction unit compares the coordinate values included in the operation information with the coordinate values included in the reference operation information to extract operation information satisfying the predetermined condition.
  8.  The motion analysis system according to claim 7, wherein the estimation unit estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the coordinate values included in the extracted operation information.
  9.  The motion analysis system according to claim 7 or 8, wherein
      the reference operation information is defined for each of a plurality of element operations, and
      the estimation unit estimates the position in the work area where the worker is executing the operation indicated by the extracted operation information, based on the coordinate values included in the extracted operation information and the element operation indicated by that information.
  10.  The motion analysis system according to any one of claims 1 to 9, wherein the display unit displays, using the moving image, a scene in which the worker is executing the operation indicated by the extracted operation information, together with a moving image including a scene in which the worker is executing the reference operation indicated by the reference operation information.
  11.  The motion analysis system according to any one of claims 1 to 10, wherein the display unit displays a graph of the extracted operation information and the scene of the moving image in which the worker is executing the operation indicated by the extracted operation information.
  12.  The motion analysis system according to any one of claims 1 to 11, wherein the display unit superimposes a plurality of frames included in the scene of the moving image in which the worker is executing the operation indicated by the extracted operation information, and displays them as a single image.
  13.  A motion analysis device comprising:
      a first acquisition unit that acquires operation information indicating an operation of one or more workers executed in a work area;
      a second acquisition unit that acquires a moving image including a scene in which the worker is executing the operation;
      a third acquisition unit that acquires reference operation information indicating a reference operation serving as a basis of comparison for the worker's operation;
      an extraction unit that compares the operation information with the reference operation information to extract operation information satisfying a predetermined condition; and
      a specifying unit that specifies, using the moving image, a scene in which the worker is executing the operation indicated by the extracted operation information.
  14.  A motion analysis method comprising:
      acquiring operation information indicating an operation of one or more workers executed in a work area;
      acquiring a moving image including a scene in which the worker is executing the operation;
      acquiring reference operation information indicating a reference operation serving as a basis of comparison for the worker's operation;
      comparing the operation information with the reference operation information to extract operation information satisfying a predetermined condition; and
      specifying, using the moving image, a scene in which the worker is executing the operation indicated by the extracted operation information.
  15.  A motion analysis program for causing a computing device provided in a motion analysis device to operate as:
      a first acquisition unit that acquires operation information indicating an operation of one or more workers executed in a work area;
      a second acquisition unit that acquires a moving image including a scene in which the worker is executing the operation;
      a third acquisition unit that acquires reference operation information indicating a reference operation serving as a basis of comparison for the worker's operation;
      an extraction unit that compares the operation information with the reference operation information to extract operation information satisfying a predetermined condition; and
      a specifying unit that specifies, using the moving image, a scene in which the worker is executing the operation indicated by the extracted operation information.
PCT/JP2018/047812 2018-01-12 2018-12-26 Motion-analyzing system, motion-analyzing device, motion analysis method, and motion analysis program WO2019138877A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-003475 2018-01-12
JP2018003475A JP6951685B2 (en) 2018-01-12 2018-01-12 Motion analysis system, motion analysis device, motion analysis method and motion analysis program

Publications (1)

Publication Number Publication Date
WO2019138877A1 true WO2019138877A1 (en) 2019-07-18

Family

ID=67218589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/047812 WO2019138877A1 (en) 2018-01-12 2018-12-26 Motion-analyzing system, motion-analyzing device, motion analysis method, and motion analysis program

Country Status (2)

Country Link
JP (1) JP6951685B2 (en)
WO (1) WO2019138877A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7392312B2 (en) * 2019-07-29 2023-12-06 コニカミノルタ株式会社 Production performance recording system and production performance recording program
JP7442066B2 (en) 2019-11-06 2024-03-04 パナソニックIpマネジメント株式会社 Work display method and work display device
WO2023148970A1 (en) * 2022-02-07 2023-08-10 日本電気株式会社 Management device, management method, and computer-readable medium
JP2024021467A (en) * 2022-08-03 2024-02-16 オムロン株式会社 Work feature display device, work feature display method, and work feature display program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009020897A (en) * 2002-09-26 2009-01-29 Toshiba Corp Image analysis method, image analysis apparatus, image analysis program
JP2009110239A (en) * 2007-10-30 2009-05-21 Toshiba Corp System and method for analysis of working operation
JP2016042332A (en) * 2014-08-19 2016-03-31 大日本印刷株式会社 Work operation inspection system
WO2016147770A1 (en) * 2015-03-19 2016-09-22 日本電気株式会社 Monitoring system and monitoring method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006277549A (en) * 2005-03-30 2006-10-12 Toshiba Corp Field work authentication management system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210225029A1 (en) * 2018-05-16 2021-07-22 Panasonic Intellectual Property Management Co., Ltd. Work analyzing system and work analyzing method
US11842511B2 (en) * 2018-05-16 2023-12-12 Panasonic Intellectual Property Management Co., Ltd. Work analyzing system and work analyzing method

Also Published As

Publication number Publication date
JP2019125023A (en) 2019-07-25
JP6951685B2 (en) 2021-10-20

Similar Documents

Publication Publication Date Title
WO2019138877A1 (en) Motion-analyzing system, motion-analyzing device, motion analysis method, and motion analysis program
JP5528151B2 (en) Object tracking device, object tracking method, and object tracking program
US9357181B2 (en) Tracking assistance device, a tracking assistance system and a tracking assistance method
US9251599B2 (en) Tracking assistance device, a tracking assistance system and a tracking assistance method that enable a monitoring person to perform a task of correcting tracking information
WO2021131552A1 (en) Operation analyzing device and operation analyzing method
US20090290753A1 (en) Method and system for gaze estimation
US20230101893A1 (en) Estimation device, learning device, teaching data creation device, estimation method, learning method, teaching data creation method, and recording medium
US10783376B2 (en) Information processing apparatus
CN111401340B (en) Method and device for detecting motion of target object
JP7216679B2 (en) Information processing device and judgment result output method
JP2020086929A (en) Temperature processing apparatus and temperature processing method
JP7004218B2 (en) Motion analysis device, motion analysis method, motion analysis program and motion analysis system
JP6928880B2 (en) Motion analysis device, motion analysis method, motion analysis program and motion analysis system
JP2020134242A (en) Measuring method, measuring device and program
JP7318814B2 (en) DATA GENERATION METHOD, DATA GENERATION PROGRAM AND INFORMATION PROCESSING DEVICE
WO2017003424A1 (en) Metric 3d stitching of rgb-d data
WO2020152879A1 (en) Operation analysis device, operation analysis method, operation analysis program, and operation analysis system
US11854214B2 (en) Information processing apparatus specifying a relationship between a sensor and an object included in image data, and method and non-transitory computer-readable storage medium
JP2018180894A (en) Information processing device, information processing method, and program
CN113836991A (en) Motion recognition system, motion recognition method, and storage medium
TWI823327B (en) Monitoring area setting device, monitoring area setting method and monitoring area setting program
JP7419969B2 (en) Generation method, generation program, and information processing device
WO2023176163A1 (en) Work support device, work support method, and work support program
WO2024029411A1 (en) Work feature amount display device, work feature amount display method, and work feature amount display program
WO2022190206A1 (en) Skeletal recognition method, skeletal recognition program, and gymnastics scoring assistance system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18899137

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18899137

Country of ref document: EP

Kind code of ref document: A1