WO2018087844A1 - Work recognition device and work recognition method - Google Patents

Work recognition device and work recognition method

Info

Publication number
WO2018087844A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
work
body part
worker
association
Prior art date
Application number
PCT/JP2016/083243
Other languages
French (fr)
Japanese (ja)
Inventor
玄太 吉村
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2016/083243
Priority to JP2018546920A (JP6444573B2)
Priority to TW106107761A (TW201818297A)
Publication of WO2018087844A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/06 - Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • The present invention relates to a work recognition device and a work recognition method for recognizing the content of work performed by an operator.
  • Patent Document 1 discloses a motion analysis device that captures, as a subject, the motion of a worker who performs work with both hands and analyzes the motion of the worker's hands with reference to the motion of a reference worker serving as a reference subject.
  • In the device of Patent Document 1, the coordinates of both hands are acquired for each frame of the video, and the locus of both hands is acquired by tracking those coordinates.
  • When the coordinates of the left hand or the right hand cannot be obtained, the missing coordinates are estimated from the coordinates of the location where they disappeared.
  • The present invention has been made to solve the above-described problems, and an object of the present invention is to provide a work recognition device and a work recognition method that recognize the content of work performed by an operator with high accuracy.
  • The work recognition apparatus according to the present invention includes: a sensor data acquisition unit that acquires sensor data; a body part information acquisition unit that detects a body part of a worker based on the sensor data acquired by the sensor data acquisition unit and acquires body part information about that body part; an object information acquisition unit that acquires object information about an object based on the sensor data acquired by the sensor data acquisition unit; an association unit that associates the object with the body part of the worker who performed work using the object, based on the body part information acquired by the body part information acquisition unit and the object information acquired by the object information acquisition unit; and a recognition result analysis unit that recognizes the work performed by the worker based on association information representing the result of the association performed by the association unit.
  • According to the present invention, the work performed by the worker is recognized by associating the object with the body part of the worker who performed the work using the object, so the content of the worker's work can be recognized with high accuracy.
  • FIG. 1 is a diagram illustrating an example of the overall configuration of a work recognition system including the work recognition apparatus according to Embodiment 1 of the present invention. FIG. 2 is a configuration diagram of the work recognition apparatus according to Embodiment 1 of the present invention. FIG. 3 is a diagram illustrating an example of the association score calculation method performed by the score calculation unit in Embodiment 1.
  • FIG. 4 is a diagram illustrating an example of the association score calculation operation after interpolation of a position coordinate locus by the score calculation unit in Embodiment 1. FIG. 5 is a diagram illustrating an example of the position coordinate correction operation by the position correction unit in Embodiment 1.
  • FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the work recognition apparatus according to Embodiment 1 of the present invention.
  • A further diagram illustrates an example of the hardware configuration of the control device according to Embodiment 1.
  • A flowchart illustrates the operation of the work recognition apparatus according to Embodiment 1, and FIG. 9 is a flowchart illustrating details of the operation of the associating unit in step ST805 of that flowchart.
  • A further diagram illustrates an output example of the recognition result of the entire work in Embodiment 1.
  • FIG. 1 is a diagram illustrating an example of the overall configuration of a work recognition system including a work recognition device 100 according to Embodiment 1 of the present invention.
  • The work recognition device 100 recognizes the work that the worker 120 performs using the object 131 based on, for example, video information received from the visible light camera 111 or the like, or sensing information received from the acceleration sensor 114 or the like, and outputs the work recognition result, or a work analysis result based on the work recognition result, to an external device or the like.
  • the number of body parts 121 used when the worker 120 performs work is not limited to one.
  • the worker 120 may perform work using a plurality of body parts 121 such as a right hand and a left hand, for example. Also, the number of objects 131 used when the worker 120 performs the work is not limited to one, and the worker 120 may perform work using a plurality of objects 131. That is, in the first embodiment, the worker 120 performs work using one or more body parts 121 and one or more objects 131.
  • The work recognition system includes a work recognition device 100, a visible light camera 111, an infrared camera 112, a depth sensor 113, an acceleration sensor 114, a gyro sensor 115, a display 151, a speaker 152, a storage device 153, and a control device 154.
  • the visible light camera 111, the infrared camera 112, and the depth sensor 113 are imaging devices that capture the periphery of the worker 120, and transmit visible light images, infrared images, and depth images to the work recognition device 100, respectively.
  • The vicinity of the worker 120 is a preset range that includes at least the range in which the body part 121 and the object 131 used by the worker 120 mainly move during the work.
  • the work recognition system includes the visible light camera 111, the infrared camera 112, and the depth sensor 113 as the imaging device.
  • the work recognition system only needs to include at least one of these.
  • a sensor in which two or more of the visible light camera 111, the infrared camera 112, and the depth sensor 113 are integrated may be used.
  • an imaging device that can capture an image around the worker 120 may be provided.
  • There is no particular restriction on the number of visible light cameras 111, infrared cameras 112, or depth sensors 113 that may be provided.
  • a marker may be attached to the body part 121 or the object 131 of the worker 120.
  • As the marker, for example, a bar code or a marker printed with a characteristic figure, a reflective marker that reflects visible light or infrared light, a color marker using a characteristic color, an infrared marker that emits infrared light, or the like can be used.
  • the work recognition apparatus 100 can detect, track, or recognize the body part 121 or the object 131 of the worker 120 with high accuracy by detecting, tracking, or recognizing the marker. Details of the work recognition device 100 will be described later. When using a marker, all the markers to be used may be the same marker, or may be different for each body part 121 or object 131 of the worker 120.
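  • As an illustration that is not part of the patent, the following Python sketch shows one simple way such a color marker could be detected with OpenCV HSV thresholding; the color range, the OpenCV 4 API usage, and the function name are assumptions made only for this example.

```python
# Hedged sketch: detect a colour marker by HSV thresholding (assumes OpenCV 4).
import cv2
import numpy as np

def detect_color_marker(frame_bgr, hsv_low=(40, 80, 80), hsv_high=(80, 255, 255)):
    """Return the centre (x, y) of the largest blob within the given HSV range, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                       # marker not visible in this frame
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # marker centre in pixels
```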
  • the acceleration sensor 114 is attached to one or both of the body part 121 and the object 131 of the worker 120, and transmits time series information of acceleration to the work recognition device 100.
  • the gyro sensor 115 is attached to one or both of the body part 121 and the object 131 of the worker 120, and transmits time-series information on angular acceleration to the work recognition device 100.
  • the work recognition system includes an acceleration sensor 114 and a gyro sensor 115 as sensors capable of sensing the movement of the body part 121 or the object 131 of the worker 120.
  • the present invention is not limited to this, and the work recognition system only needs to include at least one of these.
  • the work recognition system may include a sensor that can sense the movement of the body part 121 and the object 131 of the worker 120.
  • The visible light camera 111, the infrared camera 112, the depth sensor 113, the acceleration sensor 114, and the gyro sensor 115 are collectively referred to as sensors, and the data obtained from these sensors are collectively referred to as sensor data. That is, image data such as a visible light image obtained by the visible light camera 111, an infrared image obtained by the infrared camera 112, and a depth image obtained by the depth sensor 113 are also included in the sensor data.
  • The work recognition device 100 recognizes the work by capturing the movement of the body part 121 and the object 131 of the worker 120 based on at least one type of sensor data received from the sensors, and outputs the work recognition result, or a work analysis result using the work recognition result, to at least one of the display 151, the speaker 152, the storage device 153, and the control device 154.
  • the display 151 outputs the work recognition result output from the work recognition device 100 or the work analysis result using the work recognition result as a video or the like.
  • The speaker 152 outputs the work recognition result output from the work recognition apparatus 100, or the work analysis result using the work recognition result, as voice or sound.
  • the storage device 153 stores the work recognition result output from the work recognition device 100 or the work analysis result using the work recognition result.
  • the work recognition device 100 causes the storage device 153 to store the work recognition result for a long time.
  • the work recognition apparatus 100 may store a plurality of work recognition results in the storage device 153.
  • the manager who manages the work of the worker 120 refers to the work recognition result stored in the storage device 153 and performs work analysis or creates a work analysis report.
  • the manager or the like can perform the work analysis later instead of the real-time work recognition.
  • the work recognition result stored in the storage device 153 may be used by other methods.
  • control device 154 performs various controls based on a work recognition result output from the work recognition device 100 or a control signal corresponding to a work analysis result using the work recognition result. Specifically, for example, the control device 154 controls the robot to assist the work of the worker 120 based on the work recognition result and the like. This makes it possible to supply necessary parts or tools to the worker 120 according to the work situation, or to assist the work so that the work delay can be recovered when the work of the worker 120 is slow. This is only an example, and the control device 154 may perform other controls.
  • the display 151, the speaker 152, the storage device 153, and the control device 154 are collectively referred to as an output device.
  • The work recognition system according to the first embodiment includes a display 151, a speaker 152, a storage device 153, and a control device 154 as output devices, as shown in FIG. 1, but is not limited to this; the system only needs to include at least one of them.
  • FIG. 2 is a configuration diagram of the work recognition apparatus 100 according to Embodiment 1 of the present invention.
  • The work recognition apparatus 100 includes a sensor data acquisition unit 210, a body part information acquisition unit 220, an object information acquisition unit 230, an association unit 240, a recognition result analysis unit 250, and an output control unit 260.
  • the sensor data acquisition unit 210 acquires sensor data from at least one of the visible light camera 111, the infrared camera 112, the depth sensor 113, the acceleration sensor 114, and the gyro sensor 115.
  • the sensor data acquisition unit 210 outputs the acquired sensor data to the body part information acquisition unit 220 and the object information acquisition unit 230.
  • the body part information acquisition unit 220 detects the body part 121 of the worker 120 based on the sensor data output by the sensor data acquisition unit 210 and acquires body part information regarding the body part 121 of the worker 120.
  • the body part information acquisition unit 220 causes the body part information storage unit 270 to store the acquired body part information related to the body part 121 of the worker 120.
  • the body part information acquisition unit 220 includes a body part detection unit 221, a body part tracking unit 222, and a body part recognition unit 223.
  • the body part detection unit 221 detects the body part 121 of the worker 120 based on the sensor data output from the sensor data acquisition part 210 and acquires the position coordinates of the part 121.
  • The position coordinates of the part 121 may be, for example, the coordinates of an arbitrary point of the detected body part 121, or the coordinates of the upper right and lower left corners of a rectangle that encloses the body part 121; which point is used as the position coordinates of the body part 121 of the worker 120 can be set as appropriate.
  • the body part detection unit 221 may use the position coordinates of the body part 121 of the worker 120 as two-dimensional coordinates on the acquired image or three-dimensional coordinates estimated using the depth information.
  • the origin and the coordinate system of the position coordinates of the part 121 detected by the body part detection unit 221 are arbitrary.
  • For example, the body part detection unit 221 may use a two-dimensional coordinate system in which the upper left of the acquired video is the origin, the right direction is the X axis, and the downward direction is the Y axis, or a three-dimensional coordinate system in which a specific point in the work area is the origin and the vertically upward direction is the Z axis.
  • The body part detection unit 221 may acquire the position coordinates of the body part 121 of the worker 120 by any method.
  • For example, the body part detection unit 221 may acquire the position coordinates of the body part 121 of the worker 120 by an existing feature-point-based detection method such as SURF (Speeded-Up Robust Features) or HOG (Histograms of Oriented Gradients), or by a model-based detection method such as a neural network.
  • The body part detection unit 221 may employ a detection method using only a single frame of video acquired from the sensor data acquisition unit 210, or a detection method using a plurality of frames of video.
  • The body part detection unit 221 may acquire the position coordinates of the body part 121 of the worker 120 by a detection method using any one of an optical image, an infrared image, a depth image, and the like, or by a detection method using a combination of these.
  • the body part detection unit 221 acquires position coordinates for each detected part 121.
  • the body part detection unit 221 stores the acquired position coordinates of the body part 121 of the worker 120 in the body part information storage unit 270 in association with the acquisition date information.
  • the acquisition date / time information may be information on the acquisition date / time associated with the video frame acquired from the sensor data acquisition unit 210.
  • the body part information storage unit 270 stores the position coordinates and the acquisition date information in association with each part 121.
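  • As a hedged illustration of how such per-part records might be organized (the record layout and names are assumptions, not taken from the patent), the following Python sketch keeps each detected part's position coordinates together with their acquisition date and time:

```python
# Minimal sketch of a per-part store: position coordinates with acquisition times.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class BodyPartRecord:
    part_id: str                                     # e.g. "right_hand" (illustrative)
    trajectory: list = field(default_factory=list)   # list of (timestamp, (x, y)) samples

    def add_sample(self, coords, timestamp=None):
        self.trajectory.append((timestamp or datetime.now(), coords))

# usage sketch: the detection step appends one sample per processed frame
store = {"right_hand": BodyPartRecord("right_hand")}
store["right_hand"].add_sample((412.0, 233.5))
```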
  • the body part tracking unit 222 acquires the locus of the body part 121 of the worker 120 detected by the body part detection unit 221. Specifically, the body part tracking unit 222 tracks the body part 121 of the worker 120 from which the body part detection unit 221 has acquired the position coordinates based on the sensor data acquired from the sensor, and moves the part 121. Get the later position coordinates. The body part tracking unit 222 tracks the position coordinates of the body part 121 of the worker 120 based on the information about the part 121 stored in the body part information storage unit 270 by the body part detection unit 221. When there are a plurality of parts 121 detected by the body part detection unit 221, the body part tracking unit 222 acquires the position coordinates after movement for each of the plurality of parts 121.
  • the body part tracking unit 222 may track the part 121 by any tracking method.
  • For example, the body part tracking unit 222 may track the part 121 using an existing region-based tracking method such as template matching with template updating, active search, the Mean-shift method, or a particle filter, or using a feature-point-based tracking method such as the KLT method (Kanade-Lucas-Tomasi Feature Tracker) or SURF Tracking.
  • The body part tracking unit 222 may track the part 121 by a tracking method using any one of an optical image, an infrared image, a depth image, and the like, or by a tracking method using a combination of these.
  • The body part tracking unit 222 stores the acquired post-movement position coordinates of the body part 121 of the worker 120, that is, the information on the trajectory of the body part 121 of the worker 120, in the body part information storage unit 270 together with the acquisition date and time of the position coordinates, in association with the information on the part 121.
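  • As a hedged illustration of one of the existing methods listed above, the following sketch tracks a region by template matching with template updating; the OpenCV calls and the update threshold are assumptions made for this example, not requirements of the patent.

```python
# Sketch: region-based tracking by template matching, updating the template on confident matches.
import cv2

def track_sequence(frames_gray, initial_bbox, update_threshold=0.6):
    """Track one body part through a list of grayscale frames; returns its (x, y) trajectory."""
    x, y, w, h = initial_bbox
    template = frames_gray[0][y:y + h, x:x + w]
    trajectory = [(x, y)]
    for frame in frames_gray[1:]:
        result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, (x, y) = cv2.minMaxLoc(result)
        trajectory.append((x, y))
        if score > update_threshold:                 # refresh the template only when confident
            template = frame[y:y + h, x:x + w]
    return trajectory
```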
  • The body part recognition unit 223 recognizes the type, shape, or state of the body part 121 of the worker 120 for which the body part detection unit 221 has acquired position coordinates and stored them in the body part information storage unit 270.
  • The type of the body part 121 of the worker 120 refers to, for example, the right hand, the left hand, an elbow, or the face. Note that these are only examples; any designation that identifies the body part 121 of the worker 120 may be used.
  • The shape or state of the body part 121 of the worker 120 is, for example, the shape of the right hand grasping the object 131 when the worker 120 holds the object 131 in the palm of the right hand, or the state in which the worker 120 is holding the object 131.
  • the body part recognition unit 223 may recognize the type, shape, or state of the part 121 by an arbitrary recognition method.
  • For example, the type, shape, or state of the part 121 may be recognized by an existing feature-point-based method such as SURF or HOG, or by a model-based method such as a neural network.
  • The body part recognition unit 223 may recognize the type, shape, or state of the part 121 by a recognition method using any one of an optical image, an infrared image, and a depth image, or by a recognition method using a combination of these.
  • the body part recognition unit 223 stores the information on the recognition result of the part 121 in the body part information storage unit 270 in association with the information on the part 121.
  • Information relating to the part 121 stored in the body part information storage unit 270 by the body part detection unit 221, the body part tracking unit 222, and the body part recognition unit 223 becomes the body part information.
  • the body part information by the body part detection unit 221, the body part tracking unit 222, and the body part recognition unit 223 is stored for each series of operations.
  • the object information acquisition unit 230 detects the object 131 based on the sensor data output from the sensor data acquisition unit 210 and acquires object information related to the object 131.
  • the object information acquisition unit 230 causes the object information storage unit 280 to store the acquired object information regarding the object 131.
  • the object information acquisition unit 230 includes an object detection unit 231, an object tracking unit 232, and an object recognition unit 233.
  • the object detection unit 231 detects the object 131 based on the sensor data output from the sensor data acquisition unit 210 and acquires the position coordinates of the object 131.
  • The position coordinates of the object 131 may be, for example, the coordinates of an arbitrary point of the detected object 131, or the coordinates of the upper right and lower left corners of a rectangle that encloses the object 131; which point is used can be set as appropriate.
  • the object detection unit 231 may use the position coordinates of the object 131 as two-dimensional coordinates on the acquired video or three-dimensional coordinates estimated using depth information.
  • the origin and coordinate system of the position coordinates of the object 131 detected by the object detection unit 231 are arbitrary.
  • For example, the object detection unit 231 may use a two-dimensional coordinate system in which the upper left of the acquired image is the origin, the right direction is the X axis, and the downward direction is the Y axis, or a three-dimensional coordinate system in which a specific point in the work area is the origin and the vertically upward direction is the Z axis.
  • the object detection unit 231 acquires the position coordinates of the object 131 by an arbitrary method.
  • For example, the object detection unit 231 may acquire the position coordinates of the object 131 by an existing feature-point-based detection method such as SURF (Speeded-Up Robust Features) or HOG (Histograms of Oriented Gradients), or by a model-based detection method such as a neural network.
  • The object detection unit 231 may employ a detection method using only a single frame of video acquired from the sensor data acquisition unit 210, or a detection method using a plurality of frames of video.
  • The object detection unit 231 may acquire the position coordinates of the object 131 by a detection method using any one of an optical image, an infrared image, a depth image, and the like, or by a detection method using a combination of these.
  • the object detection unit 231 acquires position coordinates for each detected object 131.
  • the object detection unit 231 stores the acquired position coordinates of the object 131 in the object information storage unit 280 in association with the acquisition date information.
  • the acquisition date / time information may be information on the acquisition date / time associated with the video frame acquired from the sensor data acquisition unit 210.
  • the object detection unit 231 stores the position coordinates and the acquisition date information in the object information storage unit 280 in association with each object 131.
  • the object tracking unit 232 acquires the trajectory of the object 131 detected by the object detection unit 231. Specifically, the object tracking unit 232 tracks the object 131 from which the object detection unit 231 has acquired the position coordinates based on the sensor data acquired from the sensor, and acquires the position coordinates after the movement of the object 131. The object tracking unit 232 tracks the position coordinates of the object 131 based on the information regarding the object 131 stored in the object information storage unit 280 by the object detection unit 231. When there are a plurality of objects 131 detected by the object detection unit 231, the object tracking unit 232 acquires the position coordinates after movement for each of the plurality of objects 131.
  • the object tracking unit 232 may track the object 131 by any tracking method.
  • For example, the object tracking unit 232 may track the object 131 using an existing region-based tracking method such as template matching with template updating, active search, the Mean-shift method, or a particle filter, or using a feature-point-based tracking method such as the KLT method (Kanade-Lucas-Tomasi Feature Tracker) or SURF Tracking.
  • The object tracking unit 232 may track the object 131 by a tracking method using any one of an optical image, an infrared image, a depth image, and the like, or by a tracking method using a combination of these.
  • The object tracking unit 232 stores the acquired post-movement position coordinates of the object 131, that is, the information on the trajectory of the object 131, in the object information storage unit 280 together with the acquisition date and time of the position coordinates, in association with the information on the object 131.
  • The object recognition unit 233 recognizes the type, shape, or state of the object 131 for which the object detection unit 231 has acquired position coordinates and stored them in the object information storage unit 280.
  • the type of the object 131 is, for example, information for specifying the object 131 used when the worker 120 performs the assembly work, and refers to each part to be assembled or an assembly tool.
  • the shape or state of the object 131 means, for example, when the worker 120 performs an assembly operation using a certain component, the direction of the component, a state where the component is incorporated in the board, or the like. Note that this is only an example, and the same applies to the case where the worker 120 performs a work using a tool, for example.
  • the object recognition unit 233 may recognize the type, shape, or state of the object 131 by any recognition method.
  • For example, the type, shape, or state of the object 131 may be recognized by an existing feature-point-based method such as SURF or HOG, or by a model-based method such as a neural network.
  • The object recognition unit 233 may recognize the type, shape, or state of the object 131 by a recognition method using any one of an optical image, an infrared image, and a depth image, or by a recognition method using a combination of these.
  • the object recognition unit 233 stores information on the recognition result of the object 131 in the object information storage unit 280 in association with the information on the object 131.
  • Information relating to the object 131 stored in the object information storage unit 280 by the object detection unit 231, the object tracking unit 232, and the object recognition unit 233 becomes object information.
  • the object information by the object detection unit 231, the object tracking unit 232, and the object recognition unit 233 is stored for each series of operations.
  • The associating unit 240 associates the object 131 with the body part 121 of the worker 120 who performed work using the object 131, based on the body part information about the body part 121 of the worker 120 stored in the body part information storage unit 270 by the body part information acquisition unit 220 and the object information about the object 131 stored in the object information storage unit 280 by the object information acquisition unit 230. The associating unit 240 outputs the association result to the recognition result analysis unit 250.
  • the associating unit 240 includes a score calculating unit 241, a matching unit 242, and a position correcting unit 243.
  • the score calculation unit 241 includes the body part information related to the body part 121 of the worker 120 stored in the body part information storage unit 270 by the body part information acquisition unit 220 and the object information acquisition unit 230 stored in the object information storage unit 280. Using the stored object information regarding the object 131, an association score representing the degree of association between the body part 121 of the worker 120 and the object 131 is calculated. The degree of association between the body part 121 of the worker 120 and the object 131 indicates which worker body part 121 is likely to be associated with the movement of the object 131.
  • The score calculation unit 241 may calculate the association score by any method. For example, the score calculation unit 241 may increase the association score if the position coordinates of the body part 121 of the worker 120 and the object 131 are close to each other, or if the movement direction of the body part 121 of the worker 120 and the movement direction of the object 131 are close. In addition, for example, the score calculation unit 241 may increase the association score if the body part 121 of the worker 120 has a shape that holds the object 131.
  • FIG. 3 is a diagram for explaining an example of an association score calculation method by the score calculation unit 241 in the first embodiment.
  • In FIG. 3, it is assumed that the body part information acquisition unit 220 has detected a part X and a part Y as body parts 121 and acquired their body part information, and that the object information acquisition unit 230 has detected objects A to D as objects 131 and acquired their object information. It is also assumed that the loci of the position coordinates of the parts X and Y and the objects A to D over 10 seconds have been obtained, and FIG. 3 shows the positions over those 10 seconds.
  • For the object A, the score calculation unit 241 sets the association score with the part X, which is closer to the object A, higher than the association score with the part Y.
  • Similarly, for the object B, the score calculation unit 241 sets the association score with the part Y higher. In this way, the score calculation unit 241 calculates, by any method and for each combination of a body part 121 of the worker 120 and an object 131, an association score representing how likely that body part 121 is to be involved in the movement of the object 131. Note that the association score calculation method described with reference to FIG. 3 is merely an example, and the score calculation unit 241 may calculate the association score by other methods; it suffices that the strength of the association for each combination of the working object 131 and the body part 121 of the worker 120 is set in accordance with a preset criterion.
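  • As a hedged sketch of an association score of the kind FIG. 3 illustrates (closer positions and more similar movement directions give a higher score), the following example combines a proximity term and a direction term; the exact formula and weights are assumptions, since the patent leaves the calculation method open.

```python
# Sketch: association score from proximity and movement-direction similarity.
import numpy as np

def association_score(part_traj, obj_traj, w_dist=1.0, w_dir=1.0):
    """part_traj, obj_traj: arrays of shape (T, 2) sampled at the same instants."""
    part = np.asarray(part_traj, dtype=float)
    obj = np.asarray(obj_traj, dtype=float)
    # proximity term: mean distance mapped to (0, 1]; smaller distance -> larger term
    proximity = 1.0 / (1.0 + np.linalg.norm(part - obj, axis=1).mean())
    # direction term: mean cosine similarity of per-frame displacement vectors
    dp, do = np.diff(part, axis=0), np.diff(obj, axis=0)
    norms = np.linalg.norm(dp, axis=1) * np.linalg.norm(do, axis=1)
    valid = norms > 1e-9
    direction = ((dp[valid] * do[valid]).sum(axis=1) / norms[valid]).mean() if valid.any() else 0.0
    return w_dist * proximity + w_dir * max(direction, 0.0)
```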
  • When the locus of the position coordinates of the object 131 is interrupted, the score calculation unit 241 may connect the vanishing point and the detection point of the object 131, interpolate the locus of the position coordinates, and then calculate the association score.
  • FIG. 4 is a diagram for explaining an example of position coordinate locus interpolation and post-interpolation association score calculation operations by the score calculation unit 241 in the first embodiment.
  • Using FIG. 4, an example of the operation in which the score calculation unit 241 interpolates the locus of the position coordinates of the object A, whose locus shown in FIG. 3 is interrupted, and then calculates the association scores between the interpolated object A and the parts X and Y will be described.
  • the score calculation unit 241 interpolates the locus of the position coordinates of the object A by combining the vanishing point and the detection point of the object A as indicated by the dotted line in FIG.
  • the score calculation unit 241 calculates an association score between the object A and the parts X and Y in the interpolated position coordinate trajectory.
  • With respect to the interpolated position coordinates of the object A at three seconds, the part Y is closer than the part X.
  • However, at the vanishing point and the detection point of the position coordinates, that is, at two seconds and at eight seconds, a higher association score with the object A is calculated for the part X. Therefore, for example, the score calculation unit 241 determines that the part X is highly likely to have been used for the work accompanying the movement of the object A even while the locus of the position coordinates is interrupted, and sets the association score with the part X higher than the association score with the part Y.
  • In this way, the score calculation unit 241 can calculate the association score considering not only the positional relationship between the interpolated position coordinates and the surrounding parts 121 while the position coordinates of the object 131 are interrupted, but also the association with the body part 121 of the worker 120 at the vanishing point and the detection point of the object 131, that is, before and after the position coordinates are interrupted.
  • The interpolation of the position coordinate locus and the post-interpolation association score calculation method described with reference to FIG. 4 are merely examples, and the score calculation unit 241 may interpolate the locus of the position coordinates and calculate the post-interpolation association score by other methods.
  • For example, the score calculation unit 241 may increase the association score if the body part 121 of the worker 120 moves in the same direction as the object 131. Specifically, for example, assume that the worker 120 moves a component V with the right hand from point a to point d, via point a → point b → point c → point d. At that time, the object detection unit 231 fails to detect the component V at point b, so the object tracking unit 232 cannot continue tracking the component V; the object detection unit 231 then detects the component V again at point c, where the object tracking unit 232 resumes tracking. In this case, point b is the vanishing point and point c is the detection point.
  • Assume that the object tracking unit 232 can acquire the locus of position coordinates from point a to point b and from point c to point d, but cannot acquire the locus of position coordinates from point b to point c. Further, assume that the object recognition unit 233 cannot recognize whether the object 131 moving from point a to point b and the object 131 moving from point c to point d are the same object 131 or different objects 131.
  • In such a case, the score calculation unit 241 interpolates the locus of the object 131 from point b to point c by, for example, linear interpolation, and if the movement direction of the object 131 along point a → point b → point c → point d matches the movement direction of the right hand, determines that the component V was moved by the right hand from point a → point b → point c → point d and increases the association score between the component V and the right hand. A minimal sketch of this idea is shown below.
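  • The following sketch, under the stated assumptions (linear interpolation of the gap and a cosine-similarity check of the overall movement direction), illustrates the idea; the helper names and the similarity threshold are illustrative, not taken from the patent.

```python
# Sketch: fill the vanishing-point -> detection-point gap, then check direction agreement.
import numpy as np

def interpolate_gap(before_xy, after_xy, n_missing):
    """Linearly interpolate n_missing points strictly between the two known points."""
    before, after = np.asarray(before_xy, float), np.asarray(after_xy, float)
    steps = np.linspace(0.0, 1.0, n_missing + 2)[1:-1]     # exclude the known end points
    return before + steps[:, None] * (after - before)

def direction_matches(obj_traj, hand_traj, threshold=0.8):
    """True if the overall movement directions of the two trajectories are similar."""
    v_obj = np.asarray(obj_traj[-1], float) - np.asarray(obj_traj[0], float)
    v_hand = np.asarray(hand_traj[-1], float) - np.asarray(hand_traj[0], float)
    denom = np.linalg.norm(v_obj) * np.linalg.norm(v_hand)
    return denom > 1e-9 and float(v_obj @ v_hand) / denom >= threshold

# usage sketch: fill the b -> c gap of component V before scoring it against the right hand
gap = interpolate_gap(before_xy=(10, 5), after_xy=(40, 20), n_missing=3)
```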
  • As a result, this association is more likely to be selected when the matching unit 242 determines the combination of the body part 121 of the worker 120 and the object 131, so the discontinuity of the locus from point b to point c can be interpolated well.
  • the matching unit 242 will be described later.
  • When the score calculation unit 241 determines that the object 131 at the vanishing point and the object 131 at the detection point are the same, the score calculation unit 241 may increase the association score between that object 131 and the part 121 associated with it at the vanishing point and the detection point. For example, in the example described above, suppose the object recognition unit 233 recognizes that the object 131 moving from point a to point b and the object 131 moving from point c to point d are both the component V. In this case, the score calculation unit 241 interpolates the locus of position coordinates from point b to point c and increases the association score between the component V and the right hand.
  • If the difference between the position coordinates of the vanishing point and the detection point is equal to or less than a certain value, the score calculation unit 241 may increase a score indicating that the object 131 was not moved by the body part 121 of the worker 120. For example, assume that the object detection unit 231 momentarily cannot detect a component W because the left hand passes over the stationary component W. If the position of the component W has not changed before and after the object detection unit 231 lost sight of the component W, the score calculation unit 241 determines that the component W was not moved by the left hand, increases the score indicating that the component W was not moved by the left hand, and may lower the association score between the component W and the left hand.
  • the score calculation unit 241 is only required to perform the interpolation of the locus of the position coordinates of the object 131 and the setting of the association score after the interpolation in accordance with a preset criterion.
  • the score calculation unit 241 outputs information on the calculated association score to the matching unit 242.
  • The matching unit 242 uses the association scores calculated by the score calculation unit 241 to determine the combinations of the body parts 121 of the worker 120 and the objects 131 so as to maximize the association scores within a range where consistency as work can be maintained. That is, based on the association result in which the body parts 121 of the worker 120 and the objects 131 are associated with each other, each object 131 is combined with the body part 121 of the worker 120 that is determined to be most strongly related to that object 131. The conditions for maintaining consistency as work are arbitrary.
  • For example, the matching unit 242 may use, as conditions for maintaining consistency, that the same body part 121 does not exist at position coordinates separated by a certain value or more at the same time, and that the same object 131 is not associated with a plurality of conflicting operations at the same time.
  • the matching unit 242 determines a combination of the body part 121 and the object 131 of the worker 120 that satisfies the condition and maximizes the sum of the association scores.
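  • The patent does not prescribe a particular optimization method for this step; as one hedged possibility, the combination that maximizes the sum of association scores can be found by treating it as an assignment problem, for example with the Hungarian method available in SciPy, with the work-consistency conditions checked separately.

```python
# Sketch: maximise the total association score by solving an assignment problem.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_parts_to_objects(score_matrix):
    """score_matrix[i][j]: association score between body part i and object j."""
    scores = np.asarray(score_matrix, dtype=float)
    rows, cols = linear_sum_assignment(-scores)      # negate scores to maximise their sum
    return list(zip(rows.tolist(), cols.tolist()))

# usage sketch with parts (X, Y) and objects (A, B)
pairs = match_parts_to_objects([[0.9, 0.4],          # part X: high score with object A
                                [0.5, 0.8]])         # part Y: high score with object B
# -> [(0, 0), (1, 1)]: X is combined with A, Y with B
```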
  • For example, consider the case where the matching unit 242 determines the combinations of the parts X and Y with the objects A to D detected during the 10 seconds from 0 to 10 seconds, as shown in FIG. 3 and FIG. 4.
  • For the object A, the score calculation unit 241 sets a high association score with the part X from 0 to 10 seconds. Therefore, the matching unit 242 determines that the object A was moved by the part X and combines the object A with the part X. For the object B, the position coordinates do not change from 0 to 3 seconds and from 8 to 10 seconds, and the score calculation unit 241 sets a high association score with the part Y from 3 to 8 seconds. Therefore, the matching unit 242 combines the object B with the part Y on the assumption that the object B was set in motion by the part Y at 3 seconds and moved to the position coordinates detected at 8 seconds.
  • Here, assume that the score calculation unit 241, giving priority to the proximity of position coordinates, calculates the association score between the object A and the part Y at 3 seconds to be higher than the association score with the part X. Since the interpolated position of the object A at 3 seconds, based on the data of the object A alone, is closer to the part Y than to the part X as shown in FIG. 4, such a score calculation can occur. In this case, a high association score with the part Y is set for both the object A and the object B at 3 seconds. Therefore, the matching unit 242 determines, within the range where consistency as work can be maintained, whether the part Y should be combined with the object A or with the object B.
  • For example, the matching unit 242 may judge which combination is consistent from the combinations of the objects with the parts X and Y before and after that time.
  • Before and after 3 seconds, the part X has a high association score with the object A; that is, it is more consistent as work to regard the object A as being handled in a series of operations using the part X. Therefore, the matching unit 242 combines the object B with the part Y at 3 seconds.
  • The objects C and D show no movement in position coordinates during the 10 seconds. Although the parts X and Y are detected near the objects C and D around 8 to 10 seconds, the association scores of the parts X and Y with the objects whose operations involve movement are set higher than their scores with the objects C and D, so the parts X and Y are combined with the objects involved in movement.
  • The matching unit 242 only needs to determine, by an appropriate method set in advance, the range within which consistency as work can be maintained and the combinations of the body parts 121 of the worker 120 and the objects 131.
  • the matching unit 242 outputs the determined combination of the body part 121 and the object 131 of the worker 120 to the position correction unit 243.
  • The position correction unit 243 corrects the position coordinates of the body part 121 of the worker 120, the object 131, or both, using the combinations of the body parts 121 of the worker 120 and the objects 131 determined by the matching unit 242.
  • the position correction unit 243 may correct the position coordinates of the body part 121 and / or the object 131 of the worker 120 using any correction method. For example, when the position coordinates of the object 131 are interrupted, the position correction unit 243 may interpolate the position coordinates of the object 131 in association with the locus of the part 121 combined with the object 131. On the contrary, when the locus of the position coordinate of the part 121 is interrupted, the position coordinate of the part 121 may be interpolated in association with the locus of the object 131 combined with the part 121.
  • FIG. 5 is a diagram for explaining an example of the position coordinate correction operation by the position correction unit 243 in the first embodiment.
  • Here, as shown in FIG. 5, an example of the operation in which the position correction unit 243 corrects the position coordinates of the object A, whose locus was interpolated by the score calculation unit 241 and which was combined with the part X by the matching unit 242, during the interrupted interval will be described.
  • The position correction unit 243 corrects the position coordinates of the object A so that they follow the locus of the position coordinates of the part X combined with the object A.
  • As a result, the position coordinates of the object A form a locus similar to that of the part X from 2 to 8 seconds.
  • the operation of the position correction unit 243 described with reference to FIG. 5 is merely an example, and the position correction unit 243 may correct the position coordinates by other methods.
  • Instead of copying the locus as it is, the position correction unit 243 may correct the position coordinates of the part 121 or the object 131 by fitting the copied locus to the start point and the end point so that the start point and the end point of the locus do not become discontinuous.
  • For example, assume that a component Z is moved by the right hand from point e → point f → point g → point h, that the locus of the component Z is interrupted between point f and point g, and that the position coordinates during the interruption are to be interpolated.
  • Suppose the position correction unit 243 interpolates the locus of the position coordinates of the component Z between point f and point g with the locus of the position coordinates of the right hand during the same interval.
  • Since the position coordinates of the component Z and the position coordinates of the right hand do not necessarily coincide, the locus of the position coordinates of the component Z may become unnaturally discontinuous at point f and point g.
  • In such a case, the position correction unit 243 may correct the position coordinates by fitting the copied locus to the positions of the start point and the end point so that the locus of the component Z becomes natural, that is, continuous. A minimal sketch of such a correction is shown below.
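  • One hedged way to perform such a correction: fill the gap in the object's locus by copying the associated hand's displacement, then blend in a linearly growing offset so the filled segment still starts and ends exactly at the detected coordinates. The blending scheme is an assumption; the description only requires that the corrected locus be continuous.

```python
# Sketch: fill an object's gap with the hand's movement, blended to stay continuous at both ends.
import numpy as np

def correct_gap_with_hand(obj_before, obj_after, hand_segment):
    """
    obj_before, obj_after: last / first detected object coordinates around the gap.
    hand_segment: (N, 2) hand coordinates covering the gap, including both ends.
    Returns (N, 2) corrected object coordinates for the gap.
    """
    hand = np.asarray(hand_segment, dtype=float)
    copied = np.asarray(obj_before, float) + (hand - hand[0])   # copy the hand's displacement
    residual = np.asarray(obj_after, float) - copied[-1]        # mismatch at the end of the gap
    alpha = np.linspace(0.0, 1.0, len(hand))[:, None]           # 0 at the start, 1 at the end
    return copied + alpha * residual                            # continuous at both ends

corrected = correct_gap_with_hand(
    (10, 5), (42, 18),
    hand_segment=[(100, 50), (110, 55), (125, 60), (130, 62)])
```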
  • In this way, even when the locus of the position coordinates of the body part 121 of the worker 120 or the object 131 is interrupted, the locus can be interpolated in consideration of the association between the body part 121 of the worker 120 and the object 131.
  • In the technique of Patent Document 1, the left-hand or right-hand coordinates that cannot be obtained are estimated only from the coordinates of the location where they disappeared, without considering the correspondence between the object and the body part of the worker; compared with such a technique, the locus can be interpolated with higher accuracy.
  • the position correction unit 243 outputs the corrected position coordinates of the body part 121 and the object 131 of the worker 120 and information on the combination result to the recognition result analysis unit 250 as association information.
  • Note that the position correction unit 243 may also output the position coordinates before correction.
  • The recognition result analysis unit 250 obtains the recognition result of the entire work over the series of operations based on the association information output by the associating unit 240. Further, the recognition result analysis unit 250 analyzes the work based on the recognition result of the entire work.
  • the recognition result of the entire work refers to association information in which the object 131 and the part 121 that performed the work using the object 131 are associated with each other, and is information indicating a series of work flows. Then, the recognition result analysis unit 250 outputs to the output control unit 260 information regarding the recognition result of the entire work or the analysis result obtained by analyzing the recognition result.
  • the recognition result analysis unit 250 includes a success / failure determination unit 251, a type identification unit 252, a content comparison unit 253, and a condition determination unit 254.
  • The success / failure determination unit 251 determines whether or not the prescribed work has been performed, using the association information between the body part 121 of the worker 120 and the object 131 determined by the association unit 240. Specifically, for example, a work list in which the work to be performed by the worker 120 using each object 131 is registered in advance is stored, and based on the stored work list and the association information, the success / failure determination unit 251 determines whether the worker 120 has performed the work to be performed. A minimal sketch of such a check is shown below.
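  • In this sketch (the record format and the work list entries are illustrative assumptions), the check compares the recognized sequence of object / body part associations against the pre-registered work list:

```python
# Sketch: did the recognised associations cover the registered work steps, in order?
EXPECTED_WORK = [("screw_1", "right_hand"), ("cover", "left_hand")]   # registered in advance

def prescribed_work_done(association_info, expected=EXPECTED_WORK):
    """association_info: recognised (object, body part) pairs in the order they were performed."""
    it = iter(association_info)
    # every expected step must appear in order; extra recognised steps are tolerated here
    return all(step in it for step in expected)

print(prescribed_work_done([("screw_1", "right_hand"), ("tool", "right_hand"),
                            ("cover", "left_hand")]))   # -> True
```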
  • the success / failure determination unit 251 may determine whether or not the prescribed work has been performed by another method.
  • the success / failure determination unit 251 outputs a determination result on whether or not the prescribed work has been performed to the output control unit 260.
  • The type identification unit 252 identifies which of the prescribed work types the performed work corresponds to, using the association information between the body part 121 of the worker 120 and the object 131 determined by the associating unit 240. Specifically, for example, work type information in which, for each work type, the procedure of which object 131 is used and what operation is performed is registered and classified is stored in advance, and based on the stored work type information and the association information, the type identification unit 252 identifies which of the prescribed work types the work performed by the worker 120 corresponds to. Note that this is merely an example, and the type identification unit 252 may identify the corresponding work type by other methods.
  • the type identifying unit 252 outputs to the output control unit 260 an identification result indicating which of the prescribed work types corresponds.
  • The content comparison unit 253 analyzes the content of the work performed by the worker 120 by comparing it with different work, using the association information between the body part 121 of the worker 120 and the object 131 determined by the associating unit 240. Specifically, for example, the work of another worker who serves as a model is stored in advance as a work history, and based on the stored work history and the association information, the content comparison unit 253 compares the work contents and determines, for example, whether there is missing work, whether there is wasted work, and whether the work is slower or faster than that of the other worker. This is merely an example, and the content comparison unit 253 may compare the work contents by other methods.
  • the content comparison unit 253 outputs the work content comparison result to the output control unit 260.
  • The condition determination unit 254 determines whether or not the work content satisfies a specified condition, using the association information between the body part 121 of the worker 120 and the object 131 determined by the association unit 240. Specifically, based on the work type information stored in advance as described above and the association information, the condition determination unit 254 determines, for example, whether the time taken by the worker 120 for the work is too long. The conditions can be set as appropriate, and the condition determination unit 254 may determine whether other conditions are satisfied, as in the sketch below.
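  • As one such condition (the threshold value and parameter names are illustrative assumptions), the duration of a recognized piece of work could simply be compared against a limit:

```python
# Sketch: flag work whose duration exceeds a specified limit.
def work_too_slow(start_time_s, end_time_s, limit_seconds=30.0):
    """Return True if the work took longer than the specified condition allows."""
    return (end_time_s - start_time_s) > limit_seconds

print(work_too_slow(start_time_s=3.0, end_time_s=41.5))   # -> True: exceeded the 30 s limit
```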
  • The recognition result analysis unit 250 includes the success / failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254 as shown in FIG. 2, but is not limited to this; the recognition result analysis unit 250 only needs to include at least one of the success / failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254.
  • The recognition result analysis unit 250 outputs, to the output control unit 260, the work analysis results produced by the success / failure determination unit 251, the type identification unit 252, the content comparison unit 253, or the condition determination unit 254 using the recognition result of the entire work, together with the association information output by the associating unit 240, that is, the information on the recognition result of the entire work.
  • the output control unit 260 outputs the recognition result of the entire work output by the recognition result analysis unit 250 to at least one of the output devices. In addition, when receiving an analysis result output instruction, the output control unit 260 outputs a work analysis result using the recognition result of the entire work output by the recognition result analysis unit 250 to at least one of the output devices.
  • The user can specify the analysis result to be output by using an input device (not shown); an instruction receiving unit (not shown) receives the designation from the user as an analysis result output instruction and gives the output instruction to the recognition result analysis unit 250.
  • The output control unit 260 outputs the work analysis result output from the recognition result analysis unit 250 to at least one of the output devices. Specifically, for example, when information on the determination result indicating whether or not the prescribed work has been performed is output from the success / failure determination unit 251 of the recognition result analysis unit 250, the output control unit 260 causes the display 151 to display the determination result, for example with a circle (○) and a cross (×).
  • the output control unit 260 may cause the speaker 152 to output a sound corresponding to the determination result, or may cause the storage device 153 to store the determination result.
  • the output control unit 260 may transmit different control signals to the control device 154 according to the determination result.
  • When information on the work type is output from the type identification unit 252 of the recognition result analysis unit 250, the output control unit 260 causes the display 151 to display a video indicating the work type. Alternatively, the output control unit 260 may cause the speaker 152 to output a voice corresponding to the work type, or may cause the storage device 153 to store the work type. The output control unit 260 may also transmit different control signals to the control device 154 according to the work type.
  • When information on the comparison result of the work is output from the content comparison unit 253 of the recognition result analysis unit 250, the output control unit 260 causes the display 151 to display a video indicating the comparison result of the work.
  • Alternatively, the output control unit 260 may cause the speaker 152 to output a voice or sound indicating, for example, the missing work, or may cause the storage device 153 to store the comparison result.
  • the output control unit 260 may transmit different control signals to the control device 154 according to the comparison result.
  • For example, when the condition determination unit 254 of the recognition result analysis unit 250 outputs information on its determination result, the output control unit 260 causes the display 151 to display a message indicating, for example, that the work is taking too long.
  • the output control unit 260 may cause the speaker 152 to output a sound or sound indicating that the work is too slow, or may cause the storage device 153 to store the determination result.
  • the output control unit 260 may transmit different control signals to the control device 154 according to the comparison result.
  • the body part information storage unit 270 stores body part information regarding the body part 121 of the worker 120 acquired by the body part information acquisition unit 220.
  • the object information storage unit 280 stores object information related to the object 131 acquired by the object information acquisition unit 230.
  • In this example, the body part information storage unit 270 and the object information storage unit 280 are provided in the work recognition apparatus 100; however, they are not limited to this and may be provided outside the work recognition apparatus 100 at a location that the work recognition apparatus 100 can refer to.
  • The body part information storage unit 270 and the object information storage unit 280 may also be configured as a single storage medium.
  • FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the work recognition apparatus 100 according to Embodiment 1 of the present invention.
  • The functions of the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 are realized by a processing circuit 601. That is, the work recognition apparatus 100 includes the processing circuit 601 for performing processing for recognizing the work performed by the worker 120 from the acquired sensor data and for performing control for outputting the recognition result and the like.
  • the processing circuit 601 may be dedicated hardware as shown in FIG. 6A or may be a CPU (Central Processing Unit) 606 that executes a program stored in the memory 607 as shown in FIG. 6B.
  • When the processing circuit 601 is dedicated hardware, the processing circuit 601 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • When the processing circuit is the CPU 606, each function of the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 is realized by software, firmware, or a combination of software and firmware. That is, the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 are realized by a processing circuit such as the CPU 606 or a system LSI (Large-Scale Integration) that executes programs stored in the HDD (Hard Disk Drive) 602, the memory 607, and the like.
  • It can also be said that the programs stored in the HDD 602, the memory 607, and the like cause a computer to execute the procedures and methods of the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260.
  • The memory 607 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • Note that the functions of the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 may be realized partly by dedicated hardware and partly by software or firmware.
  • For example, the function of the body part information acquisition unit 220 can be realized by the processing circuit 601 as dedicated hardware, while the functions of the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 can be realized by the processing circuit reading and executing programs stored in the memory 607.
  • the work recognition apparatus 100 also includes a communication device 605 that communicates with external devices such as a visible light camera 111, an infrared camera 112, a depth sensor 113, an acceleration sensor 114, a gyro sensor 115, a display 151, and a speaker 152.
  • The sensor data acquisition unit 210 constitutes an input interface device 603 that acquires sensor data from the visible light camera 111, the infrared camera 112, the depth sensor 113, the acceleration sensor 114, the gyro sensor 115, and the like using the communication device 605.
  • The work recognition apparatus 100 also includes an output interface device 604 for outputting work recognition results and the like.
  • the output control unit 260 communicates with the display 151, the speaker 152, and the like using the communication device 605, and outputs work recognition results and the like to the display 151, the speaker 152, and the like using the output interface device 604.
  • the body part information storage unit 270 and the object information storage unit 280 use, for example, the HDD 602. This is only an example, and the body part information storage unit 270 and the object information storage unit 280 may be configured by a DVD, a memory 607, and the like. Further, the work recognition device 100 may be configured to have an auxiliary storage device (not shown) for storing work recognition results and the like.
  • FIG. 7 is a diagram illustrating an example of a hardware configuration of the control device 154 according to the first embodiment.
  • The control device 154 includes a communication device 703 that communicates with the work recognition device 100, and receives the work recognition result and the like transmitted by the work recognition device 100 via the communication device 605 using the communication device 703.
  • the control device 154 includes a CPU 701 that generates a command signal for operating a driving device (not shown) such as a servo.
  • the control device 154 includes a memory 702 that temporarily stores a work recognition result received using the communication device 703 and a command signal for operating a driving device such as a servo.
  • the communication device 703 receives a work recognition result from the work recognition device 100 and transmits a command signal for operating a drive device such as a servo.
  • FIG. 8 is a flowchart for explaining the operation of the work recognition apparatus 100 according to Embodiment 1 of the present invention.
  • Sensor data acquisition section 210 acquires sensor data from at least one of visible light camera 111, infrared camera 112, depth sensor 113, acceleration sensor 114, and gyro sensor 115 (step ST801).
  • the sensor data acquisition unit 210 outputs the acquired sensor data to the body part information acquisition unit 220 and the object information acquisition unit 230.
  • Body part information acquisition section 220 acquires body part information related to body part 121 of worker 120 based on the sensor data output from sensor data acquisition section 210 (step ST802).
  • FIG. 9 is a flowchart illustrating details of the operation of the body part information acquisition section 220 in step ST802 of FIG. 8.
  • the body part detection unit 221 of the body part information acquisition unit 220 acquires the already acquired body part information from the body part information storage unit 270 (step ST901).
  • Based on the body part information acquired from the body part information storage unit 270, the body part detection unit 221 determines, for each body part 121 of the worker 120, whether the part was detected in the previous frame of the optical image, infrared image, or depth image included in the sensor data output from the sensor data acquisition unit 210 (step ST902).
  • The body part detection unit 221 may identify the previous frame from the position-information acquisition date and time included in the body part information and the current date and time.
  • When the part 121 has not been detected in the previous frame (in the case of "NO" in step ST902), the body part detection unit 221 detects the body part 121 of the worker 120 based on the sensor data output by the sensor data acquisition unit 210.
  • When the body part 121 of the worker 120 can be detected ("YES" in step ST904), the body part detection unit 221 acquires the position coordinates of the part 121 (step ST907).
  • The body part detection unit 221 stores the acquired position coordinate information of the body part 121 of the worker 120 in the body part information storage unit 270 in association with the acquisition date and time information.
  • When the body part 121 of the worker 120 cannot be detected ("NO" in step ST904), the process proceeds to step ST803 in FIG. 8.
  • When the part 121 has been detected in the previous frame (in the case of "YES" in step ST902), the body part tracking unit 222 of the body part information acquisition unit 220 tracks the body part 121 of the worker 120 whose position coordinates the body part detection unit 221 acquired in the previous frame (step ST905). When the body part 121 of the worker 120 can be tracked (in the case of "YES" in step ST906), the body part tracking unit 222 acquires the position coordinates after the movement (step ST907).
  • The body part tracking unit 222 stores the acquired position coordinate information in the body part information storage unit 270 in association with the acquisition date and time information. If the body part 121 of the worker 120 cannot be tracked (in the case of "NO" in step ST906), the process proceeds to step ST803 in FIG. 8.
  • The body part recognition unit 223 recognizes the type, shape, or state of the body part 121 of the worker 120 whose position coordinates the body part detection unit 221 has acquired and stored in the body part information storage unit 270 (step ST908).
  • The series of processing from step ST901 to step ST908 shown in FIG. 9 is the body part information acquisition processing performed by the body part information acquisition unit 220 for one frame of the optical image, infrared image, or depth image included in the sensor data output from the sensor data acquisition unit 210.
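  • As a rough sketch of the per-frame branch of FIG. 9 described above (detect a part that was not seen in the previous frame, otherwise track it, then recognize its type, shape, or state), the following Python fragment can be considered. The callables detect_part, track_part, and recognize_part and the in-memory dictionary used as storage are assumptions for this sketch; the patent does not prescribe concrete detection, tracking, or recognition methods.

    from datetime import datetime

    def update_body_part(part_id, frame, store, detect_part, track_part, recognize_part):
        # store maps part_id -> {"coords": (x, y), "time": datetime, "label": str}
        previous = store.get(part_id)             # ST901: information acquired so far
        if previous is None:                      # ST902 "NO": not detected in the previous frame
            coords = detect_part(frame, part_id)  # detect from the current frame
        else:                                     # ST902 "YES": detected before, so track it
            coords = track_part(frame, previous["coords"])
        if coords is None:                        # ST904 / ST906 "NO": nothing found this frame
            return store
        label = recognize_part(frame, coords)     # ST908: type, shape, or state
        store[part_id] = {"coords": coords,       # ST907: position coordinates
                          "time": datetime.now(), # stored with the acquisition date and time
                          "label": label}
        return store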
  • Object information acquisition section 230 acquires object information related to object 131 used by operator 120 for work based on the sensor data output from sensor data acquisition section 210 (step ST803).
  • FIG. 10 is a flowchart illustrating details of the operation of the object information acquisition unit 230 in step ST803 of FIG. 8.
  • the object detection unit 231 of the object information acquisition unit 230 acquires already acquired object information from the object information storage unit 280 (step ST1001).
  • Based on the object information acquired from the object information storage unit 280, the object detection unit 231 determines, for each object 131, whether the object was detected in the previous frame of the optical image, infrared image, or depth image included in the sensor data output by the sensor data acquisition unit 210 (step ST1002). The object detection unit 231 may identify the previous frame from the position-information acquisition date and time included in the object information and the current date and time.
  • When the object 131 has not been detected in the previous frame (in the case of "NO" in step ST1002), the object detection unit 231 detects the object 131 based on the sensor data output from the sensor data acquisition unit 210. If the object 131 can be detected (in the case of "YES" in step ST1004), the object detection unit 231 acquires the position coordinates of the object 131 (step ST1007). Then, the object detection unit 231 stores the acquired position coordinate information of the object 131 in the object information storage unit 280 in association with the acquisition date and time information. When the object 131 cannot be detected (in the case of "NO" in step ST1004), the process proceeds to step ST804 in FIG. 8.
  • When the object 131 has been detected in the previous frame (in the case of "YES" in step ST1002), the object tracking unit 232 of the object information acquisition unit 230 tracks the object 131 whose position coordinates the object detection unit 231 acquired in the previous frame (step ST1005). When the object 131 can be tracked (in the case of "YES" in step ST1006), the object tracking unit 232 acquires the position coordinates of the object 131 after the movement (step ST1007).
  • The object tracking unit 232 stores the acquired position coordinate information in the object information storage unit 280 in association with the acquisition date and time information.
  • When the object 131 cannot be tracked (in the case of "NO" in step ST1006), the process proceeds to step ST804 in FIG. 8.
  • The object recognition unit 233 recognizes the type, shape, or state of the object 131 whose position coordinates the object detection unit 231 has acquired and stored in the object information storage unit 280 (step ST1008).
  • The series of processing from step ST1001 to step ST1008 shown in FIG. 10 is the object information acquisition processing performed by the object information acquisition unit 230 for one frame of the optical image, infrared image, or depth image included in the sensor data output from the sensor data acquisition unit 210.
  • The control unit (not shown) of the work recognition apparatus 100 determines whether the work by the worker 120 has ended (step ST804). For example, the control unit may determine that the work has ended when the worker 120 can no longer be detected in the video acquired from the sensor. Alternatively, the control unit may determine that the work has ended when the worker 120 inputs a work-end notification from an input device (not shown). These are merely examples; it is sufficient that the control unit can determine by some means whether the work of the worker 120 has ended.
  • If it is determined in step ST804 that the work has not ended (in the case of "NO" in step ST804), the process returns to step ST801, and the processing of steps ST801 to ST803 is repeated.
  • If it is determined in step ST804 that the work has ended (in the case of "YES" in step ST804), the associating unit 240 associates, for the optical image, infrared image, or depth image of the sensor data acquired in step ST801 up to the end of the work, each object 131 with the body part 121 of the worker 120 who performed the work using that object 131 (step ST805).
  • Specifically, the associating unit 240 associates each object 131 with the body part 121 of the worker 120 based on the body part information regarding the body part 121 of the worker 120 stored in the body part information storage unit 270 by the body part information acquisition unit 220 in step ST802 and the object information regarding the object 131 stored in the object information storage unit 280 by the object information acquisition unit 230 in step ST803.
  • The associating unit 240 outputs, to the recognition result analysis unit 250, the result of associating the object 131 with the body part 121 of the worker 120 who performed the work using the object 131.
  • FIG. 11 is a flowchart illustrating details of the operation of the associating unit 240 in step ST805 of FIG. 8.
  • The score calculation unit 241 of the association unit 240 calculates, for each combination of a body part 121 of the worker 120 and an object 131, an association score indicating how likely that body part 121 is to be related to the movement of that object 131 (step ST1101).
  • The score calculation unit 241 calculates the association score using the body part information regarding the body part 121 of the worker 120 stored in the body part information storage unit 270 by the body part information acquisition unit 220 and the object information regarding the object 131 stored in the object information storage unit 280 by the object information acquisition unit 230.
  • That is, for each combination of a body part 121 of the worker 120 and an object 131, the score calculation unit 241 calculates an association score representing the degree of association between the body part 121 of the worker 120 and the object 131.
  • the score calculation unit 241 outputs the calculated association score information to the matching unit 242 of the association unit 240.
  • Using the association scores calculated by the score calculation unit 241, the matching unit 242 determines the combinations of body parts 121 of the worker 120 and objects 131 that maximize the association score within a range in which the matching is consistent as a work (step ST1102).
  • The matching unit 242 outputs the determined combinations of the body parts 121 of the worker 120 and the objects 131 to the position correction unit 243 of the association unit 240.
  • Using the combinations of body parts 121 and objects 131 of the worker 120 determined by the matching unit 242, the position correction unit 243 corrects the position coordinates of the body part 121 of the worker 120, the object 131, or both (step ST1103).
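  • The patent does not fix a concrete score function or matching algorithm for steps ST1101 and ST1102 described above. Purely as an illustration, the following sketch assumes an association score based on how closely a body part's trajectory follows an object's trajectory, and a brute-force assignment that maximizes the total score with at most one object per body part; a real implementation might use a different score or an assignment method such as the Hungarian algorithm.

    from itertools import permutations
    from math import dist  # Euclidean distance, Python 3.8+

    def association_score(part_traj, obj_traj):
        # Higher when the body part stays close to the object while it moves.
        # part_traj / obj_traj: lists of (x, y) positions per frame (assumed format).
        n = min(len(part_traj), len(obj_traj))
        if n == 0:
            return 0.0
        avg_gap = sum(dist(part_traj[i], obj_traj[i]) for i in range(n)) / n
        return 1.0 / (1.0 + avg_gap)

    def match_parts_to_objects(part_trajs, obj_trajs):
        # part_trajs / obj_trajs: dicts mapping an id to a trajectory.
        # Returns the part -> object assignment with the highest summed score.
        parts, objects = list(part_trajs), list(obj_trajs)
        best, best_score = {}, float("-inf")
        for perm in permutations(objects, min(len(parts), len(objects))):
            score = sum(association_score(part_trajs[p], obj_trajs[o])
                        for p, o in zip(parts, perm))
            if score > best_score:
                best, best_score = dict(zip(parts, perm)), score
        return best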
  • the recognition result analysis unit 250 analyzes the recognition result of the entire work based on the association result output by the association unit 240 (step ST806).
  • the recognition result analysis unit 250 outputs to the output control unit 260 information regarding the recognition result of the entire work, which is the association information output from the association unit 240, or the analysis result obtained by analyzing the recognition result.
  • the output control unit 260 outputs the recognition result of the entire work output by the recognition result analysis unit 250 to at least one of the output devices.
  • When the output control unit 260 receives an analysis result output instruction, the output control unit 260 outputs the work analysis result, obtained using the recognition result of the entire work output from the recognition result analysis unit 250, to at least one of the output devices (step ST807).
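  • Putting steps ST801 to ST807 together, the overall flow of FIG. 8 can be outlined roughly as the following loop. Every argument is an assumed callable standing in for the processing of the corresponding unit; none of these names comes from the patent itself.

    def run_work_recognition(get_sensor_frame, work_finished,
                             acquire_body_parts, acquire_objects,
                             associate, analyze, output):
        body_store, object_store = {}, {}
        while True:
            frame = get_sensor_frame()                       # ST801: acquire sensor data
            acquire_body_parts(frame, body_store)            # ST802: FIG. 9 processing
            acquire_objects(frame, object_store)             # ST803: FIG. 10 processing
            if work_finished():                              # ST804: end of work?
                break
        association = associate(body_store, object_store)    # ST805: FIG. 11 processing
        result = analyze(association)                        # ST806: recognize the whole work
        output(result)                                       # ST807: display / store / control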
  • FIGS. 12 to 14 are diagrams for explaining output examples of the recognition result of the entire work or the work analysis result by the output control unit 260 in the first embodiment. FIGS. 12 to 14 show examples in which the output control unit 260 outputs the recognition result of the entire work or the work analysis result to the display 151 as video.
  • FIG. 12 is a diagram illustrating an output example in which, based on the result of the recognition result analysis unit 250 recognizing an assembly work performed by moving the objects 131 with the left and right hands in the first embodiment, the position coordinates of the objects 131 and of the left and right hands, and the timings at which the objects were moved, are displayed on the display 151.
  • On the left side of the drawing of FIG. 12, the three-dimensional position coordinates of each object 131 in a certain frame are converted into two-dimensional position coordinates viewed from the vertical direction, and an icon of each object 131 is displayed.
  • Here, a certain frame means an arbitrary frame; FIG. 12 shows, as an example, the frame at the end of the work of moving the object 131 with the left hand.
  • On the right side of the drawing of FIG. 12, the timings at which the objects 131 are associated with the left and right hands are displayed, with the time axis taken in the vertical direction of the drawing and 0 as the work start time.
  • For example, regarding the work performed by the worker 120, the output control unit 260 can display the recognition result at a certain time (see the left side of the drawing in FIG. 12) together with the recognition result of the flow of the entire work over time (see the right side of the drawing in FIG. 12).
  • Upon receiving the information regarding the recognition result of the work of the worker 120 from the recognition result analysis unit 250, the output control unit 260 displays the time interval during which each object 131 was moved together with a colored rectangle indicating the object 131.
  • In FIG. 12, the objects 131 are distinguished not by color coding but by differences in dot density.
  • The left side of the drawing of FIG. 12 displays the work recognition result in a certain frame, and the output control unit 260 can display it in synchronization with the display of the flow of the entire work over time shown on the right side of the drawing of FIG. 12. Specifically, for example, when the user designates a certain time in the time-axis direction of the work analysis result displayed on the right side of the drawing of FIG. 12, the output control unit 260 can update the display on the left side of the drawing of FIG. 12 so that the work recognition result in the frame at the designated time is displayed. Thereby, the work content can be visualized with higher accuracy.
  • In FIG. 12, it is assumed that the work recognition result about 12 seconds after the start of the work has been designated for display.
  • The output control unit 260 may indicate the correspondence by connecting, with a line, the icon of an object 131 displayed on the left side of the drawing of FIG. 12 and the rectangle indicating the same object 131 displayed on the right side of the drawing of FIG. 12.
  • In FIG. 12, two objects m (m1 and m2 in FIG. 12), two objects n (n1 and n2 in FIG. 12), and one object o are placed on the desk.
  • The trajectory x indicates that the worker 120 is holding the object m1 with the left hand (L in FIG. 12) and moving it to the right of the screen.
  • The object o happens to lie on the trajectory x along which the object m1 is moved, irrespective of the work of the worker 120.
  • The reason why the object o is not shown in the display on the right side of the drawing is that the object o is not related to the work of the worker 120: the worker 120 has moved it with neither the right hand nor the left hand. The trajectory y in FIG. 12 is a trajectory along which the worker 120 moved the object m2 slightly before the time shown on the left side of the drawing; looking at the right side of the drawing, it can be seen that the object m2 was moved with the right hand around 10 seconds after the start of the work. In FIG. 12, it is assumed that the trajectories x and y remain displayed for several seconds after the movement of the objects m is completed and then disappear. The time for which the trajectories x and y remain displayed can be set as appropriate.
  • In response to the recognition result, the output control unit 260 displays the left-hand icon over the icon of the corresponding object 131. This is only an example, and the same applies when an object 131 is associated with the right hand, or when body parts other than the left and right hands are used.
  • The output control unit 260 displays the trajectory of the position coordinates of the object 131 as a line. The output control unit 260 may keep displaying the trajectory line, or may delete it after a certain period of time, that is, stop displaying it. Further, the output control unit 260 may display the three-dimensional position coordinates of the object 131 and of the body part 121 of the worker 120 three-dimensionally as they are.
  • In FIG. 12, how the objects 131 are associated with the left and right hands (left side of the drawing in FIG. 12) and at which timings the objects 131 are associated with the left and right hands (right side of the drawing in FIG. 12) are displayed together.
  • However, the present invention is not limited to this, and only one of the display on the left side of the drawing in FIG. 12 and the display on the right side of the drawing in FIG. 12 may be shown.
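  • The FIG. 12 style display described above involves two ingredients: projecting three-dimensional positions onto a two-dimensional top-down view, and listing, per hand, the time intervals during which an object was associated with that hand. A minimal sketch under assumed data formats follows; the patent does not specify the coordinate axes or the record layout.

    def top_down(p3d):
        # Project a 3D point (x, y, z) onto the plane seen from vertically above,
        # assuming z is the vertical axis.
        x, y, _z = p3d
        return (x, y)

    def association_timeline(associations):
        # associations: list of (time_sec, hand, object_id) samples (assumed format).
        # Returns {hand: [(object_id, start, end), ...]}, merging consecutive
        # samples of the same object into one interval, as on the right side of FIG. 12.
        timeline = {}
        for t, hand, obj in sorted(associations):
            spans = timeline.setdefault(hand, [])
            if spans and spans[-1][0] == obj:
                spans[-1] = (obj, spans[-1][1], t)   # extend the current interval
            else:
                spans.append((obj, t, t))            # start a new interval
        return timeline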
  • FIG. 13 is an output example in which, in the first embodiment, the recognition result analysis unit 250 analyzes the work content based on the result of recognizing the work of moving and assembling the objects 131 with the left and right hands, and the output control unit 260 displays the analysis result on the display 151.
  • In FIG. 13, the output control unit 260 displays information about the analysis result that the recognition result analysis unit 250 produces using the information of the work recognition result. Instead of displaying the object movement timings as shown on the right side of the drawing in FIG. 12, information regarding the analysis result as shown in FIG. 13 may be displayed.
  • In FIG. 13, it is assumed that the worker 120 has performed the following work: the worker 120 places the two objects n3 and n4 on the base with the right hand (R in FIG. 13) or the left hand (L in FIG. 13), respectively; then places the objects m3 and m4 on n3 and n4, respectively, with the right hand or the left hand; and finally, with the right hand, places the object q on the objects m3 and m4 so as to bridge m3 and m4.
  • In the above-described work, the recognition result analysis unit 250 calculates the time intervals during which the objects n3, n4, m3, m4, and q were moved by the left and right hands.
  • The output control unit 260 displays the calculation result.
  • The calculation of the total time for which the objects 131 were moved with the left and right hands and of the time from the start to the end of the work may be performed by any of the success / failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254 of the recognition result analysis unit 250.
  • The recognition result analysis unit 250 may calculate, from the times associated with the video frames, the total time for which the objects 131 were moved by the left and right hands and the time from the start to the end of the work.
  • The recognition result analysis unit 250 also calculates the sum of the distances over which the objects n3, n4, m3, m4, and q were moved with the left and right hands, here 254 cm, and the output control unit 260 displays the total distance calculated by the recognition result analysis unit 250.
  • The total distance may be calculated from the position coordinates by any of the success / failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254 of the recognition result analysis unit 250.
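  • For illustration only, the total moved distance and the total moving time mentioned above could be computed from stored, time-stamped position coordinates along the following lines; the data layout (positions in centimetres, intervals in seconds) is an assumption, not something the patent prescribes.

    from math import dist

    def total_distance(trajectory):
        # trajectory: list of (x, y) or (x, y, z) positions, e.g. in centimetres.
        # Sums the straight-line distance between consecutive positions.
        return sum(dist(a, b) for a, b in zip(trajectory, trajectory[1:]))

    def total_moving_time(intervals):
        # intervals: list of (start_sec, end_sec) during which an object was moved.
        return sum(end - start for start, end in intervals)

    # Example: total_distance([(0, 0), (30, 40)]) == 50.0
    #          total_moving_time([(2, 5), (7, 10)]) == 6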
  • In FIG. 13, the success / failure determination unit 251 determines that the prescribed work has been performed in the prescribed order, the content comparison unit 253 determines that the wrong object 131 was not used and that no unnecessary work was included, and the condition determination unit 254 determines that the work was not slower than the specified time; receiving these results, the output control unit 260 displays "○ Good!".
  • the output control unit 260 may display the analysis result in accordance with the appropriately set conditions.
  • the determination results of the success / failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254 may be individually displayed.
  • For example, upon receiving the analysis result, the output control unit 260 may display the heading "Assembly work W1".
  • The output control unit 260 may display "○ (success)" in response to the analysis result.
  • The output control unit 260 may also display "Component Z assembly not performed" in response to the analysis result.
  • FIG. 14 is also an output example in which the recognition result analysis unit 250 analyzes the work content based on the result of recognizing the work of moving and assembling the objects 131 with the left and right hands, and the output control unit 260 displays the analysis result on the display 151.
  • While FIG. 13 shows an example in which the recognition result analysis unit 250 determines that the work has been performed as prescribed, FIG. 14 shows an example in which the recognition result analysis unit 250 determines that the work has not been performed as prescribed.
  • Specifically, FIG. 14 shows an example in which the content comparison unit 253 of the recognition result analysis unit 250 compares the work with a reference work and determines that the work was performed using the wrong object 131 of a different color, and in which the condition determination unit 254 of the recognition result analysis unit 250 determines that the work using a certain object 131 was too slow.
  • In FIG. 14, it is assumed that the worker 120 performed the work in the following order: moving the object n5 with the right hand (R in FIG. 14), then moving the object r with the left hand (L in FIG. 14), then moving the object m5 with the right hand, and then moving the object m6 with the left hand.
  • In the step of moving the object r with the left hand, the worker 120 should have moved the object n6 with the left hand, but mistakenly moved the object r, which has a different color.
  • By comparison with the reference work, the content comparison unit 253 of the recognition result analysis unit 250 determines that the work was performed using the wrong object r of a different color.
  • Receiving from the content comparison unit 253 the analysis result that the work was performed using the object r of a different color, the output control unit 260 displays "× The color is different" at the position indicating that work so that the wrong work can be recognized. In addition, receiving from the condition determination unit 254 the analysis result that the work using the object m5 was too slow, the output control unit 260 displays a message such as "Work is slow" at the position indicating the corresponding work so that the work determined to be too slow can be found. The display shown in FIG. 14 is merely an example; it is sufficient for the output control unit 260 to display, on the display 151 based on a preset method, the wrong work and the content of the mistake, or the fact that the work time is longer than the reference time, in a recognizable manner.
  • As described above, the output control unit 260 appropriately displays, on the display 151, the work recognition result output from the recognition result analysis unit 250 or the analysis result information obtained by analyzing the work recognition result. Thereby, the worker 120 and others can grasp the work content.
  • In the above description, the output control unit 260 displays the work recognition result or the analysis result information output from the recognition result analysis unit 250 on the display 151; however, the output of the work recognition result or the analysis result information is not limited to this.
  • For example, the output control unit 260 can output the work recognition result or the analysis result information output from the recognition result analysis unit 250 from the speaker 152 as sound or voice, such as outputting a preset sound indicating that the prescribed work has not been performed.
  • The output control unit 260 can also store, in the storage device 153, information related to the work that is determined not to have been performed as prescribed, together with the determination result. By storing work result information in the storage device 153, the worker 120 or the like can analyze, from the stored data, the cause of the determination that the prescribed work was not performed. Further, the output control unit 260 may transmit, to the control device 154, a control signal indicating that it has been determined that the prescribed work was not performed. The control device 154 receives the control signal and performs control that helps the prescribed work to be performed, for example, downloading a manual and providing it to the work recognition device 100. As described above, the output control unit 260 can transmit a control signal corresponding to the analysis result of the recognition result analysis unit 250 to the control device 154 so that the control device 154 performs control according to the analysis result.
  • As described above, according to the first embodiment, the work recognition apparatus 100 includes: the sensor data acquisition unit 210 that acquires sensor data; the body part information acquisition unit 220 that detects the body part 121 of the worker 120 based on the sensor data acquired by the sensor data acquisition unit 210 and acquires body part information regarding the body part 121; the object information acquisition unit 230 that detects the object 131 based on the sensor data and acquires object information regarding the object 131; the association unit 240 that associates the object 131 with the body part 121 of the worker 120 who performed the work using the object 131; and the recognition result analysis unit 250 that recognizes the work performed by the worker 120 based on the association information regarding the association result associated by the association unit 240.
  • Because the correspondence between the object 131 and the body part 121 of the worker 120 is taken into consideration, the work of the worker 120 can be recognized with high accuracy even when, compared with the conventional technique that obtains the movement trajectory of a body part 121 or an object 131 alone, the position of the body part 121 or the object 131 of the worker 120 cannot be estimated in part, or the estimation error between the estimated movement trajectory and the actual movement trajectory of the part 121 or the object 131 is large.
  • any component of the embodiment can be modified or any component of the embodiment can be omitted within the scope of the invention.
  • The work recognition apparatus according to the present invention is configured to recognize, with high accuracy, the work content of a worker in a series of flows involving the body part of the worker and the object; therefore, the present invention can be applied to a work recognition device that recognizes the work content of a worker.
  • 100 work recognition device, 111 visible light camera, 112 infrared camera, 113 depth sensor, 114 acceleration sensor, 115 gyro sensor, 120 worker, 121 body part, 131 object, 151 display, 152 speaker, 153 storage device, 154 control device, 210 sensor data acquisition unit, 220 body part information acquisition unit, 221 body part detection unit, 222 body part tracking unit, 223 body part recognition unit, 230 object information acquisition unit, 231 object detection unit, 232 object tracking unit, 233 object recognition unit, 240 association unit, 241 score calculation unit, 242 matching unit, 243 position correction unit, 250 recognition result analysis unit, 251 success / failure determination unit, 252 type identification unit, 253 content comparison unit, 254 condition determination unit, 260 output control unit, 270 body part information storage unit, 280 object information storage unit, 601 processing circuit, 602 HDD, 603 input interface device, 604 output interface device, 605, 703 communication device, 606, 701 CPU, 607, 702 memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Quality & Reliability (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Operations Research (AREA)
  • Tourism & Hospitality (AREA)
  • Multimedia (AREA)
  • Educational Administration (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Game Theory and Decision Science (AREA)
  • General Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • General Factory Administration (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present invention comprises: a sensor-data acquisition unit (210) that acquires sensor data; a body-part information acquisition unit (220) that detects, on the basis of the sensor data, a body part (121) of a worker (120), and that acquires body-part information related to the body part (121) of the worker (120); an object-information acquisition unit (230) that detects, on the basis of the sensor data, an object (131), and that acquires object information related to the object (131); an association unit (240) that associates, on the basis of the body-part information and the object information, the object (131) and the body part (121) of the worker (120) who has carried out work using the object (131); and a recognition-result analysis unit (250) that recognizes, on the basis of association information related to the association results that were associated in the association unit (240), the work performed by the worker (120).

Description

Work recognition device and work recognition method
The present invention relates to a work recognition device and a work recognition method for recognizing work contents by an operator.
2. Description of the Related Art Conventionally, there is known a technique for recognizing and analyzing a human motion by acquiring information related to a human body part or an object position using a camera image or the like.
For example, Patent Document 1 discloses a motion analysis device that captures, as a subject, the motion of a worker who performs work with both hands and that analyzes, in particular, the motion of both hands of the worker based on the motion of a reference worker serving as a reference subject.
In the motion analysis device disclosed in Patent Document 1, the coordinates of both hands are acquired for each frame of the video and tracked to obtain the trajectories of both hands; when the coordinates of the left or right hand cannot be obtained because of the camera position, the coordinates of that hand are estimated from the coordinates of the place where they disappeared.
JP 2011-34234 A
In a motion analysis device such as that disclosed in Patent Document 1, when the work includes movement of an object and the position coordinates of the hand or the object cannot be acquired, the coordinates of the hand or the object are merely estimated from the coordinates of the place where they disappeared; therefore, there is a problem that the trajectory representing the work content cannot be acquired with high accuracy.
The present invention has been made to solve the above-described problems, and an object of the present invention is to provide a work recognition device and a work recognition method for recognizing work contents by an operator with high accuracy.
The work recognition device according to the present invention includes: a sensor data acquisition unit that acquires sensor data; a body part information acquisition unit that detects a body part of a worker based on the sensor data acquired by the sensor data acquisition unit and acquires body part information regarding the body part of the worker; an object information acquisition unit that detects an object based on the sensor data acquired by the sensor data acquisition unit and acquires object information regarding the object; an association unit that associates, based on the body part information acquired by the body part information acquisition unit and the object information acquired by the object information acquisition unit, the object with the body part of the worker who performed work using the object; and a recognition result analysis unit that recognizes the work performed by the worker based on association information regarding the association result associated by the association unit.
According to the present invention, the work performed by the worker is recognized by associating an object with the body part of the worker who performed the work using that object; therefore, the work content of the worker can be recognized with high accuracy.
FIG. 1 is a diagram explaining an example of the overall configuration of a work recognition system including a work recognition device according to Embodiment 1 of the present invention.
FIG. 2 is a configuration diagram of the work recognition device according to Embodiment 1 of the present invention.
FIG. 3 is a diagram explaining an example of a method of calculating an association score by the score calculation unit in Embodiment 1.
FIG. 4 is a diagram explaining an example of interpolation of a trajectory of position coordinates by the score calculation unit and of the association score calculation after the interpolation in Embodiment 1.
FIG. 5 is a diagram explaining an example of a position coordinate correction operation by the position correction unit in Embodiment 1.
FIGS. 6A and 6B are diagrams showing an example of a hardware configuration of the work recognition device according to Embodiment 1 of the present invention.
FIG. 7 is a diagram showing an example of a hardware configuration of the control device in Embodiment 1.
FIG. 8 is a flowchart explaining the operation of the work recognition device according to Embodiment 1 of the present invention.
FIG. 9 is a flowchart explaining details of the operation of the body part information acquisition unit in step ST802 of FIG. 8.
FIG. 10 is a flowchart explaining details of the operation of the object information acquisition unit in step ST803 of FIG. 8.
FIG. 11 is a flowchart explaining details of the operation of the associating unit in step ST805 of FIG. 8.
FIGS. 12 to 14 are diagrams explaining output examples of the recognition result of the entire work or the work analysis result by the output control unit in Embodiment 1.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
Embodiment 1.
FIG. 1 is a diagram illustrating an example of the overall configuration of a work recognition system including a work recognition device 100 according to Embodiment 1 of the present invention.
In this work recognition system, the work recognition device 100 recognizes, based on, for example, video information received from the visible light camera 111 or the like, or sensing information received from the acceleration sensor 114 or the like, the work that the worker 120 performs using a body part 121 such as a hand and an object 131, and outputs the work recognition result and a work analysis result based on the work recognition result to an external device or the like.
The number of body parts 121 used when the worker 120 performs work is not limited to one. The worker 120 may perform work using a plurality of body parts 121 such as a right hand and a left hand, for example. Also, the number of objects 131 used when the worker 120 performs the work is not limited to one, and the worker 120 may perform work using a plurality of objects 131.
That is, in the first embodiment, the worker 120 performs work using one or more body parts 121 and one or more objects 131.
As shown in FIG. 1, the work recognition system includes a work recognition device 100, a visible light camera 111, an infrared camera 112, a depth sensor 113, an acceleration sensor 114, a gyro sensor 115, a display 151, a speaker 152, a storage device 153, and a control device 154.
The visible light camera 111, the infrared camera 112, and the depth sensor 113 are imaging devices that capture the periphery of the worker 120, and transmit visible light images, infrared images, and depth images to the work recognition device 100, respectively. .
In the first embodiment, the vicinity of the worker 120 is a preset range that includes at least the body parts 121 used when the worker 120 works and the main range within which the objects 131 move during the work.
In the first embodiment, as shown in FIG. 1, the work recognition system includes the visible light camera 111, the infrared camera 112, and the depth sensor 113 as imaging devices; however, the work recognition system is not limited to this and only needs to include at least one of these.
In the work recognition system of the first embodiment, a sensor in which two or more of the visible light camera 111, the infrared camera 112, and the depth sensor 113 are integrated may be used.
In addition to the visible light camera 111, the infrared camera 112, and the depth sensor 113, an imaging device that can capture an image around the worker 120 may be provided.
Moreover, there is no restriction on the number of visible light cameras 111, infrared cameras 112, and depth sensors 113 installed; for example, a plurality of visible light cameras 111, infrared cameras 112, or depth sensors 113 may be installed in order to capture the vicinity of the worker 120 from different viewpoints.
In the work recognition system according to the first embodiment, a marker may be attached to the body part 121 of the worker 120 or to the object 131. As the marker, a marker printed with a barcode or a characteristic figure, a reflective marker that reflects visible light or infrared light, a color marker using a characteristic color, an infrared marker that emits infrared light, or the like can be used. By detecting, tracking, or recognizing the marker, the work recognition apparatus 100 can accurately detect, track, or recognize the body part 121 of the worker 120 or the object 131. Details of the work recognition device 100 will be described later.
When using a marker, all the markers to be used may be the same marker, or may be different for each body part 121 or object 131 of the worker 120.
The acceleration sensor 114 is attached to one or both of the body part 121 and the object 131 of the worker 120, and transmits time series information of acceleration to the work recognition device 100.
The gyro sensor 115 is attached to one or both of the body part 121 and the object 131 of the worker 120, and transmits time-series information on angular acceleration to the work recognition device 100.
As shown in FIG. 1, the work recognition system according to the first embodiment includes an acceleration sensor 114 and a gyro sensor 115 as sensors capable of sensing the movement of the body part 121 or the object 131 of the worker 120. However, the present invention is not limited to this, and the work recognition system only needs to include at least one of these.
In addition to the acceleration sensor 114 or the gyro sensor 115, the work recognition system may include a sensor that can sense the movement of the body part 121 and the object 131 of the worker 120.
Hereinafter, the visible light camera 111, the infrared camera 112, the depth sensor 113, the acceleration sensor 114, and the gyro sensor 115 are collectively referred to as sensors, and the data obtained from these sensors are collectively referred to as sensor data. That is, image data such as a visible light image obtained by the visible light camera 111, an infrared image obtained by the infrared camera 112, and a depth image obtained by the depth sensor 113 are also included in the sensor data.
The work recognition device 100 recognizes the work by capturing the movement of the body part 121 and the object 131 of the worker 120 based on at least one piece of sensor data received from the sensors, and outputs the work recognition result, or a work analysis result using the work recognition result, to at least one of the display 151, the speaker 152, the storage device 153, and the control device 154.
The display 151 outputs the work recognition result output from the work recognition device 100 or the work analysis result using the work recognition result as a video or the like.
The speaker 152 outputs the work recognition result output from the work recognition apparatus 100 or the work analysis result using the work recognition result by video or voice.
The storage device 153 stores the work recognition result output from the work recognition device 100 or the work analysis result using the work recognition result.
For example, the work recognition device 100 causes the storage device 153 to store the work recognition result for a long time. The work recognition apparatus 100 may store a plurality of work recognition results in the storage device 153. The manager who manages the work of the worker 120 refers to the work recognition result stored in the storage device 153 and performs work analysis or creates a work analysis report. As described above, by storing the work recognition result or the like in the storage device 153, the manager or the like can perform the work analysis later instead of the real-time work recognition. This is only an example, and the work recognition result stored in the storage device 153 may be used by other methods.
In addition, the control device 154 performs various controls based on a work recognition result output from the work recognition device 100 or a control signal corresponding to a work analysis result using the work recognition result.
Specifically, for example, the control device 154 controls the robot to assist the work of the worker 120 based on the work recognition result and the like. This makes it possible to supply necessary parts or tools to the worker 120 according to the work situation, or to assist the work so that the work delay can be recovered when the work of the worker 120 is slow. This is only an example, and the control device 154 may perform other controls.
Hereinafter, the display 151, the speaker 152, the storage device 153, and the control device 154 are collectively referred to as an output device.
The work recognition system according to the first embodiment includes the display 151, the speaker 152, the storage device 153, and the control device 154 as shown in FIG. 1; however, the system is not limited to this and only needs to include at least one of these.
FIG. 2 is a configuration diagram of the work recognition apparatus 100 according to Embodiment 1 of the present invention.
As shown in FIG. 2, the work recognition apparatus 100 includes a sensor data acquisition unit 210, a body part information acquisition unit 220, an object information acquisition unit 230, an association unit 240, a recognition result analysis unit 250, and an output control unit 260.
The sensor data acquisition unit 210 acquires sensor data from at least one of the visible light camera 111, the infrared camera 112, the depth sensor 113, the acceleration sensor 114, and the gyro sensor 115.
The sensor data acquisition unit 210 outputs the acquired sensor data to the body part information acquisition unit 220 and the object information acquisition unit 230.
The body part information acquisition unit 220 detects the body part 121 of the worker 120 based on the sensor data output by the sensor data acquisition unit 210 and acquires body part information on the body part 121 of the worker 120. The body part information acquisition unit 220 stores the acquired body part information on the body part 121 of the worker 120 in the body part information storage unit 270.
As shown in FIG. 2, the body part information acquisition unit 220 includes a body part detection unit 221, a body part tracking unit 222, and a body part recognition unit 223.
The body part detection unit 221 detects the body part 121 of the worker 120 based on the sensor data output from the sensor data acquisition unit 210 and acquires the position coordinates of the part 121. The position coordinates of the part 121 may be, for example, the coordinates of an arbitrary point of the detected body part 121, or the coordinates of the upper-right and lower-left corners of a rectangle enclosing the body part 121; which point is used as the position coordinates of the body part 121 of the worker 120 can be set as appropriate.
The body part detection unit 221 may express the position coordinates of the body part 121 of the worker 120 as two-dimensional coordinates on the acquired image, or as three-dimensional coordinates estimated using depth information.
The origin and the coordinate system of the position coordinates detected by the body part detection unit 221 are arbitrary. For example, the body part detection unit 221 may use a two-dimensional coordinate system whose origin is the upper left of the acquired image, with the X axis pointing right and the Y axis pointing down, or a three-dimensional coordinate system whose origin is a specific point in the work area, with the Z axis pointing vertically upward.
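Although the embodiment leaves the choice of coordinate system open, the following Python sketch illustrates one way in which a detected pixel position could be converted into the three-dimensional coordinates estimated using depth information mentioned above, here under an assumed pinhole camera model. The function name and the intrinsic parameters (fx, fy, cx, cy) are illustrative assumptions and are not taken from the embodiment.

```python
# Minimal sketch (not from the specification): back-projecting a detected
# pixel position and a depth reading into camera-centered 3-D coordinates
# under an assumed pinhole camera model.

def pixel_to_camera_3d(u, v, depth_m, fx, fy, cx, cy):
    """Convert a pixel (u, v) plus a depth value [m] into (X, Y, Z) camera coordinates."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example: a body part detected at pixel (640, 360) with 0.85 m depth,
# using illustrative intrinsics for a 1280x720 depth sensor.
print(pixel_to_camera_3d(640, 360, 0.85, fx=920.0, fy=920.0, cx=640.0, cy=360.0))
```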
The body part detection unit 221 may acquire the position coordinates of the body part 121 of the worker 120 by any method. For example, it may use an existing feature-point-based detection method such as SURF (Speeded Up Robust Features) or HOG (Histograms of Oriented Gradients), or a model-based detection method such as a neural network. The body part detection unit 221 may employ a detection method that uses only a single frame acquired from the sensor data acquisition unit 210, or a method that uses video consisting of multiple frames. Furthermore, the body part detection unit 221 may acquire the position coordinates of the body part 121 of the worker 120 using a detection method based on any one of an optical image, an infrared image, a depth image, and the like, or on a combination of them.
When the body part detection unit 221 detects a plurality of body parts 121 of the worker 120, it acquires position coordinates for each detected part 121.
The body part detection unit 221 stores the acquired position coordinates of the body part 121 of the worker 120 in the body part information storage unit 270 in association with acquisition date and time information. The acquisition date and time may be the acquisition date and time attached to the video frame acquired from the sensor data acquisition unit 210. When the body part detection unit 221 detects a plurality of body parts 121 of the worker 120, it stores the position coordinates and the acquisition date and time in the body part information storage unit 270 in association with each part 121.
The body part tracking unit 222 acquires the trajectory of the body part 121 of the worker 120 detected by the body part detection unit 221. Specifically, the body part tracking unit 222 tracks the body part 121 of the worker 120 whose position coordinates have been acquired by the body part detection unit 221, based on the sensor data acquired from the sensors, and acquires the position coordinates of the part 121 after it moves.
The body part tracking unit 222 tracks the position coordinates of the body part 121 of the worker 120 based on the information on the part 121 that the body part detection unit 221 stored in the body part information storage unit 270. When the body part detection unit 221 has detected a plurality of parts 121, the body part tracking unit 222 acquires the post-movement position coordinates for each of the plurality of parts 121.
The body part tracking unit 222 may track the part 121 by any tracking method. For example, it may use an existing region-based tracking method such as template matching with template updates, active search, the mean-shift method, or a particle filter, or a feature-point-based tracking method such as the KLT tracker (Kanade-Lucas-Tomasi Feature Tracker) or SURF tracking.
The body part tracking unit 222 may track the part 121 using a tracking method based on any one of an optical image, an infrared image, a depth image, and the like, or on a combination of them.
The body part tracking unit 222 stores the acquired post-movement position coordinates of the body part 121 of the worker 120, that is, the trajectory of the body part 121 of the worker 120, together with the acquisition date and time of the position coordinates, in the body part information storage unit 270 in association with the information on the part 121 already stored there.
The body part recognition unit 223 recognizes the type, shape, or state of the body part 121 of the worker 120 whose position coordinates the body part detection unit 221 acquired and stored in the body part information storage unit 270. In Embodiment 1, the type of the body part 121 of the worker 120 refers to, for example, the right hand, the left hand, an elbow, or the face. This is only an example; any information that identifies the body part 121 of the worker 120 may be used. The shape or state of the body part 121 of the worker 120 refers to, for example, the shape of the right hand holding the object 131 when the worker 120 grips the object 131 with the palm of the right hand, or the state of the worker 120 gripping the object 131. This is also only an example; the same applies when, for example, the worker 120 pinches the object 131 with the fingertips, clenches a fist, or opens the right hand.
The body part recognition unit 223 may recognize the type, shape, or state of the part 121 by any recognition method. For example, it may use an existing feature-point-based detection method such as SURF or HOG, or a model-based detection method such as a neural network.
The body part recognition unit 223 may recognize the type, shape, or state of the part 121 using a recognition method based on any one of an optical image, an infrared image, and a depth image, or on a combination of them.
The body part recognition unit 223 stores the recognition result for the part 121 in the body part information storage unit 270 in association with the information on the part 121.
The information on the part 121 stored in the body part information storage unit 270 by the body part detection unit 221, the body part tracking unit 222, and the body part recognition unit 223 constitutes the body part information.
The body part information produced by the body part detection unit 221, the body part tracking unit 222, and the body part recognition unit 223 is stored for each series of operations.
The object information acquisition unit 230 detects the object 131 based on the sensor data output from the sensor data acquisition unit 210 and acquires object information on the object 131. The object information acquisition unit 230 stores the acquired object information on the object 131 in the object information storage unit 280.
As shown in FIG. 2, the object information acquisition unit 230 includes an object detection unit 231, an object tracking unit 232, and an object recognition unit 233.
The object detection unit 231 detects the object 131 based on the sensor data output from the sensor data acquisition unit 210 and acquires the position coordinates of the object 131. The position coordinates of the object 131 may be, for example, the coordinates of an arbitrary point of the detected object 131, or the coordinates of the upper-right and lower-left corners of a rectangle enclosing the object 131; which point is used as the position coordinates of the object 131 can be set as appropriate.
The object detection unit 231 may express the position coordinates of the object 131 as two-dimensional coordinates on the acquired image, or as three-dimensional coordinates estimated using depth information.
The origin and the coordinate system of the position coordinates of the object 131 detected by the object detection unit 231 are arbitrary. For example, the object detection unit 231 may use a two-dimensional coordinate system whose origin is the upper left of the acquired image, with the X axis pointing right and the Y axis pointing down, or a three-dimensional coordinate system whose origin is a specific point in the work area, with the Z axis pointing vertically upward.
The object detection unit 231 may acquire the position coordinates of the object 131 by any method. For example, it may use an existing feature-point-based detection method such as SURF (Speeded Up Robust Features) or HOG (Histograms of Oriented Gradients), or a model-based detection method such as a neural network. The object detection unit 231 may employ a detection method that uses only a single frame acquired from the sensor data acquisition unit 210, or a method that uses video consisting of multiple frames. Furthermore, the object detection unit 231 may acquire the position coordinates of the object 131 using a detection method based on any one of an optical image, an infrared image, a depth image, and the like, or on a combination of them.
When a plurality of objects 131 are detected, the object detection unit 231 acquires position coordinates for each detected object 131.
The object detection unit 231 stores the acquired position coordinates of the object 131 in the object information storage unit 280 in association with acquisition date and time information. The acquisition date and time may be the acquisition date and time attached to the video frame acquired from the sensor data acquisition unit 210. When a plurality of objects 131 are detected, the object detection unit 231 stores the position coordinates and the acquisition date and time in the object information storage unit 280 in association with each object 131.
The object tracking unit 232 acquires the trajectory of the object 131 detected by the object detection unit 231. Specifically, the object tracking unit 232 tracks the object 131 whose position coordinates have been acquired by the object detection unit 231, based on the sensor data acquired from the sensors, and acquires the position coordinates of the object 131 after it moves.
The object tracking unit 232 tracks the position coordinates of the object 131 based on the information on the object 131 that the object detection unit 231 stored in the object information storage unit 280. When there are a plurality of objects 131 detected by the object detection unit 231, the object tracking unit 232 acquires the post-movement position coordinates for each of the plurality of objects 131.
The object tracking unit 232 may track the object 131 by any tracking method. For example, it may use an existing region-based tracking method such as template matching with template updates, active search, the mean-shift method, or a particle filter, or a feature-point-based tracking method such as the KLT tracker (Kanade-Lucas-Tomasi Feature Tracker) or SURF tracking.
The object tracking unit 232 may track the object 131 using a tracking method based on any one of an optical image, an infrared image, a depth image, and the like, or on a combination of them.
The object tracking unit 232 stores the acquired post-movement position coordinates of the object 131, that is, the trajectory of the object 131, together with the acquisition date and time of the position coordinates, in the object information storage unit 280 in association with the information on the object 131 already stored there.
The object recognition unit 233 recognizes the type, shape, or state of the object 131 whose position coordinates the object detection unit 231 acquired and stored in the object information storage unit 280. In Embodiment 1, the type of the object 131 is, for example, information that identifies the object 131 used when the worker 120 performs assembly work, such as each part to be assembled or an assembly tool. The shape or state of the object 131 refers to, for example, the orientation of a part when the worker 120 performs assembly work using that part, or the state in which the part has been incorporated into a board. This is only an example; the same applies when, for example, the worker 120 performs work using a tool.
The object recognition unit 233 may recognize the type, shape, or state of the object 131 by any recognition method. For example, it may use an existing feature-point-based detection method such as SURF or HOG, or a model-based detection method such as a neural network.
The object recognition unit 233 may recognize the type, shape, or state of the object 131 using a recognition method based on any one of an optical image, an infrared image, and a depth image, or on a combination of them.
The object recognition unit 233 stores the recognition result for the object 131 in the object information storage unit 280 in association with the information on the object 131.
The information on the object 131 stored in the object information storage unit 280 by the object detection unit 231, the object tracking unit 232, and the object recognition unit 233 constitutes the object information.
The object information produced by the object detection unit 231, the object tracking unit 232, and the object recognition unit 233 is stored for each series of operations.
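To make the stored body part information and object information concrete, the following sketch shows one possible in-memory layout for the records described above for the body part information storage unit 270 and the object information storage unit 280: a time-stamped trajectory of position coordinates together with the recognized type and shape or state. The dataclass and its field names are illustrative assumptions, not a format defined by the embodiment.

```python
# Illustrative sketch (assumed field names, not the specification's format)
# of the time-stamped records described for storage units 270 and 280.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class TrackedEntity:
    entity_id: str                                   # e.g. "part X" or "object A"
    kind: str                                        # "body_part" or "object"
    recognized_type: Optional[str] = None            # e.g. "right hand", "assembly tool"
    shape_or_state: Optional[str] = None             # e.g. "gripping", "mounted on board"
    # Trajectory: (acquisition time [s], position coordinates).
    trajectory: List[Tuple[float, Tuple[float, float]]] = field(default_factory=list)

    def add_sample(self, t: float, xy: Tuple[float, float]) -> None:
        """Append one detection or tracking sample to the trajectory."""
        self.trajectory.append((t, xy))

# Example: part X detected at t = 0 s and tracked to t = 1 s.
part_x = TrackedEntity("part X", "body_part", recognized_type="right hand")
part_x.add_sample(0.0, (100.0, 200.0))
part_x.add_sample(1.0, (110.0, 205.0))
```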
The association unit 240 associates the object 131 with the body part 121 of the worker 120 that performed the work using the object 131, based on the body part information on the body part 121 of the worker 120 stored in the body part information storage unit 270 by the body part information acquisition unit 220 and the object information on the object 131 stored in the object information storage unit 280 by the object information acquisition unit 230. The association unit 240 outputs the association result to the recognition result analysis unit 250.
The association unit 240 includes a score calculation unit 241, a matching unit 242, and a position correction unit 243.
The score calculation unit 241 calculates an association score representing the degree of association between the body part 121 of the worker 120 and the object 131, using the body part information stored in the body part information storage unit 270 by the body part information acquisition unit 220 and the object information stored in the object information storage unit 280 by the object information acquisition unit 230. The degree of association between the body part 121 of the worker 120 and the object 131 indicates which body part 121 of the worker 120 is most likely to be involved in the movement of the object 131; the higher the association score, the more likely the body part 121 of the worker 120 and the object 131 are judged to be related.
The score calculation unit 241 may calculate the association score by any method. For example, the score calculation unit 241 may increase the association score when the position coordinates of the body part 121 of the worker 120 and the object 131 are close, or when the moving direction of the body part 121 of the worker 120 and the moving direction of the object 131 are similar. The score calculation unit 241 may also increase the association score when the body part 121 of the worker 120 has a shape indicating that it is gripping the object 131.
FIG. 3 is a diagram illustrating an example of how the score calculation unit 241 calculates association scores in Embodiment 1.
In FIG. 3, the body part information acquisition unit 220 has detected part X and part Y as body parts 121 and acquired their body part information, and the object information acquisition unit 230 has detected objects A to D as objects 131 and acquired their object information. FIG. 3 also assumes that trajectories of the position coordinates of parts X and Y and objects A to D have been obtained over a period of 10 seconds; the positions during those 10 seconds are indicated in the figure by the numerals 0 to 10.
For example, consider object A during the two seconds from 0 to 2 seconds. Part X and part Y both exist around object A, but the position coordinates of object A and part X are closer than those of object A and part Y; that is, the distance between object A and part X is smaller than the distance between object A and part Y.
Therefore, the score calculation unit 241 sets the association score between object A and part X, which is closer to object A, higher than the association score between object A and part Y.
Similarly, for the two seconds from 8 to 10 seconds, the score calculation unit 241 sets the association score between object A and part X, which is closer to object A, higher than the association score between object A and part Y.
Also, for example, if part Y has a shape gripping object B at 3 seconds, the score calculation unit 241 sets the association score between object B and part Y high.
In this way, the score calculation unit 241 calculates, by any method, an association score representing which body part 121 is likely to be involved in the movement of the object 131 for each combination of the body part 121 of the worker 120 and the object 131.
As described above, the method of calculating association scores described with reference to FIG. 3 is merely an example; the score calculation unit 241 may calculate association scores by other methods, as long as the strength of the association for each combination of the object 131 being worked on and the body part 121 of the worker 120 is set according to predetermined criteria.
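As a hedged illustration of the kind of criteria described above, the following sketch raises the association score when the position coordinates of a body part and an object are close, when their movement directions are similar, and when the part has a gripping shape. The weights and the exact formula are assumptions introduced for illustration; the embodiment leaves the calculation method open.

```python
# Illustrative association score (assumed weights and formula; the embodiment
# only suggests that closer positions, similar motion directions, and a
# gripping shape raise the score).
import math

def association_score(part_pos, part_prev, obj_pos, obj_prev, part_is_gripping):
    # 1) Proximity term: larger when the part and the object are close.
    dist = math.dist(part_pos, obj_pos)
    proximity = 1.0 / (1.0 + dist)

    # 2) Direction term: cosine similarity of the two displacement vectors.
    pv = (part_pos[0] - part_prev[0], part_pos[1] - part_prev[1])
    ov = (obj_pos[0] - obj_prev[0], obj_pos[1] - obj_prev[1])
    pn, on = math.hypot(*pv), math.hypot(*ov)
    direction = 0.0
    if pn > 0 and on > 0:
        direction = max(0.0, (pv[0] * ov[0] + pv[1] * ov[1]) / (pn * on))

    # 3) Bonus when the part's recognized shape indicates it grips the object.
    grip_bonus = 0.5 if part_is_gripping else 0.0

    return proximity + direction + grip_bonus

# Example: part X moving together with object A while gripping it.
print(association_score((12, 10), (10, 10), (13, 10), (11, 10), True))
```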
When the object information acquisition unit 230 fails to detect or track the object 131 and the trajectory of the position coordinates of the object 131 is interrupted, the score calculation unit 241 may connect the point where the position coordinates disappeared and the point where they were detected again, interpolate the trajectory of the position coordinates, and then calculate the association score.
FIG. 4 is a diagram illustrating an example of how the score calculation unit 241 interpolates a trajectory of position coordinates and calculates association scores after interpolation in Embodiment 1.
In FIG. 3, the position coordinates of object A are detected up to 2 seconds, after which the trajectory of the position coordinates is interrupted until 8 seconds. Using FIG. 4, the following describes an example of how the score calculation unit 241 interpolates the trajectory of the position coordinates of object A, whose trajectory shown in FIG. 3 was interrupted, and then calculates the association scores between the interpolated object A and parts X and Y.
First, as indicated by the dotted line in FIG. 4, the score calculation unit 241 connects the vanishing point and the re-detection point of object A and interpolates the trajectory of the position coordinates of object A.
Next, the score calculation unit 241 calculates the association scores between object A and parts X and Y along the interpolated trajectory.
For example, at 3 seconds, the interpolated position coordinates of object A are closer to part Y than to part X. However, at the vanishing point and the re-detection point, that is, at 2 seconds and 8 seconds, it is part X for which the higher association score with object A has been calculated. Therefore, for example, the score calculation unit 241 judges that part X was more likely used for the work involving the movement of object A even while the trajectory of the position coordinates was interrupted, and sets the association score between object A and part X higher than the association score between object A and part Y.
The score calculation unit 241 can thus calculate the association score taking into account not only the positional relationship between the interpolated position coordinates and the surrounding parts 121 while the position coordinates of the object 131 were interrupted, but also the associations with the body parts 121 of the worker 120 at the vanishing point and the re-detection point, that is, before and after the interruption.
As described above, the method of trajectory interpolation and post-interpolation association score calculation by the score calculation unit 241 described with reference to FIG. 4 is merely an example; the score calculation unit 241 may interpolate the trajectory of the position coordinates and calculate the association scores after interpolation by other methods.
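A minimal sketch of the interpolation step, assuming simple linear interpolation between the vanishing point and the re-detection point (the embodiment permits other methods), is shown below.

```python
# Minimal sketch, assuming linear interpolation between the vanishing point
# and the re-detection point of an interrupted object trajectory.
def interpolate_gap(t_lost, pos_lost, t_found, pos_found, t_query):
    """Estimate the object position at t_query, with t_lost <= t_query <= t_found."""
    if not (t_lost <= t_query <= t_found):
        raise ValueError("t_query must lie inside the interrupted interval")
    ratio = (t_query - t_lost) / (t_found - t_lost)
    return tuple(a + ratio * (b - a) for a, b in zip(pos_lost, pos_found))

# Example: object A disappears at t = 2 s at (20, 10) and is re-detected
# at t = 8 s at (80, 10); estimate its position at t = 3 s.
print(interpolate_gap(2.0, (20.0, 10.0), 8.0, (80.0, 10.0), 3.0))  # -> (30.0, 10.0)
```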
For example, the score calculation unit 241 may increase the association score when the body part 121 of the worker 120 moves in the same direction as the object.
Specifically, suppose, for example, that the worker 120 moves a part V with the right hand from point a to point d via points b and c (point a → point b → point c → point d). At point b, the object detection unit 231 fails to detect part V, so the object tracking unit 232 loses track of part V on the way; at point c, the object detection unit 231 detects part V again and the object tracking unit 232 resumes tracking it. In this case, point b is the vanishing point and point c is the re-detection point. The object tracking unit 232 can acquire the trajectory of the position coordinates from point a to point b and from point c to point d, but cannot acquire the trajectory from point b to point c. In addition, assume that the object recognition unit 233 cannot recognize whether the object 131 moving from point a to point b and the object 131 moving from point c to point d are the same object 131 or different objects 131.
In such a case, the score calculation unit 241 tentatively interpolates the trajectory of the object 131 from point b to point c, for example by linear interpolation, and if the moving direction of the object 131 along point a → point b → point c → point d matches the moving direction of the right hand, it judges that part V was moved from point a to point d with the right hand and increases the association score between part V and the right hand. With a higher association score, this association is more likely to be selected when the matching unit 242 later determines the combination of the body part 121 of the worker 120 and the object 131, and as a result the interruption of the trajectory from point b to point c can be interpolated successfully. The matching unit 242 is described later.
If the score calculation unit 241 judges that the object 131 at the vanishing point and the object 131 at the re-detection point are the same, it may increase the association score with the part 121 associated with the object 131 at the vanishing point and the re-detection point.
For example, in the example above, suppose that the object recognition unit 233 recognizes that the object 131 moving from point a to point b and the object 131 moving from point c to point d are both part V. In this case, the score calculation unit 241 interpolates the trajectory of the position coordinates from point b to point c and increases the association score between part V and the right hand.
In addition, if the difference between the position coordinates of the vanishing point and the re-detection point is equal to or less than a fixed value, the score calculation unit 241 may increase the score associated with the judgment that the object 131 was not moved by any body part 121 of the worker 120.
For example, suppose that the left hand momentarily passes over a stationary part W, so that the object detection unit 231 momentarily fails to detect part W. If the position of part W has not changed before and after the object detection unit 231 lost sight of it, the score calculation unit 241 judges that part W was not moved with the left hand and increases the score associated with part W not having been moved with the left hand. The score calculation unit 241 may instead lower the association score between part W and the left hand.
It is sufficient that the score calculation unit 241 interpolates the trajectory of the position coordinates of the object 131 and sets the association scores after interpolation according to predetermined criteria.
The score calculation unit 241 outputs information on the calculated association scores to the matching unit 242.
The matching unit 242 uses the association scores calculated by the score calculation unit 241 to determine the combination of the body part 121 of the worker 120 and the object 131 that maximizes the association scores within the range in which the result is consistent as a piece of work. That is, based on the association result in which the body part 121 of the worker 120 and the object 131 are associated with each other, each object 131 is combined with the body part 121 of the worker 120 judged to be most closely related to that object 131.
The conditions for consistency as work are arbitrary. For example, the matching unit 242 may require that the same body part 121 does not exist at position coordinates separated by more than a fixed value at the same time, or that the same object 131 is not associated with a plurality of contradictory actions at the same time. The matching unit 242 determines the combination of the body part 121 of the worker 120 and the object 131 that satisfies such conditions and maximizes the sum of the association scores.
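The following sketch illustrates one possible realization of this matching step: candidate assignments of objects to body parts (including the possibility that an object was not moved by any part) are enumerated, assignments violating a simple consistency rule are discarded, and the assignment with the highest total association score is kept. The consistency rule and the brute-force enumeration are illustrative assumptions; the embodiment only requires that consistency be preserved and the score sum maximized.

```python
# Illustrative sketch (assumed consistency rule and brute-force search) of
# choosing the object/body-part combination that maximizes the total
# association score while staying consistent as work.
from itertools import product

def best_assignment(objects, parts, score, is_consistent):
    """score[(obj, part)] -> association score; part may also be None ("not moved")."""
    candidates = parts + [None]
    best, best_total = None, float("-inf")
    for combo in product(candidates, repeat=len(objects)):
        assignment = dict(zip(objects, combo))
        if not is_consistent(assignment):
            continue
        total = sum(score.get((o, p), 0.0) for o, p in assignment.items())
        if total > best_total:
            best, best_total = assignment, total
    return best, best_total

# Example with objects A, B and parts X, Y; each part may move at most one object.
scores = {("A", "X"): 2.0, ("A", "Y"): 1.2, ("B", "X"): 0.5, ("B", "Y"): 1.8}
consistent = lambda a: all(
    list(a.values()).count(p) <= 1 for p in a.values() if p is not None
)
print(best_assignment(["A", "B"], ["X", "Y"], scores, consistent))
# -> ({'A': 'X', 'B': 'Y'}, 3.8)
```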
The following describes an example of how the matching unit 242 determines the combinations of parts X and Y with the objects A to D detected during the 10 seconds from 0 to 10 seconds shown in FIGS. 3 and 4.
First, for object A, as described above with reference to FIGS. 3 and 4, the score calculation unit 241 has set a high association score with part X throughout the 10 seconds from 0 to 10 seconds. Therefore, the matching unit 242 judges that work involving movement of object A was performed with part X, and combines object A with part X.
For object B, the position coordinates do not change from 0 to 3 seconds or from 8 to 10 seconds, and from 3 to 8 seconds the score calculation unit 241 has set a high association score with part Y. Therefore, the matching unit 242 combines object B with part Y, on the assumption that part Y started moving object B at 3 seconds and moved it to the position coordinates detected at 8 seconds.
Now suppose that, in the trajectory interpolation and association score calculation by the score calculation unit 241 described with reference to FIG. 4, the score calculation unit 241 had given priority to the proximity of position coordinates for object A at 3 seconds and calculated the association score with part Y higher than the association score with part X. Based on the data of object A alone, the interpolated position of object A at 3 seconds is closer to part Y than to part X, as shown in FIG. 4, so such a score calculation can occur if only positional proximity is considered. In this case, at 3 seconds, part Y would have a high association score with both object A and object B.
The matching unit 242 therefore determines which of objects A and B part Y should be combined with, within the range in which the result is consistent as work.
Specifically, for example, if part Y has a shape gripping object B at 3 seconds, object B is likely being worked on using part Y, so the matching unit 242 judges that combining part Y with object B is more consistent as work and combines object B with part Y.
Alternatively, for example, the matching unit 242 may judge, from the combinations of object B with parts X and Y before and after that time, which of parts X and Y gives a more consistent combination with object B. Here, from 0 to 2 seconds and from 8 to 10 seconds, part X has a high association score with object A; in other words, it is more consistent as work to assume that object A was worked on as a continuous series of actions using part X. Therefore, the matching unit 242 combines object B with part Y at 3 seconds.
As for objects C and D, their position coordinates do not move during the 10 seconds. Parts X and Y are detected in their vicinity between 8 and 10 seconds, but for each of parts X and Y there exists an object, A or B, whose association score is set higher than that with object C and which can be judged to have undergone work involving movement. Combining the motionless objects C and D with parts X and Y between 8 and 10 seconds would therefore create a contradiction as work. The matching unit 242 accordingly judges that no work using parts X and Y was performed on objects C and D, and does not combine them with parts X and Y.
The method described above is merely an example; it is sufficient that the matching unit 242 judges, by an appropriate preset method, the range in which the result is consistent as work and determines the combination of the body part 121 of the worker 120 and the object 131.
The matching unit 242 outputs the determined combination of the body part 121 of the worker 120 and the object 131 to the position correction unit 243.
The position correction unit 243 corrects the position coordinates of the body part 121 of the worker 120, the object 131, or both, using the combination of the body part 121 of the worker 120 and the object 131 determined by the matching unit 242.
The position correction unit 243 may correct the position coordinates of the body part 121 of the worker 120, the object 131, or both, using any correction method. For example, when the position coordinates of the object 131 are interrupted, the position correction unit 243 may interpolate the position coordinates of the object 131 by matching them against the trajectory of the part 121 combined with the object 131; conversely, when the trajectory of the position coordinates of the part 121 is interrupted, the position correction unit 243 may interpolate the position coordinates of the part 121 by matching them against the trajectory of the object 131 combined with the part 121.
FIG. 5 is a diagram illustrating an example of the position coordinate correction operation by the position correction unit 243 in Embodiment 1.
As an example, FIG. 5 illustrates the operation in which the position correction unit 243 corrects the position coordinates of object A from 2 to 8 seconds, whose trajectory was interpolated by the score calculation unit 241 and which was combined with part X by the matching unit 242, as shown in FIG. 4.
For the trajectory of object A whose position coordinates were interpolated as indicated by the dotted line in FIG. 4, the position correction unit 243 corrects the position coordinates by matching them against the trajectory of the position coordinates of part X combined with object A.
As a result, as shown in FIG. 5, the position coordinates of object A follow the same trajectory as part X from 2 to 8 seconds.
As described above, the operation of the position correction unit 243 described with reference to FIG. 5 is merely an example; the position correction unit 243 may correct the position coordinates by other methods.
For example, when interpolating, the position correction unit 243 may copy the trajectory as it is, and then correct the position coordinates of the associated part 121 or object 131 so that the start point and end point of the trajectory do not become discontinuous, aligning them with the positions of the start point and end point.
Specifically, for example, suppose a part Z is moved with the right hand from point e to point h via points f and g (point e → point f → point g → point h), and the trajectory of part Z is interrupted between point f and point g. When interpolating the position coordinates during the interruption, the position correction unit 243 interpolates, for example, the trajectory of the position coordinates of part Z from point f to point g with the trajectory of the position coordinates of the right hand at the same time. In this case, the position coordinates of part Z and the position coordinates of the right hand may not coincide, so the trajectory of the position coordinates of part Z could become unnaturally discontinuous at points f and g. The position correction unit 243 may therefore correct the position coordinates of part Z so that the trajectory becomes natural, that is, continuous, aligning it with the positions of the start point and end point so that they do not become discontinuous.
In this way, even when, for example, the trajectory of the position coordinates of the object 131 is interrupted, the trajectory of the body part of the worker 120 or of the object 131 can be interpolated taking into account the association between the body part 121 of the worker 120 and the object 131. Compared with conventional techniques that interpolate trajectories from the body part or the object alone without considering the correspondence between the object and the body part of the worker, for example by estimating the coordinates of a left or right hand that cannot be obtained from the coordinates of the place where they disappeared, the trajectory can be interpolated with higher accuracy.
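A hedged sketch of the correction described above is given below: the interrupted segment of an object trajectory is filled in by copying the trajectory of the body part it was combined with, and the copied segment is blended linearly so that it remains continuous with the object's own positions at the start and end of the gap. The blending scheme is an illustrative assumption.

```python
# Illustrative sketch (assumed blending scheme) of filling an interrupted
# object trajectory from the trajectory of the body part it is combined
# with, while keeping the start and end of the gap continuous.
def correct_gap(part_gap_positions, obj_start, obj_end):
    """part_gap_positions: body-part positions sampled across the gap,
    including the samples at the moment of loss and re-detection."""
    n = len(part_gap_positions) - 1
    first, last = part_gap_positions[0], part_gap_positions[-1]
    corrected = []
    for i, p in enumerate(part_gap_positions):
        r = i / n
        # Offset so the copied trajectory starts at obj_start and ends at obj_end.
        off = tuple((1 - r) * (s - f) + r * (e - l)
                    for s, f, e, l in zip(obj_start, first, obj_end, last))
        corrected.append(tuple(c + o for c, o in zip(p, off)))
    return corrected

# Example: right-hand samples across the gap; the object was last seen at (0, 0)
# and re-detected at (6, 1).
hand = [(1, 0), (3, 0), (5, 0), (7, 0)]
print(correct_gap(hand, (0, 0), (6, 1)))
```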
The position correction unit 243 outputs the corrected position coordinates of the body part 121 of the worker 120 and the object 131, together with information on the combination result, to the recognition result analysis unit 250 as association information. When no correction has been performed, the position correction unit 243 outputs the uncorrected position coordinates to the recognition result analysis unit 250.
The recognition result analysis unit 250 recognizes the recognition result of the whole work through the series of operations, based on the association information output by the association unit 240. The recognition result analysis unit 250 also analyzes the work based on the recognition result of the whole work. Here, the recognition result of the whole work refers to the association information in which the object 131 and the part 121 that performed the work using the object 131 are associated with each other, and is information indicating the flow of the series of operations.
The recognition result analysis unit 250 then outputs the recognition result of the whole work, or information on the analysis result obtained by analyzing that recognition result, to the output control unit 260.
The recognition result analysis unit 250 includes a success/failure determination unit 251, a type identification unit 252, a content comparison unit 253, and a condition determination unit 254.
The success/failure determination unit 251 determines whether the prescribed work has been performed, using the association information between the body part 121 of the worker 120 and the object 131 determined by the association unit 240. Specifically, for example, a work list in which the work the worker 120 should perform with each object 131 is registered in advance is stored, and based on the stored work list and the association information, the success/failure determination unit 251 determines whether the work the worker 120 should perform has been performed. This is only an example, and the success/failure determination unit 251 may determine whether the prescribed work has been performed by other methods.
The success/failure determination unit 251 outputs the determination result on whether the prescribed work has been performed to the output control unit 260.
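The following sketch illustrates one possible form of this determination: the association information (which body part handled which object, and what action was performed) is compared against a pre-registered list of the work each object should undergo. The data layout and the function are illustrative assumptions.

```python
# Illustrative sketch (assumed data layout) of judging whether the prescribed
# work was performed, by comparing the association information against a
# pre-registered work list.
def prescribed_work_done(required_steps, association_info):
    """required_steps: ordered list of (object, action) the worker should perform.
    association_info: ordered list of (object, body part, action) actually recognized."""
    performed = [(obj, action) for obj, _part, action in association_info]
    idx = 0
    for step in performed:
        if idx < len(required_steps) and step == required_steps[idx]:
            idx += 1            # required step observed in the expected order
    return idx == len(required_steps)

required = [("part V", "pick up"), ("part V", "mount on board")]
observed = [("part V", "right hand", "pick up"),
            ("screwdriver", "left hand", "pick up"),
            ("part V", "right hand", "mount on board")]
print(prescribed_work_done(required, observed))  # -> True
```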
The type identification unit 252 identifies which of the prescribed work types the work corresponds to, using the association information between the body part 121 of the worker 120 and the object 131 determined by the association unit 240. Specifically, for example, work type information in which, for each type of work, the procedure of which object 131 is used and what action is performed is registered and classified in advance is stored, and based on the stored work type information and the association information, the type identification unit 252 identifies which of the prescribed work types the work performed by the worker 120 corresponds to. This is only an example, and the type identification unit 252 may identify the work type by other methods.
The type identification unit 252 outputs the identification result indicating which of the prescribed work types the work corresponds to, to the output control unit 260.
The content comparison unit 253 analyzes the content of the work performed by the worker 120 by comparing it with other work, using the association information between the body part 121 of the worker 120 and the object 131 determined by the association unit 240. Specifically, for example, work performed by another worker who serves as a model is stored in advance as a work history, and based on the stored work history and the association information, the content comparison unit 253 compares the work content and judges, for example, that some work is missing, that there is wasted work, or that the work is slower or faster than the work of the other worker. This is only an example, and the content comparison unit 253 may compare the work content by other methods.
The content comparison unit 253 outputs the comparison result of the work content to the output control unit 260.
The condition determination unit 254 determines whether the work content satisfies prescribed conditions, using the association information between the body part 121 of the worker 120 and the object 131 determined by the association unit 240. Specifically, based on the work type information described above stored in advance and the association information, the condition determination unit 254 determines conditions such as whether the time the worker 120 required for the work is too long. The conditions can be set as appropriate, and the condition determination unit 254 may determine whether other conditions are satisfied.
As shown in FIG. 2, the recognition result analysis unit 250 of Embodiment 1 includes the success/failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254; however, the recognition result analysis unit 250 is not limited to this configuration and only needs to include at least one of the success/failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254.
The recognition result analysis unit 250 also outputs to the output control unit 260, together with the work analysis result obtained by the success/failure determination unit 251, the type identification unit 252, the content comparison unit 253, or the condition determination unit 254 using the recognition result of the entire work, the association information output by the associating unit 240, that is, the information on the recognition result of the entire work.
The output control unit 260 outputs the recognition result of the entire work output by the recognition result analysis unit 250 to at least one of the output devices.
In addition, when receiving an analysis result output instruction, the output control unit 260 outputs a work analysis result using the recognition result of the entire work output by the recognition result analysis unit 250 to at least one of the output devices.
The user can specify an analysis result to be output by using an input device (not shown); an instruction receiving unit (not shown) receives the designation from the user as an analysis result output instruction and issues an output instruction to the recognition result analysis unit 250.
Then, the output control unit 260 outputs the work analysis result output from the recognition result analysis unit 250 to at least one of the output devices.
Specifically, for example, when the success/failure determination unit 251 of the recognition result analysis unit 250 outputs information on the determination result indicating whether or not the prescribed work has been performed, the output control unit 260 causes the display 151 to display the determination result with ○ and ×. Alternatively, the output control unit 260 may cause the speaker 152 to output a sound corresponding to the determination result, or may cause the storage device 153 to store the determination result. The output control unit 260 may also transmit different control signals to the control device 154 according to the determination result.
Also, for example, when the type identification unit 252 of the recognition result analysis unit 250 outputs work type information, the output control unit 260 causes the display 151 to display a video indicating the work type. Alternatively, the output control unit 260 may cause the speaker 152 to output a sound corresponding to the work type, or may cause the storage device 153 to store the work type. The output control unit 260 may also transmit different control signals to the control device 154 according to the work type.
Also, for example, when the content comparison unit 253 of the recognition result analysis unit 250 outputs information on the comparison result of the work, the output control unit 260 causes the display 151 to display a video indicating the comparison result. Alternatively, the output control unit 260 may cause the speaker 152 to output a sound or voice indicating the missing work, or may cause the storage device 153 to store the comparison result. The output control unit 260 may also transmit different control signals to the control device 154 according to the comparison result.
Also, for example, when the condition determination unit 254 of the recognition result analysis unit 250 outputs information on the determination result indicating whether or not the work content satisfies the prescribed condition, the output control unit 260 causes the display 151 to display a message indicating, for example, that the work is too slow. Alternatively, the output control unit 260 may cause the speaker 152 to output a sound or voice indicating that the work is too slow, or may cause the storage device 153 to store the determination result. The output control unit 260 may also transmit different control signals to the control device 154 according to the determination result.
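The dispatch of an analysis result to the display 151, the speaker 152, the storage device 153, and the control device 154 could, for example, be organized as in the following hypothetical Python sketch; the device interfaces (show, play, append, send) and the result dictionary layout are assumptions made for illustration only.

```python
# Hypothetical sketch: dispatching one analysis result to the available outputs.
def output_analysis_result(result, display=None, speaker=None, storage=None, controller=None):
    """result: dict such as {"kind": "success_failure", "ok": False, "message": "..."}."""
    if display is not None:
        display.show("O" if result.get("ok") else "X", result.get("message", ""))
    if speaker is not None:
        speaker.play("chime_ok" if result.get("ok") else "chime_ng")
    if storage is not None:
        storage.append(result)                      # keep a log for later analysis
    if controller is not None:
        controller.send({"ok": result.get("ok")})   # control signal depending on the result

if __name__ == "__main__":
    class Console:                                  # stand-in for the display
        def show(self, mark, message):
            print(f"[display] {mark} {message}")
    log = []
    output_analysis_result({"kind": "success_failure", "ok": True, "message": "Well done"},
                           display=Console(), storage=log)
    print(log)
```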
The body part information storage unit 270 stores body part information regarding the body part 121 of the worker 120 acquired by the body part information acquisition unit 220.
The object information storage unit 280 stores object information related to the object 131 acquired by the object information acquisition unit 230.
Here, as shown in FIG. 2, the body part information storage unit 270 and the object information storage unit 280 are provided in the work recognition device 100; however, without being limited to this, the body part information storage unit 270 and the object information storage unit 280 may be provided outside the work recognition apparatus 100 in a location that the work recognition apparatus 100 can refer to.
The body part information storage unit 270 and the object information storage unit 280 may be a single storage medium.
FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the work recognition apparatus 100 according to Embodiment 1 of the present invention.
In Embodiment 1 of the present invention, the functions of the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 are realized by a processing circuit 601. That is, the work recognition apparatus 100 includes the processing circuit 601 for performing processing for recognizing the work performed by the worker 120 from the acquired sensor data and for performing control for outputting the recognition result and the like.
The processing circuit 601 may be dedicated hardware as shown in FIG. 6A or may be a CPU (Central Processing Unit) 606 that executes a program stored in the memory 607 as shown in FIG. 6B.
When the processing circuit 601 is dedicated hardware, the processing circuit 601 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
When the processing circuit 601 is the CPU 606, the functions of the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 are realized by software, firmware, or a combination of software and firmware. That is, the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 are realized by a processing circuit such as the CPU 606 or a system LSI (Large-Scale Integration) that executes programs stored in the HDD (Hard Disk Drive) 602, the memory 607, or the like. It can also be said that the programs stored in the HDD 602, the memory 607, or the like cause a computer to execute the procedures and methods of the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260. Here, the memory 607 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read-Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disk, a mini disk, a DVD (Digital Versatile Disc), or the like.
Note that the functions of the body part information acquisition unit 220, the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 may be realized partly by dedicated hardware and partly by software or firmware. For example, the function of the body part information acquisition unit 220 can be realized by the processing circuit 601 as dedicated hardware, and the functions of the object information acquisition unit 230, the association unit 240, the recognition result analysis unit 250, and the output control unit 260 can be realized by the processing circuit reading and executing programs stored in the memory 607.
The work recognition apparatus 100 also includes a communication device 605 that communicates with external devices such as a visible light camera 111, an infrared camera 112, a depth sensor 113, an acceleration sensor 114, a gyro sensor 115, a display 151, and a speaker 152.
The sensor data acquisition unit 210 constitutes an input interface device 603 that acquires sensor data from the visible light camera 111, the infrared camera 112, the depth sensor 113, the acceleration sensor 114, the gyro sensor 115, and the like using the communication device 605.
The work recognition apparatus 100 also includes an output interface device 604 for outputting the work recognition result and the like. The output control unit 260 communicates with the display 151, the speaker 152, and the like using the communication device 605, and outputs the work recognition result and the like to the display 151, the speaker 152, and the like using the output interface device 604.
The body part information storage unit 270 and the object information storage unit 280 use, for example, the HDD 602. This is only an example, and the body part information storage unit 270 and the object information storage unit 280 may be configured by a DVD, a memory 607, and the like.
Further, the work recognition device 100 may be configured to have an auxiliary storage device (not shown) for storing work recognition results and the like.
FIG. 7 is a diagram illustrating an example of a hardware configuration of the control device 154 according to the first embodiment.
The control device 154 includes a communication device 703 that communicates with the work recognition device 100, and uses the communication device 703 to receive the work recognition result and the like transmitted by the work recognition device 100 via the communication device 605.
The control device 154 includes a CPU 701 that generates a command signal for operating a driving device (not shown) such as a servo.
In addition, the control device 154 includes a memory 702 that temporarily stores a work recognition result received using the communication device 703 and a command signal for operating a driving device such as a servo.
The communication device 703 receives a work recognition result from the work recognition device 100 and transmits a command signal for operating a drive device such as a servo.
The operation of the work recognition apparatus 100 according to the first embodiment will be described.
FIG. 8 is a flowchart for explaining the operation of the work recognition apparatus 100 according to Embodiment 1 of the present invention.
Sensor data acquisition section 210 acquires sensor data from at least one of visible light camera 111, infrared camera 112, depth sensor 113, acceleration sensor 114, and gyro sensor 115 (step ST801).
The sensor data acquisition unit 210 outputs the acquired sensor data to the body part information acquisition unit 220 and the object information acquisition unit 230.
Body part information acquisition section 220 acquires body part information related to body part 121 of worker 120 based on the sensor data output from sensor data acquisition section 210 (step ST802).
FIG. 9 is a flowchart illustrating details of the operation of body part information acquisition section 220 in step ST802 of FIG.
The body part detection unit 221 of the body part information acquisition unit 220 acquires the already acquired body part information from the body part information storage unit 270 (step ST901).
Based on the body part information acquired from the body part information storage unit 270, the body part detection unit 221 determines whether each body part 121 of the worker 120 has already been detected in the previous frame of the optical image, infrared image, or depth image included in the sensor data output from the sensor data acquisition unit 210 (step ST902). The body part detection unit 221 may determine the previous frame from the position information acquisition date and time included in the body part information and the current date and time.
In step ST902, when the part 121 was not detected in the previous frame (in the case of “NO” in step ST902), the body part detection unit 221 detects the body part 121 of the worker 120 based on the sensor data output from the sensor data acquisition unit 210 (step ST903), and when the body part 121 of the worker 120 can be detected (in the case of “YES” in step ST904), acquires the position coordinates of the part 121 (step ST907).
Then, the body part detection unit 221 stores the acquired position coordinate information of the body part 121 of the worker 120 in the body part information storage unit 270 in association with the acquisition date information.
When the body part 121 of the worker 120 cannot be detected (in the case of “NO” in step ST904), the process proceeds to step ST803 in FIG.
In step ST902, when the part 121 had already been detected in the previous frame (in the case of “YES” in step ST902), the body part tracking unit 222 of the body part information acquisition unit 220 tracks the body part 121 of the worker 120 whose position coordinates were acquired by the body part detection unit 221 in the previous frame (step ST905), and when the body part 121 of the worker 120 can be tracked (in the case of “YES” in step ST906), acquires the position coordinates of the part 121 after the movement (step ST907).
Then, the body part tracking unit 222 stores the acquired position coordinate information in the body part information storage unit 270 in association with the acquisition date information.
If the body part 121 of the worker 120 cannot be tracked (in the case of “NO” in step ST906), the process proceeds to step ST803 in FIG.
The body part recognition unit 223 recognizes the type, shape, or state of the body part 121 of the worker 120 whose position coordinates were acquired by the body part detection unit 221 and stored in the body part information storage unit 270 (step ST908).
As described above, the series of processing from step ST901 to step ST908 shown in FIG. 9 is the processing by which the body part information acquisition unit 220 acquires body part information for one frame of the optical image, infrared image, or depth image included in the sensor data output from the sensor data acquisition unit 210.
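One possible way to organize this per-frame detect-or-track flow (steps ST901 to ST908) in software is sketched below in Python; the detector, tracker, and recognizer are treated as injected callables, and all names and record fields are assumptions made for illustration rather than part of the disclosed implementation.

```python
# Hypothetical sketch of the per-frame body part acquisition flow:
# detect parts when nothing was seen in the previous frame, otherwise track them.
import time

def acquire_body_parts_for_frame(frame, store, detect, track, recognize):
    """store: dict part_id -> last known record;
    detect(frame) -> {part_id: (x, y, z)} for newly found parts;
    track(frame, last_xyz) -> new (x, y, z), or None if the part was lost;
    recognize(frame, xyz) -> e.g. 'right_hand open'."""
    now = time.time()
    if not store:                                    # nothing detected in the previous frame
        for part_id, xyz in detect(frame).items():
            store[part_id] = {"xyz": xyz, "t": now, "label": recognize(frame, xyz)}
        return store
    for part_id, rec in list(store.items()):         # previously detected: track each part
        new_xyz = track(frame, rec["xyz"])
        if new_xyz is None:
            continue                                  # lost this frame; keep the old record
        store[part_id] = {"xyz": new_xyz, "t": now, "label": recognize(frame, new_xyz)}
    return store

if __name__ == "__main__":
    store = {}
    acquire_body_parts_for_frame(
        frame=None, store=store,
        detect=lambda f: {"right_hand": (0.1, 0.2, 0.3)},
        track=lambda f, xyz: (xyz[0] + 0.01, xyz[1], xyz[2]),
        recognize=lambda f, xyz: "right_hand",
    )
    print(store)
```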
Returning to the flowchart of FIG. 8.
Object information acquisition section 230 acquires object information related to object 131 used by operator 120 for work based on the sensor data output from sensor data acquisition section 210 (step ST803).
FIG. 10 is a flowchart illustrating details of the operation of the object information acquisition unit 230 in Step ST803 of FIG.
The object detection unit 231 of the object information acquisition unit 230 acquires already acquired object information from the object information storage unit 280 (step ST1001).
Based on the object information acquired from the object information storage unit 280, the object detection unit 231 determines, for each object 131, whether the object 131 has already been detected in the previous frame of the optical image, infrared image, or depth image of the sensor data output from the sensor data acquisition unit 210 (step ST1002). The object detection unit 231 may determine the previous frame from the position information acquisition date and time included in the object information and the current date and time.
In step ST1002, when the object 131 was not detected in the previous frame (in the case of “NO” in step ST1002), the object detection unit 231 detects the object 131 based on the sensor data output from the sensor data acquisition unit 210 (step ST1003), and when the object 131 can be detected (in the case of “YES” in step ST1004), acquires the position coordinates of the object 131 (step ST1007).
Then, the object detection unit 231 stores the acquired position coordinate information of the object 131 in the object information storage unit 280 in association with the acquisition date information.
When the object 131 cannot be detected (in the case of “NO” in step ST1004), the process proceeds to step ST804 in FIG.
In step ST1002, when the object 131 had already been detected in the previous frame (in the case of “YES” in step ST1002), the object tracking unit 232 of the object information acquisition unit 230 tracks the object 131 whose position coordinates were acquired by the object detection unit 231 in the previous frame (step ST1005), and when the object 131 can be tracked (in the case of “YES” in step ST1006), acquires the position coordinates of the object 131 after the movement (step ST1007).
The object tracking unit 232 stores the acquired position coordinate information in the object information storage unit 280 in association with the acquisition date information.
When the object 131 cannot be tracked (in the case of “NO” in step ST1006), the process proceeds to step ST804 in FIG. 8.
The object recognition unit 233 recognizes the type, shape, or state of the object 131 whose position coordinates were acquired by the object detection unit 231 and stored in the object information storage unit 280 (step ST1008).
As described above, the series of processing from step ST1001 to step ST1008 shown in FIG. 10 is the processing by which the object information acquisition unit 230 acquires object information for one frame of the optical image, infrared image, or depth image included in the sensor data output from the sensor data acquisition unit 210.
Returning to the flowchart of FIG. 8.
The control unit (not shown) of the work recognition apparatus 100 determines whether the work by the worker 120 has ended (step ST804). For example, the control unit may determine that the work has ended when the worker 120 can no longer be detected in the video acquired from the sensor. Alternatively, the control unit may determine that the work has ended when it receives a work end notification input by the worker 120 from an input device (not shown). Note that these are merely examples, and it is sufficient that the control unit can determine by some means whether the work of the worker 120 has ended.
If it is determined in step ST804 that the work has not ended (in the case of “NO” in step ST804), the process returns to step ST801, and the processes in steps ST801 to ST803 are repeated.
If it is determined in step ST804 that the work has ended (in the case of “YES” in step ST804), the associating unit 240 associates, for all frames of the optical image, infrared image, or depth image of the sensor data acquired in step ST801 until the work ended, the object 131 with the body part 121 of the worker 120 that performed the work using the object 131 (step ST805). Specifically, the associating unit 240 performs the association based on the body part information regarding the body part 121 of the worker 120 that the body part information acquisition unit 220 stored in the body part information storage unit 270 in step ST802, and the object information regarding the object 131 that the object information acquisition unit 230 stored in the object information storage unit 280 in step ST803.
The associating unit 240 outputs the result of associating the object 131 with the body part 121 of the worker 120 who performed the work using the object 131 to the recognition result analyzing unit 250.
FIG. 11 is a flowchart illustrating details of the operation of the associating unit 240 in step ST805 of FIG.
The score calculation unit 241 of the associating unit 240 calculates, for each combination of a body part 121 of the worker 120 and an object 131, an association score indicating which body part 121 is most likely to be related to the movement of the object 131 (step ST1101). Specifically, the score calculation unit 241 calculates the association score using the body part information regarding the body part 121 of the worker 120 that the body part information acquisition unit 220 stored in the body part information storage unit 270 and the object information regarding the object 131 that the object information acquisition unit 230 stored in the object information storage unit 280. That is, the score calculation unit 241 calculates, for each combination of a body part 121 of the worker 120 and an object 131, an association score representing the degree of association between the body part 121 and the object 131.
The score calculation unit 241 outputs the calculated association score information to the matching unit 242 of the association unit 240.
 整合部242は、スコア算出部241で算出した関連付けスコアを用いて、作業としての整合が取れる範囲で、関連付けスコアを最大化するような、作業者120の体の部位121と物体131との組み合わせを決定する(ステップST1102)。
 整合部242は、決定した、作業者120の体の部位121と物体との組み合わせを関連付け部240の位置補正部243に出力する。
 位置補正部243は、整合部242で決定した、作業者120の体の部位121と物体131との組み合わせを用いて、作業者120の体の部位121または物体131、あるいはその両方の位置座標を補正する(ステップST1103)。
Using the association scores calculated by the score calculation unit 241, the matching unit 242 determines the combinations of the body parts 121 of the worker 120 and the objects 131 that maximize the association scores within a range in which the result is consistent as a work (step ST1102).
The matching unit 242 outputs the determined combinations of the body parts 121 of the worker 120 and the objects 131 to the position correction unit 243 of the associating unit 240.
Using the combinations of the body parts 121 of the worker 120 and the objects 131 determined by the matching unit 242, the position correction unit 243 corrects the position coordinates of the body part 121 of the worker 120, the object 131, or both (step ST1103).
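The embodiment leaves both the definition of the association score and the optimization method open. The following Python sketch is one simple, hypothetical realization: each (body part, object) pair is scored by how closely their trajectories stay together, and a greedy selection then picks a consistent combination with a high total score. A proper assignment solver could be substituted for the greedy step, and the proximity-based score is only one of many possible choices.

```python
# Hypothetical sketch: trajectory-proximity association score and greedy matching.
def association_score(part_traj, obj_traj):
    """Trajectories: lists of (x, y, z) sampled at the same frames.
    A higher score means the part and the object stayed closer together."""
    dists = [sum((p - o) ** 2 for p, o in zip(pp, oo)) ** 0.5
             for pp, oo in zip(part_traj, obj_traj)]
    return 1.0 / (1.0 + sum(dists) / len(dists))

def associate(part_trajs, obj_trajs):
    """part_trajs / obj_trajs: dicts id -> trajectory. Returns {object_id: part_id}."""
    pairs = sorted(((association_score(pt, ot), pid, oid)
                    for pid, pt in part_trajs.items()
                    for oid, ot in obj_trajs.items()), reverse=True)
    used_parts, result = set(), {}
    for score, pid, oid in pairs:
        if oid in result or pid in used_parts:
            continue   # keep the combination consistent: here, one part per object segment
        result[oid] = pid
        used_parts.add(pid)
    return result

if __name__ == "__main__":
    parts = {"left_hand": [(0, 0, 0), (1, 0, 0), (2, 0, 0)],
             "right_hand": [(5, 5, 0), (5, 5, 0), (5, 5, 0)]}
    objects = {"m1": [(0.1, 0, 0), (1.1, 0, 0), (2.1, 0, 0)]}
    print(associate(parts, objects))   # {'m1': 'left_hand'}
```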
Returning to the flowchart of FIG. 8.
The recognition result analysis unit 250 analyzes the recognition result of the entire work based on the association result output by the association unit 240 (step ST806).
The recognition result analysis unit 250 outputs to the output control unit 260 information regarding the recognition result of the entire work, which is the association information output from the association unit 240, or the analysis result obtained by analyzing the recognition result.
The output control unit 260 outputs the recognition result of the entire work output by the recognition result analysis unit 250 to at least one of the output devices.
When the output control unit 260 receives an analysis result output instruction, the output control unit 260 outputs the work analysis result obtained using the recognition result of the entire work output by the recognition result analysis unit 250 to at least one of the output devices (step ST807).
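Putting steps ST801 to ST807 together, the overall flow of FIG. 8 could be organized as in the hypothetical Python sketch below, where each unit described above is represented by an injected callable; the function names and signatures are assumptions for illustration, not part of the disclosed configuration.

```python
# Hypothetical sketch of the overall flow of FIG. 8 (steps ST801 to ST807).
def run_work_recognition(get_sensor_data, work_finished,
                         acquire_body_parts, acquire_objects,
                         associate, analyze, output):
    body_store, object_store = {}, {}
    while True:
        frame = get_sensor_data()                       # ST801
        acquire_body_parts(frame, body_store)           # ST802
        acquire_objects(frame, object_store)            # ST803
        if work_finished(frame):                        # ST804
            break
    association = associate(body_store, object_store)  # ST805
    analysis = analyze(association)                     # ST806
    output(association, analysis)                       # ST807
    return association, analysis

if __name__ == "__main__":
    frames = iter(range(3))
    run_work_recognition(
        get_sensor_data=lambda: next(frames),
        work_finished=lambda f: f == 2,
        acquire_body_parts=lambda f, s: s.setdefault("right_hand", []).append(f),
        acquire_objects=lambda f, s: s.setdefault("m1", []).append(f),
        associate=lambda b, o: {"m1": "right_hand"},
        analyze=lambda a: {"ok": True},
        output=lambda a, r: print(a, r),
    )
```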
FIGS. 12 to 14 are diagrams for explaining examples of the output of the recognition result of the entire work or of the work analysis result by the output control unit 260 in the first embodiment. FIGS. 12 to 14 show examples in which the output control unit 260 causes the display 151 to output the recognition result of the entire work or the work analysis result as a video.
FIG. 12 shows an output example in which, based on the result of the recognition result analysis unit 250 recognizing a work of assembling by moving objects 131 with the left and right hands in the first embodiment, the output control unit 260 causes the display 151 to display the position coordinates of the objects 131 and the left and right hands and the timings at which the objects were moved.
On the left side of the drawing of FIG. 12, as the work recognition result, the three-dimensional position coordinates of each object 131 in a certain frame are converted into two-dimensional position coordinates viewed from vertically above, and an icon of each object 131 is displayed. The certain frame is an arbitrary frame; in FIG. 12, as an example, the frame at the time when the work of moving the object 131 with the left hand was completed is shown.
On the right side of the drawing of FIG. 12, the timing at which the object 131 is associated with the left and right hands, that is, the timing at which the object 131 is moved is displayed.
In FIG. 12, as an example, the time axis is taken in the vertical direction of the drawing, with 0 as the work start time. In this way, the output control unit 260 can display, for the work performed by the worker 120, the recognition result at a certain point in time (see the left side of the drawing in FIG. 12) together with the recognition result of the flow of the entire work over time (see the right side of the drawing in FIG. 12).
The output control unit 260 receives information on the recognition result of the work of the worker 120 from the recognition result analysis unit 250, and displays the time interval during which each object 131 was moved together with a colored rectangle indicating that object 131. In FIG. 12, the objects 131 are distinguished not by color coding but by differences in dot density.
The left side of the drawing of FIG. 12 displays the work recognition result in a certain frame, and the output control unit 260 can display this work recognition result in synchronization with the display, shown on the right side of the drawing of FIG. 12, of the flow of the entire work over time.
Specifically, for example, when the user designates a certain time on the time axis of the work analysis result displayed on the right side of the drawing of FIG. 12, the output control unit 260 can update the display on the left side of the drawing of FIG. 12 so that the work recognition result in the frame at the designated time is displayed. This allows the work content to be visualized with higher accuracy. In FIG. 12, it is assumed that the display of the work recognition result about 12 seconds after the start of the work has been designated.
At this time, the output control unit 260 may also indicate the relationship between the icon of an object 131 displayed on the left side of the drawing of FIG. 12 and the corresponding rectangle indicating that object 131 displayed on the right side of the drawing of FIG. 12 by connecting them with a line.
The work recognition result shown in FIG. 12 will be described in detail.
FIG. 12 shows a situation in which there are a total of six objects 131 on the desk: two objects m (m1 and m2 in FIG. 12), two objects n (n1 and n2 in FIG. 12), and one object o.
The trajectory x indicates that the worker 120 grasped the object m1 with the left hand (L in FIG. 12) and moved it toward the right of the screen. The object o lying along the way of the movement of the object m1 happened to be placed on the trajectory x of the object m1 and has nothing to do with the work of the worker 120. The object o does not appear in the display on the right side of the drawing because it is unrelated to the work of the worker 120; the worker 120 moved it with neither the right hand (R in FIG. 12) nor the left hand.
In FIG. 12, the trajectory y is the trajectory along which the worker 120 moved the object m2 slightly before the time shown on the left side of the drawing. Looking at the right side of the drawing, it can be seen that the object m2 was moved with the right hand within 10 seconds after the start of the work.
In FIG. 12, a specification is applied in which the trajectories x and y remain displayed for several seconds after the movement of the objects m is completed and then disappear. The time for which the trajectories x and y remain displayed can be set as appropriate.
In this way, when the work recognition by the recognition result analysis unit 250 associates a certain object 131 with the left hand in a frame, that is, when it is recognized that the object 131 was moved with the left hand, the output control unit 260 receives the recognition result and displays the left hand icon superimposed on the icon of that object. This is merely an example, and the same applies when an object 131 is associated with the right hand, and also when body parts other than the left and right hands are used.
When an object 131 has moved, the output control unit 260 displays the trajectory of the position coordinates of the object 131 as a line. The output control unit 260 may keep the trajectory line displayed, or may erase it, that is, stop displaying it, after a certain time has elapsed.
Further, the output control unit 260 may display the object 131 and the three-dimensional position coordinates of the body part 121 of the worker 120 as they are three-dimensionally.
Note that FIG. 12 shows an example in which the recognition result at a certain point in time (left side of the drawing in FIG. 12) and the timings at which the objects 131 were associated with the left and right hands (right side of the drawing in FIG. 12) are displayed together; however, the display is not limited to this, and only one of the two may be displayed.
FIG. 13 shows an output example in which, in the first embodiment, the recognition result analysis unit 250 analyzes the work content based on the result of recognizing the work of assembling by moving objects 131 with the left and right hands, and the output control unit 260 causes the display 151 to display the analysis result.
In FIG. 13, in addition to the display of the object movement timings as shown on the right side of the drawing of FIG. 12, the output control unit 260 displays information on the analysis result that the recognition result analysis unit 250 obtained using the information on the work recognition result. Instead of the display of the object movement timings as shown on the right side of the drawing of FIG. 12, only the information on the analysis result as shown in FIG. 13 may be displayed.
In FIG. 13, as an example, it is assumed that the worker 120 placed two objects n3 and n4 on a base using the right hand (R in FIG. 13) and the left hand (L in FIG. 13), respectively, then placed objects m3 and m4 on the objects n3 and n4 using the right hand or the left hand, and finally, using the right hand, placed an object q on the objects m3 and m4 so as to bridge the objects m3 and m4.
In FIG. 13, it is also assumed that, for the work described above, the recognition result analysis unit 250 calculated the total time during which the objects n3, n4, m3, m4, and q were moved with the left and right hands as 8.0 seconds and the time from the start to the end of the work as 17.3 seconds, and that the output control unit 260 displayed these calculation results.
Note that the calculation of the total time during which the objects 131 were moved with the left and right hands and of the time from the start to the end of the work may be performed by any of the success/failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254 of the recognition result analysis unit 250; the recognition result analysis unit 250 may calculate, from the times associated with the frames of the video, the total time during which the objects 131 were moved with the left and right hands and the time from the start to the end of the work.
In FIG. 13, as an example, it is also assumed that the recognition result analysis unit 250 calculated the sum of the distances over which the objects n3, n4, m3, m4, and q were moved with the left and right hands as 254 cm, and that, receiving this calculation result, the output control unit 260 displayed the total distance calculated by the recognition result analysis unit 250.
The total distance may also be calculated by any of the success/failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254 of the recognition result analysis unit 250; the recognition result analysis unit 250 may calculate the total distance from the position coordinates of the objects 131.
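Assuming the association information is reduced to frame-stamped movement records, the totals shown in FIG. 13 (total handling time and total moved distance) could be derived as in the following illustrative Python sketch; the record layout is an assumption made here and is not specified by the embodiment.

```python
# Hypothetical sketch: deriving the totals of FIG. 13 from movement records.
def movement_totals(records):
    """records: list of dicts like
    {"object": "n3", "hand": "R", "t_start": 1.0, "t_end": 2.4,
     "path": [(x, y, z), ...]}   (one record per moved object)."""
    total_time = sum(r["t_end"] - r["t_start"] for r in records)
    total_distance = 0.0
    for r in records:
        pts = r["path"]
        # sum the Euclidean distances between consecutive path points
        total_distance += sum(
            sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
            for p, q in zip(pts, pts[1:]))
    return total_time, total_distance

if __name__ == "__main__":
    recs = [{"object": "n3", "hand": "R", "t_start": 0.0, "t_end": 1.5,
             "path": [(0, 0, 0), (30, 0, 0)]},
            {"object": "m3", "hand": "L", "t_start": 2.0, "t_end": 3.2,
             "path": [(0, 0, 0), (0, 40, 0)]}]
    print(movement_totals(recs))   # approximately (2.7, 70.0)
```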
In FIG. 13, as an example, the success/failure determination unit 251 has determined that the prescribed work was performed in the prescribed order, the content comparison unit 253 has determined that no wrong object 131 was used and that no unnecessary work is included, and the condition determination unit 254 has determined that the work is not slower than the prescribed time; receiving these results, the output control unit 260 displays “◎ Well done!”.
This is merely an example; what is displayed according to each determination result of the success/failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254 included in the recognition result analysis unit 250 may be set as appropriate in advance, and the output control unit 260 may display the analysis result according to the conditions thus set.
In addition, the determination results of the success / failure determination unit 251, the type identification unit 252, the content comparison unit 253, and the condition determination unit 254 may be individually displayed.
For example, when the type identification unit 252 determines from the work recognition result that the work corresponds to “assembly work W1” among the prescribed work types, the output control unit 260 may receive the analysis result and display the heading “Assembly work W1”.
Also, for example, when the success/failure determination unit 251 determines that the prescribed work has been performed, the output control unit 260 may receive the analysis result and display “○ (success)”.
Also, for example, when the content comparison unit 253 compares the work with other work contents and determines that the assembly of a certain part has not been carried out, the output control unit 260 may receive the analysis result and display “Part Z assembly not performed”.
By displaying the work analysis result in this way, the worker 120 or the like can easily grasp, simply by checking the display, whether the work was performed correctly and, if it was not, which points were wrong.
FIG. 14, like FIG. 13, shows an output example in which, in the first embodiment, the recognition result analysis unit 250 analyzes the work content based on the result of recognizing the work of assembling by moving objects 131 with the left and right hands, and the output control unit 260 causes the display 151 to display the analysis result.
While FIG. 13 shows an example in which the recognition result analysis unit 250 determined that the work was performed as prescribed, FIG. 14 shows an example in which the recognition result analysis unit 250 analyzed the work content and determined that the work was not performed as prescribed. Specifically, it shows an example in which the content comparison unit 253 of the recognition result analysis unit 250 compared the work with other reference work and determined that the work was performed using a wrong object 131 of a different color.
It also shows an example in which the condition determination unit 254 of the recognition result analysis unit 250 determined that the work using a certain object 131 was too slow.
Specifically, in FIG. 14, as an example, it is assumed that the worker 120 performed the work in the following order: moving the object n5 with the right hand (R in FIG. 14), moving the object r with the left hand (L in FIG. 14), moving the object m5 with the right hand, and moving the object m6 with the left hand. In this work, in the step of moving the object r with the left hand, the worker 120 should originally have moved the object n6 with the left hand but mistakenly moved the object r, which has a different color.
It is assumed that the content comparison unit 253 of the recognition result analysis unit 250 compared the work with other reference work and determined that the work was performed using the wrong object r of a different color.
It is also assumed that the condition determination unit 254 of the recognition result analysis unit 250 determined that the work using the object m5 was too slow.
Receiving the analysis result by the content comparison unit 253 that the work was performed using the object r of a different color, the output control unit 260 displays “× wrong color” at the location indicating the corresponding work so that the mistaken work can be recognized.
Also, receiving the analysis result by the condition determination unit 254 that the work using the object m5 was too slow, the output control unit 260 displays “△ work is slow” at the location indicating the corresponding work so that the work determined to be too slow can be recognized.
Note that the display example shown in FIG. 14 is merely an example; it is sufficient that the output control unit 260 causes the display 151 to display, based on a preset method, the mistaken work and the content of the mistake, or the fact that the work time exceeds the reference time, in a recognizable manner.
As described above, based on the work recognition result output from the recognition result analysis unit 250, or on the information on the analysis result obtained by analyzing the work recognition result, the output control unit 260 causes the display 151 to display the work recognition result or the analysis result as appropriate.
Thereby, the worker 120 or the like can grasp the work content in detail by checking the content displayed on the display 151.
In the above description, an example was given in which the output control unit 260 causes the display 151 to display the work recognition result or the analysis result information output from the recognition result analysis unit 250; however, the output of the work recognition result or of the information on the analysis result by the output control unit 260 is not limited to this.
For example, when the success/failure determination unit 251 of the recognition result analysis unit 250 determines that the prescribed work has not been performed, the output control unit 260 can cause the speaker 152 to output a preset sound indicating that the prescribed work has not been performed; in this way, the work recognition result or the analysis result information output from the recognition result analysis unit 250 can be output from the speaker 152 as a sound, a voice, or the like.
In addition, the output control unit 260 can cause the storage device 153 to store, together with the determination result, information on the work for which it was determined that the prescribed work had not been performed.
By storing the work result information in the storage device 153, the worker 120 or the like can analyze, from the stored data, the cause of the determination that the prescribed work was not performed.
Further, the output control unit 260 may transmit a control signal to the control device 154 to the effect that it is determined that the prescribed work has not been performed.
The control device 154 receives the control signal, and performs control that helps to perform the prescribed work, for example, downloads a manual and provides it to the work recognition device 100.
As described above, the output control unit 260 can transmit a control signal corresponding to the analysis result of the recognition result analysis unit 250 to the control device 154, and cause the control device 154 to perform control according to the analysis result.
As described above, according to the first embodiment, the work recognition apparatus includes the sensor data acquisition unit 210 that acquires sensor data; the body part information acquisition unit 220 that detects the body part 121 of the worker 120 based on the sensor data acquired by the sensor data acquisition unit 210 and acquires body part information regarding the body part 121 of the worker 120; the object information acquisition unit 230 that detects the object 131 based on the sensor data acquired by the sensor data acquisition unit 210 and acquires object information regarding the object 131; the association unit 240 that associates, based on the body part information acquired by the body part information acquisition unit 220 and the object information acquired by the object information acquisition unit 230, the object 131 with the body part 121 of the worker 120 that performed the work using the object 131; and the recognition result analysis unit 250 that recognizes the work performed by the worker 120 based on the association information regarding the association result obtained by the association unit 240. Therefore, compared with the conventional technique that obtains the movement trajectory of the body part 121 of the worker 120 or of the object 131 alone without considering the correspondence between the object 131 and the body part 121 of the worker 120, the work of the worker 120 can be recognized with high accuracy even when the position of the body part 121 or the object 131 cannot be partly estimated, or when the estimation error between the estimated movement trajectory of the part 121 or the object 131 and the actual movement trajectory is large.
In the present invention, any component of the embodiment can be modified or any component of the embodiment can be omitted within the scope of the invention.
Since the work recognition device according to the present invention is configured to recognize, with high accuracy, the content of work performed by a worker within a series of operations involving the worker's body parts and objects, it can be applied to work recognition devices and the like that recognize the content of work performed by a worker.
100 work recognition device, 111 visible light camera, 112 infrared camera, 113 depth sensor, 114 acceleration sensor, 115 gyro sensor, 120 worker, 121 body part, 131 object, 151 display, 152 speaker, 153 storage device, 154 control device, 210 sensor data acquisition unit, 220 body part information acquisition unit, 221 body part detection unit, 222 body part tracking unit, 223 body part recognition unit, 230 object information acquisition unit, 231 object detection unit, 232 object tracking unit, 233 object recognition unit, 240 association unit, 241 score calculation unit, 242 matching unit, 243 position correction unit, 250 recognition result analysis unit, 251 success/failure determination unit, 252 type identification unit, 253 content comparison unit, 254 condition determination unit, 260 output control unit, 270 body part information storage unit, 280 object information storage unit, 601 processing circuit, 602 HDD, 603 input interface device, 604 output interface device, 605, 703 communication device, 606, 701 CPU, 607, 702 memory.

Claims (17)

1.  A work recognition device comprising:
    a sensor data acquisition unit for acquiring sensor data;
    a body part information acquisition unit for detecting a body part of a worker based on the sensor data acquired by the sensor data acquisition unit and acquiring body part information on the body part of the worker;
    an object information acquisition unit for detecting an object based on the sensor data acquired by the sensor data acquisition unit and acquiring object information on the object;
    an association unit for associating, based on the body part information acquired by the body part information acquisition unit and the object information acquired by the object information acquisition unit, the object with the body part of the worker that performed a work using the object; and
    a recognition result analysis unit for recognizing the work performed by the worker based on association information on an association result obtained by the association unit.
2.  The work recognition device according to claim 1, further comprising an output control unit for outputting information based on the association information recognized by the recognition result analysis unit.
3.  The work recognition device according to claim 1, wherein the body part information acquisition unit comprises a body part detection unit for detecting the body part of the worker and acquiring position coordinates of the body part of the worker.
4.  The work recognition device according to claim 3, wherein the body part information acquisition unit comprises a body part tracking unit for tracking the body part of the worker detected by the body part detection unit and acquiring position coordinates of the body part of the worker after movement.
5.  The work recognition device according to claim 3, wherein the body part information acquisition unit comprises a body part recognition unit for recognizing a type, a shape, or a state of the body part of the worker detected by the body part detection unit.
6.  The work recognition device according to claim 1, wherein the object information acquisition unit comprises an object detection unit for detecting the object and acquiring position coordinates of the object.
7.  The work recognition device according to claim 6, wherein the object information acquisition unit comprises an object tracking unit for tracking the object detected by the object detection unit and acquiring position coordinates of the object after movement.
8.  The work recognition device according to claim 6, wherein the object information acquisition unit comprises an object recognition unit for recognizing a type, a shape, or a state of the object detected by the object detection unit.
9.  The work recognition device according to claim 1, wherein the association unit comprises a score calculation unit for calculating, using the body part information acquired by the body part information acquisition unit and the object information acquired by the object information acquisition unit, an association score representing a degree of association between the body part of the worker and the object.
10.  The work recognition device according to claim 9, wherein the association unit comprises a matching unit for determining, using the association score calculated by the score calculation unit, a combination of the body part of the worker and the object in accordance with consistency of the work.
11.  The work recognition device according to claim 10, wherein the association unit comprises a position correction unit for correcting the position of the body part of the worker or the position of the object based on the combination of the body part of the worker and the object determined by the matching unit.
12.  The work recognition device according to claim 1, wherein the recognition result analysis unit comprises a success/failure determination unit for determining, based on the association information between the body part of the worker and the object associated by the association unit, whether a prescribed work has been performed.
13.  The work recognition device according to claim 1, wherein the recognition result analysis unit comprises a type identification unit for identifying, based on the association information between the body part of the worker and the object associated by the association unit, which of prescribed work types the work corresponds to.
14.  The work recognition device according to claim 1, wherein the recognition result analysis unit comprises a content comparison unit for analyzing, based on the association information between the body part of the worker and the object associated by the association unit, the content of the work performed by the worker by comparing the work performed by the worker with work performed by another worker.
15.  The work recognition device according to claim 1, wherein the recognition result analysis unit comprises a condition determination unit for determining, based on the association information between the body part of the worker and the object associated by the association unit, whether the content of the work performed by the worker satisfies a prescribed condition.
16.  The work recognition device according to claim 2, wherein the output control unit displays, as video, information based on the recognition result recognized by the recognition result analysis unit.
17.  A work recognition method comprising the steps of:
    acquiring, by a sensor data acquisition unit, sensor data;
    detecting, by a body part information acquisition unit, a body part of a worker based on the sensor data acquired by the sensor data acquisition unit, and acquiring body part information on the body part of the worker;
    detecting, by an object information acquisition unit, an object based on the sensor data acquired by the sensor data acquisition unit, and acquiring object information on the object;
    associating, by an association unit, based on the body part information acquired by the body part information acquisition unit and the object information acquired by the object information acquisition unit, the object with the body part of the worker that performed a work using the object; and
    recognizing, by a recognition result analysis unit, the work performed by the worker based on association information on an association result obtained by the association unit.
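The score calculation, matching, and position correction units recited in claims 9 to 11 can be illustrated, for the position correction part, with the following hedged sketch. It assumes one plausible realization (blending an unreliable object position toward the position of the associated body part) and is not the claimed method; the confidence weighting and the function name are invented for illustration.

```python
# Hypothetical sketch of a position correction step: when a tracked object is
# associated with a body part but its own position estimate is unreliable,
# blend its position toward that of the associated body part.
def correct_position(obj_pos, part_pos, obj_confidence):
    """Return a corrected (x, y) position for the object.

    obj_pos / part_pos : (x, y) estimates of the object and associated body part
    obj_confidence     : 0.0 (unreliable track) .. 1.0 (fully trusted track)
    """
    w = max(0.0, min(1.0, obj_confidence))
    return (w * obj_pos[0] + (1.0 - w) * part_pos[0],
            w * obj_pos[1] + (1.0 - w) * part_pos[1])


# Example: a weak screwdriver track (confidence 0.2) is pulled toward the hand
# it is associated with.
print(correct_position((0.60, 0.40), (0.44, 0.57), 0.2))
```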
PCT/JP2016/083243 2016-11-09 2016-11-09 Work recognition device and work recognition method WO2018087844A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2016/083243 WO2018087844A1 (en) 2016-11-09 2016-11-09 Work recognition device and work recognition method
JP2018546920A JP6444573B2 (en) 2016-11-09 2016-11-09 Work recognition device and work recognition method
TW106107761A TW201818297A (en) 2016-11-09 2017-03-09 Work recognition device and work recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/083243 WO2018087844A1 (en) 2016-11-09 2016-11-09 Work recognition device and work recognition method

Publications (1)

Publication Number Publication Date
WO2018087844A1 true WO2018087844A1 (en) 2018-05-17

Family

ID=62110224

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/083243 WO2018087844A1 (en) 2016-11-09 2016-11-09 Work recognition device and work recognition method

Country Status (3)

Country Link
JP (1) JP6444573B2 (en)
TW (1) TW201818297A (en)
WO (1) WO2018087844A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020107071A (en) * 2018-12-27 2020-07-09 日本放送協会 Object tracking device and program thereof
JP2020129287A (en) * 2019-02-08 2020-08-27 コニカミノルタ株式会社 Process information acquisition system, process information acquisition method, and process information acquisition program
JP2021015507A (en) * 2019-07-12 2021-02-12 マツダ株式会社 Body motion acquisition device and method
US20210133444A1 (en) * 2019-11-05 2021-05-06 Hitachi, Ltd. Work recognition apparatus
EP3929837A1 (en) * 2020-06-22 2021-12-29 Denso Corporation Work content analyzing apparatus, work content analyzing method, program, and sensor
US20220083769A1 (en) * 2020-09-14 2022-03-17 Kabushiki Kaisha Toshiba Work estimation apparatus, method and non-transitory computer-readable storage medium
DE112019007626T5 (en) 2019-09-18 2022-04-28 Mitsubishi Electric Corporation WORKING ELEMENT ANALYZER AND WORKING ELEMENT ANALYZING METHOD

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019200560A (en) * 2018-05-16 2019-11-21 パナソニックIpマネジメント株式会社 Work analyzing device and work analyzing method
JP2023074948A (en) 2021-11-18 2023-05-30 オムロン株式会社 Work recognition apparatus, work recognition method, and work recognition program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005250990A (en) * 2004-03-05 2005-09-15 Mitsubishi Electric Corp Operation support apparatus
JP2006252036A (en) * 2005-03-09 2006-09-21 Nippon Telegr & Teleph Corp <Ntt> Projection image creating device, program, projection image creating method, and image projection system
JP2007034738A (en) * 2005-07-27 2007-02-08 Advanced Telecommunication Research Institute International Warning system and warning method
JP2009123181A (en) * 2007-10-26 2009-06-04 Advanced Telecommunication Research Institute International Information presentation system
JP2013025478A (en) * 2011-07-19 2013-02-04 Panasonic Corp Work detection system
JP2013145419A (en) * 2012-01-13 2013-07-25 Hitachi Ltd Maintenance navigation system

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020107071A (en) * 2018-12-27 2020-07-09 日本放送協会 Object tracking device and program thereof
JP7198661B2 (en) 2018-12-27 2023-01-04 日本放送協会 Object tracking device and its program
JP2020129287A (en) * 2019-02-08 2020-08-27 コニカミノルタ株式会社 Process information acquisition system, process information acquisition method, and process information acquisition program
JP7139987B2 (en) 2019-02-08 2022-09-21 コニカミノルタ株式会社 Process information acquisition system, process information acquisition method, and process information acquisition program
JP2021015507A (en) * 2019-07-12 2021-02-12 マツダ株式会社 Body motion acquisition device and method
JP7375351B2 (en) 2019-07-12 2023-11-08 マツダ株式会社 Body motion acquisition device and method
DE112019007626T5 (en) 2019-09-18 2022-04-28 Mitsubishi Electric Corporation WORKING ELEMENT ANALYZER AND WORKING ELEMENT ANALYZING METHOD
US20210133444A1 (en) * 2019-11-05 2021-05-06 Hitachi, Ltd. Work recognition apparatus
EP3929837A1 (en) * 2020-06-22 2021-12-29 Denso Corporation Work content analyzing apparatus, work content analyzing method, program, and sensor
US20220083769A1 (en) * 2020-09-14 2022-03-17 Kabushiki Kaisha Toshiba Work estimation apparatus, method and non-transitory computer-readable storage medium

Also Published As

Publication number Publication date
JP6444573B2 (en) 2018-12-26
TW201818297A (en) 2018-05-16
JPWO2018087844A1 (en) 2018-12-27

Similar Documents

Publication Publication Date Title
JP6444573B2 (en) Work recognition device and work recognition method
JP6038417B1 (en) Robot teaching apparatus and robot control program creating method
US10737396B2 (en) Method and apparatus for robot path teaching
JP4961860B2 (en) Robot apparatus and control method of robot apparatus
JP5528151B2 (en) Object tracking device, object tracking method, and object tracking program
US11148299B2 (en) Teaching apparatus and teaching method for robots
US20140156125A1 (en) Autonomous electronic apparatus and navigation method thereof
JP2014516816A (en) Tracking and following of moving objects by mobile robot
JP2008009849A (en) Person tracking device
JP6499716B2 (en) Shape recognition apparatus, shape recognition method, and program
JP6075888B2 (en) Image processing method, robot control method
JP2018153874A (en) Presentation device, presentation method, program and work system
WO2019087638A1 (en) Information processing device and information processing method
JP5012589B2 (en) Human tracking system using image information
US20130054028A1 (en) System and method for controlling robot
US11199561B2 (en) System and method for standardized evaluation of activity sequences
JP2019159389A (en) Information processing device, information processing method, and recording medium
KR20150066845A (en) Process inspection device, method and system for assembling process in product manufacturing using depth map sensors
US10996235B2 (en) System and method for cycle duration measurement in repeated activity sequences
JP4198676B2 (en) Robot device, robot device movement tracking method, and program
US20180307302A1 (en) Electronic device and method for executing interactive functions
JP3953450B2 (en) 3D object posture operation method and program
KR101566964B1 (en) Method of monitoring around view tracking moving object, attaratus performing the same and storage media storing the same
US20200125066A1 (en) Measurement program selection assisting apparatus and measurement control apparatus
JP2012236266A (en) Robot control system, robot system, and program

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2018546920

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16921042

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16921042

Country of ref document: EP

Kind code of ref document: A1