WO2017017737A1 - Display method, monitor result output method, information processing device, and display program - Google Patents

Display method, monitor result output method, information processing device, and display program

Info

Publication number
WO2017017737A1
Authority
WO
WIPO (PCT)
Prior art keywords
work
image
motion
captured image
information
Prior art date
Application number
PCT/JP2015/071170
Other languages
French (fr)
Japanese (ja)
Inventor
伊藤 史
威彦 西村
一樹 髙橋
Original Assignee
Fujitsu Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Limited
Priority to PCT/JP2015/071170
Publication of WO2017017737A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]

Definitions

  • the present invention relates to a display method, a monitor result output method, an information processing apparatus, and a display program.
  • conventionally, a technique for making a person aware of the correct motion by displaying a model image and an actual image side by side has been known (see, for example, Non-Patent Documents 1 and 2).
  • in one aspect, the purpose is to enhance the effectiveness of learning the work.
  • in one embodiment, in a method for displaying a work situation imaged using an imaging device, a completed work among the works included in a predetermined work scenario is identified according to a motion detected based on a captured image.
  • the computer then executes a process of displaying an image stored in association with the work specified in the work scenario as the work following the completed work, superimposed on the captured image after the motion is detected.
  • FIG. 3 is a diagram illustrating a software configuration example of an information processing apparatus. FIG. 4 is a diagram (part 1) illustrating data structure examples of various information. FIG. 5 is a diagram (part 2) illustrating data structure examples of various information. FIG. 6 is a diagram illustrating a hardware configuration example of the information processing apparatus. FIG. 7 is a flowchart illustrating a processing example of work management.
  • FIG. 8 is a diagram (part 1) illustrating an example of a display screen. FIG. 9 is a flowchart illustrating a processing example of determining hand entry to and exit from a pallet. FIG. 10 is a diagram (part 2) illustrating an example of a display screen.
  • FIG. 11 is a diagram (part 3) illustrating an example of a display screen. FIG. 12 is a flowchart illustrating a processing example of expert image superimposition display. FIG. 13 is a flowchart illustrating a processing example of trunk axis adjustment. FIG. 14 is a diagram illustrating an example of acquiring a trunk axis. FIG. 15 is a diagram illustrating a screen example in which an expert's image is superimposed on a worker's image. FIG. 16 is a flowchart illustrating a processing example of timing adjustment. FIG. 17 is a flowchart illustrating a processing example of deviation evaluation. FIG. 18 is a diagram illustrating an example of applying an error range. FIG. 19 is a diagram illustrating an example of displaying or reporting a deviation.
  • FIG. 1 is a diagram illustrating an example of the appearance of a system according to an embodiment.
  • a motion sensing device 2 and a display / speaker 3 are connected to an information processing device 1 such as a PC (Personal Computer).
  • the motion sensing device 2 captures images with a plurality of pallets P arranged on a workbench and the worker H in its field of view, and detects the movement (motion) of the worker's hands and body parts (skeletal elements such as joints).
  • the pallet P accommodates parts and tools used for product assembly.
  • in the illustrated example, the motion sensing device 2 is arranged above and in front of the worker H, but its placement is not limited as long as the pallets P and the worker H can be kept in view.
  • the display / speaker 3 displays work instructions and alerts to the worker H, and outputs sound as necessary.
  • the display / speaker 3 may be provided with a touch panel.
  • FIG. 2 is a diagram showing an example of the appearance of the pallet P, and markers (reference objects) M including code information are pasted at positions such as four corners that are easily imaged from the motion sensing device 2.
  • the marker M is associated with information for designating an image region (a three-dimensional image region including depth information) used to determine whether or not the worker H has entered the corresponding pallet P.
  • an image area (pallet area), indicated by the broken line, is set with the marker M as a reference.
  • when the four corners of the pallet P are used as the reference, at least two markers M per pallet P need to be captured by the motion sensing device 2; however, if the orientation can be recognized from a marker M, the image area can also be set from a single marker M.
  • because the pallet area is automatically set based on the markers M, the pallet P need not be placed at an exact position, and a slight positional shift does not affect motion detection.
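  • as a concrete illustration, the pallet area derivation could look like the following minimal Python sketch; the function name, the margin parameter, and the coordinate convention are assumptions for illustration, not taken from the patent.

```python
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]  # (x, y, depth) in captured-image coordinates

def pallet_area_from_markers(markers: Dict[str, Point3D],
                             margin: float = 10.0) -> Tuple[Point3D, Point3D]:
    """Derive a 3D box (pallet area) from detected corner-marker positions.

    With two or more corner markers, the axis-aligned box they span,
    expanded by a small margin, approximates the pallet opening.
    """
    xs, ys, ds = zip(*markers.values())
    low = (min(xs) - margin, min(ys) - margin, min(ds) - margin)
    high = (max(xs) + margin, max(ys) + margin, max(ds) + margin)
    return low, high

# Two diagonally opposite markers of one pallet:
area = pallet_area_from_markers({"M1": (100.0, 400.0, 900.0),
                                 "M2": (220.0, 470.0, 960.0)})
```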
  • FIG. 3 is a diagram illustrating a software configuration example of the information processing apparatus 1.
  • the information processing apparatus 1 includes a motion sensing processing unit 11, a work management unit 12, an expert image superimposing display unit 13, and a work evaluation unit 14.
  • the motion sensing processing unit 11 has a function of computing, from the data stream of the captured image (color video) and the depth image (distance-from-viewpoint information associated with each pixel of the captured image) output by the motion sensing device 2, skeleton information (joint information) that includes the positions of human body parts such as joints, in addition to the captured image itself.
  • skeleton information provides, in real time, position information for parts such as "head", "shoulder center", "left shoulder", "right shoulder", "left elbow", "left wrist", "left palm", "right elbow", "right wrist", "right palm", "spine center", "hip center", "base of the left leg", "base of the right leg", "left knee", "left ankle", "right knee", "right ankle", "left toe", and "right toe". Note that when the function of the motion sensing processing unit 11 is included in the motion sensing device 2, the motion sensing processing unit 11 need not be provided in the information processing device 1.
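  • for illustration, one frame of such skeleton information could be represented as follows; the class and field names are assumptions for this sketch, while the joint names mirror the list above.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class SkeletonFrame:
    """One frame of skeleton information from the motion sensing processing unit.

    Each joint maps to its two-dimensional screen coordinates plus depth,
    matching the "two-dimensional coordinates and depth" used in the text.
    """
    timestamp: float
    joints: Dict[str, Tuple[float, float, float]]  # e.g. "right palm" -> (x, y, depth)

frame = SkeletonFrame(
    timestamp=0.033,
    joints={"head": (320.0, 80.0, 1500.0),
            "right palm": (410.0, 300.0, 1200.0),
            "left palm": (230.0, 310.0, 1210.0)},
)
```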
  • the work management unit 12 has a function of managing the work performed by the worker H based on the captured image and the skeleton information output from the motion sensing processing unit 11. Details of the processing contents will be described later.
  • the expert image superimposition display unit 13 superimposes the expert's image on the work performed by the worker H, based on the captured image and the skeleton information output from the motion sensing processing unit 11, and thereby has a function of encouraging the worker H to learn the expert's skills. Details of the processing contents will be described later.
  • the work evaluation unit 14 has a function of numerically evaluating the work of the worker H based on the captured image and the skeleton information output from the motion sensing processing unit 11. Details of the processing contents will be described later.
  • information used for processing by the work management unit 12 and the other units is held as work information D1, pallet information D2, marker information D3, worker information D4, delay log information D5, error log information D6, expert work information D7, error range information D8, deviation information D9, model trajectory information D10, allowable range information D11, and evaluation information D12.
  • the work information D1 has items such as "product ID / product name", "process ID / process name", "work ID", "work title", "instruction image", "work instruction text", "corresponding pallet ID", "hand entry/exit to pallet", "right hand / left hand / both hands", "maximum/minimum start time", and "standard time / allowable time".
  • Product ID / Product name is information for identifying a product to be assembled.
  • the “process ID / process name” is information for identifying a process in which product assembly or the like is divided into several stages.
  • Work ID is information for identifying a work (procedure) obtained by subdividing a process. The work includes picking up an object, placing an object, assembling an object, and the like.
  • “Work title” is a title indicating an outline of the work.
  • the “instruction image” is an image showing details of work.
  • the “work instruction text” is a character string indicating details of the work.
  • the "corresponding pallet ID" is information indicating the pallet used for the work. When grouped by process, the pallets required in that process are specified.
  • “Hand entering / exiting the pallet” is information indicating whether or not the operator's hand entering / exiting the pallet is used to determine the operation.
  • “Separate right hand / left hand / both hands” is information indicating a worker's hand (right hand / left hand / both hands) designated to be entered / exited in the work.
  • note that position information for placing each pallet on the workbench can also be included in association with the work or process. The position of a pallet on the workbench is associated with the hand with which the worker enters and exits it, and is chosen so that the work can proceed optimally.
  • this position information can be displayed on the screen when the pallets are arranged on the workbench before product assembly starts, as a reference for their placement.
  • "maximum/minimum start time" is information indicating the maximum and minimum times allowed from the start of the work until the hand enters the pallet, for work that involves hand entry/exit to the pallet.
  • "standard time / allowable time" is information indicating, for work that involves hand entry/exit to the pallet, the standard time until the work is completed and the allowable times (one for running long and one for running short). Note that a series of works by the worker H is performed in units of processes, and the work information for one process can be called a "work scenario".
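  • as an illustration only, the work information D1 could be modeled as one record per work, a work scenario then being the ordered records of one process; the field names below paraphrase the items above and are not authoritative.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class WorkInfo:
    product_id: str
    process_id: str
    work_id: str
    work_title: str
    instruction_image: str      # path or ID of the instruction image
    work_instruction_text: str
    pallet_ids: List[str]       # corresponding pallet ID(s)
    uses_pallet_entry: bool     # whether hand entry/exit judges the work
    hand: Optional[str]         # "right", "left", or "both"
    start_max_time_s: float     # max time allowed until the hand enters
    start_min_time_s: float
    standard_time_s: float
    allowable_time_s: float

# A "work scenario" is the ordered work information of one process.
WorkScenario = List[WorkInfo]
```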
  • the pallet information D2 has items such as "pallet ID" and "corresponding marker ID". "Pallet ID" is information for identifying a pallet. The "corresponding marker ID" is information for identifying the markers attached to the pallet.
  • the marker information D3 has items such as “marker ID” and “pallet area designation”. “Marker ID” is information for identifying a marker. “Palette area designation” is information for designating an image area for detecting the entry / exit of a hand with reference to a marker.
  • the worker information D4 has items such as “worker ID” and “attribute”. “Worker ID” is information for identifying a worker. “Attribute” is information such as the name and affiliation of the worker.
  • the delay log information D5 includes items such as “log ID”, “date and time”, “worker ID”, “product ID / process ID / work ID”, and “delay content”.
  • Log ID is information for identifying delay log information.
  • Date / time is information indicating the occurrence date / time of an event recorded as delay log information.
  • the “worker ID” is information for identifying a worker who has performed work that is the target of the delay log information.
  • “Product ID / Process ID / Work ID” is information for identifying the product / process / work of the work that is the target of the delay log information.
  • the "delay content" is information indicating the content of the delay (for example, that the hand entry into the pallet was delayed after the start of the work, and by how long).
  • the error log information D6 has items such as “log ID”, “date and time”, “worker ID”, “product ID / process ID / work ID”, and “error content”.
  • Log ID is information for identifying error log information.
  • Date and time is information indicating the date and time of occurrence of an event recorded as error log information.
  • the “worker ID” is information for identifying a worker who has performed a work that is a target of error log information.
  • “Product ID / Process ID / Work ID” is information for identifying the product / process / work of the work that is the target of the error log information.
  • "error content" is information indicating the content of an error (such as putting a hand different from the designated hand into the pallet).
  • the expert work information D7 includes items such as “expert work information ID”, “video”, “depth”, “skeleton information”, and “frame / work correspondence information”.
  • the "expert work information ID" is information for identifying expert work information.
  • Video is information of a captured image of work of an expert.
  • Depth is information on the depth image of the work of an expert.
  • "skeleton information" is the skeleton information of the expert's work.
  • the “frame / work correspondence information” is information for associating a frame of a captured image of work of an expert with a target product / process / work.
  • the error range information D8 has items such as “error range information ID” and “error range”. “Error range information ID” is information for identifying error range information. “Error range” is information indicating an error range (for example, a radius centered on a reference point).
  • the deviation information D9 includes items such as “deviation information ID”, “date and time”, “worker ID”, “product ID / process ID / work ID”, and “deviation content”.
  • the “deviation information ID” is information for identifying deviation information.
  • “Date and time” is information indicating the date and time of occurrence of an event recorded as deviation information.
  • the “worker ID” is information for identifying a worker who has performed a work that is a target of deviation information.
  • “Product ID / Process ID / Work ID” is information for identifying the product / process / work of the work that is the target of the deviation information.
  • the “deviation content” is information indicating the content of the deviation.
  • the model locus information D10 has items such as “example information ID”, “frame number”, and “coordinate”.
  • the “example information ID” is information for identifying example trajectory information as a reference for evaluation.
  • the model trajectory information is in principle hand trajectory information, but if there is another body part whose motion is to be evaluated, the trajectory information about the part may be added.
  • “Frame number” is information indicating the frame number of an image (moving image) in which a locus is recorded.
  • "coordinates" is information indicating the coordinates of the trajectory (two-dimensional coordinates on the screen plus depth).
  • the allowable range information D11 includes items such as “allowable range information ID” and “allowable range”. “Allowable range information ID” is information for identifying allowable range information. “Allowable range” is information indicating an allowable range (for example, a radius centered on a reference point).
  • the evaluation information D12 includes items such as “evaluation information ID”, “date and time”, “worker ID”, “product ID / process ID / work ID”, and “evaluation value”.
  • “Evaluation information ID” is information for identifying evaluation information.
  • “Date and time” is information indicating the date and time of evaluation.
  • “Worker ID” is information for identifying a worker who has performed a work recorded as evaluation information.
  • “Product ID / Process ID / Work ID” is information for identifying the product / process / work of the work to be evaluated.
  • "evaluation value" is information indicating the contents of the evaluation.
  • FIG. 6 is a diagram illustrating a hardware configuration example of the information processing apparatus 1.
  • the information processing apparatus 1 includes a CPU (Central Processing Unit) 102, a ROM (Read Only Memory) 103, a RAM (Random Access Memory) 104, and an NVRAM (Non-Volatile Random Access Memory) 105, connected to a system bus 101.
  • the information processing apparatus 1 also includes an I/F (Interface) 106 and, connected to the I/F 106, an I/O (Input/Output Device) 107, an HDD (Hard Disk Drive)/flash memory 108, and a NIC (Network Interface Card) 109, as well as a monitor 110, a keyboard 111, a mouse 112, and the like connected to the I/O 107.
  • a CD / DVD (Compact Disk / Digital Versatile Disk) drive or the like can also be connected to the I / O 107.
  • when the display / speaker 3 also serves as the monitor, the monitor 110 is not necessary.
  • the functions of the information processing apparatus 1 described with reference to FIG. 3 are realized by the CPU 102 executing a predetermined program.
  • the program may be acquired via a recording medium or may be acquired via a network.
  • FIG. 7 is a flowchart illustrating an example of work management processing performed by the work management unit 12 of the information processing apparatus 1.
  • the work management unit 12 acquires a situation (product ID, process ID, pallet ID, worker ID) set in advance by an administrator or the like (step S101).
  • the situation is set based on the work scenario (work information D1) by the administrator or the worker via the user interface screen provided by the information processing apparatus 1 for each work table.
  • the situation specifies the product to be assembled (product ID), the assembly process (process ID), and the worker (worker ID).
  • the set situation data is held in a storage area in the information processing apparatus 1.
  • the work management unit 12 waits for a work start trigger (step S102).
  • the start trigger is assumed to be, for example, a start signal sent by the administrator based on time or the like, a start signal sent by the worker H by operating the information processing apparatus 1, or detection that the worker H has remained standing in front of the workbench for a predetermined time.
  • the work management unit 12 starts substantial processing (step S103).
  • the work management unit 12 acquires the next work ID from the work information D1 and displays the screen (step S104).
  • for work that does not involve the worker H's hand entering or leaving a pallet P, the process shifts to the next work after a predetermined time based on the standard time of the work; alternatively, the work content may be monitored by image processing or the like, and the process may proceed to the next work when the work is determined to be completed.
  • FIG. 8 is a diagram showing an example of the display screen.
  • the product number and the process number are displayed in the display area A1, the work title is displayed in the display area A2, and the titles of the works included in the current process are displayed as a list in the display area A3, with the current work highlighted as indicated by C1.
  • An instruction image is displayed in the display area A4, and a work instruction text is displayed in the display area A5.
  • the display area A6 is an area where error information is displayed, but no error information is displayed in the illustrated example.
  • B1 and B2 are buttons for moving the work forward and backward.
  • the work management unit 12 starts measurement from the initial value 0 for the first elapsed time (step S105).
  • the first elapsed time is used to judge the time from when the start of the work is instructed on the screen until the worker H starts some motion.
  • the work management unit 12 determines whether or not the worker H has entered the pallet associated with the work in the work scenario (work information D1) (step S106).
  • FIG. 9 is a flowchart showing an example of the process for determining hand entry to and exit from the pallet P; this process is executed in parallel with each work.
  • the work management unit 12 detects a marker M corresponding to the current work ID from the captured image (step S121), and specifies a pallet area (step S122).
  • the pallet area is as shown by the broken line in FIG. 2.
  • the work management unit 12 compares the three-dimensional position (two-dimensional coordinates on the captured image plus depth) of the hand (right hand / left hand) in the skeleton information with the three-dimensional position of the pallet area, and determines the detection status of whether the hand has entered (detection → non-detection, non-detection → detection) (step S123).
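  • a minimal sketch of this determination, under the same assumed coordinate convention as above: the hand's three-dimensional position is tested against the pallet area box, and a transition of the detection status signals entry or exit. All names are illustrative.

```python
from typing import Optional, Tuple

Point3D = Tuple[float, float, float]
Area = Tuple[Point3D, Point3D]  # (low corner, high corner) of the pallet area

def hand_in_area(hand: Point3D, area: Area) -> bool:
    """True if the hand's (x, y, depth) lies inside the pallet-area box."""
    low, high = area
    return all(low[i] <= hand[i] <= high[i] for i in range(3))

def update_detection(prev_inside: bool, hand: Point3D,
                     area: Area) -> Tuple[bool, Optional[str]]:
    """Return the new detection status and "entered"/"exited" on a transition."""
    inside = hand_in_area(hand, area)
    if inside and not prev_inside:
        return inside, "entered"   # non-detection -> detection
    if not inside and prev_inside:
        return inside, "exited"    # detection -> non-detection
    return inside, None
```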
  • the work management unit 12 determines whether the first elapsed time has exceeded a predetermined value (step S107). If it is determined that it has not (NO in step S107), the process returns to the determination of whether a hand has entered the pallet P (step S106). Note that hand entry/exit detection is performed for all the pallets on the workbench, and when a hand enters a pallet that is not associated with the current work, an error may be displayed on the screen and recorded.
  • when it is determined that the hand of the worker H has entered the predetermined pallet (YES in step S106), the work management unit 12 determines whether the hand matches the hand associated with the work (step S108). Even when the first elapsed time is shorter than the minimum time, an alert that the work was started too early may be displayed and recorded.
  • when the work management unit 12 determines that the hand matches the hand associated with the work (YES in step S108), it starts measuring a second elapsed time from an initial value of 0 (step S109).
  • the second elapsed time is for determining whether or not the time for putting the hand in the pallet P is appropriate.
  • the work management unit 12 determines whether or not the hand of the worker H has come out of the pallet P (step S110). For this determination, the hand detection situation described with reference to FIG. 9 is used.
  • when the work management unit 12 determines that the hand of the worker H has not come out of the pallet P (NO in step S110), it determines whether the second elapsed time is within the allowable range (step S111). If it is determined to be within the allowable range (YES in step S111), the process returns to the determination of whether the hand of the worker H has come out of the pallet P (step S110).
  • when the work management unit 12 determines that the hand of the worker H has come out of the pallet P (YES in step S110), it stops counting the second elapsed time (step S112) and returns to the screen display for the next work (step S104).
  • note that when it is determined that the hand of the worker H has come out of the pallet P (YES in step S110), an alert that the work was finished too early may be displayed and recorded even when the second elapsed time is shorter than the lower allowable time relative to the standard time of the work.
  • when the hand of the worker H has not entered the predetermined pallet (NO in step S106) and the work management unit 12 determines that the first elapsed time has exceeded the predetermined value (YES in step S107), it determines that the worker H is at a loss. In this case, the work management unit 12 records delay log information D5, displays the delay information on the screen (step S113), resets the first elapsed time (step S114), and returns to the determination of whether a hand has entered the pallet P (step S106).
  • FIG. 10 is a diagram showing an example of the display screen, and shows a state in which an alert “Error: This operation is not performed” is displayed in the display area A6.
  • when the work management unit 12 determines that the hand of the worker H has entered the predetermined pallet (YES in step S106) but that the hand does not match the hand associated with the work (NO in step S108), it determines that a work error has occurred. In this case, the work management unit 12 records error log information D6 (step S115), displays the error information on the screen (step S116), resets the first elapsed time (step S117), and returns to the determination of whether a hand has entered the pallet P (step S106).
  • FIG. 11 is a diagram showing an example of the display screen, and shows a state in which an alert “Error: using right hand” is displayed in the display area A6.
  • when the work management unit 12 determines that the hand of the worker H has not come out of the pallet P (NO in step S110) and that the second elapsed time is not within the allowable range (NO in step S111), it records delay log information D5 as a delay, displays the delay information on the screen (step S119), and proceeds to the reset of the first elapsed time (step S117).
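  • pulling the above steps together, the monitoring loop for one work could be sketched as follows; it loosely follows the flowchart of FIG. 7 and builds on the hypothetical WorkInfo record above, with pallet_status() and ui being assumed interfaces rather than anything defined in the patent.

```python
import time

def monitor_work(work, pallet_status, ui) -> None:
    """Loose sketch of one work's monitoring loop (FIG. 7, steps S105-S119).

    `work` is a WorkInfo-like record; `pallet_status()` is assumed to return
    (entered, hand) for the work's pallet; `ui` is assumed to offer
    show_delay()/show_error(). Illustrative only, not the patented flow.
    """
    first_start = time.monotonic()                      # S105: first elapsed time
    while True:
        time.sleep(0.03)                                # roughly one frame
        entered, hand = pallet_status()                 # S106: hand in pallet?
        if entered:
            if hand != work.hand:                       # S108 NO: wrong hand
                ui.show_error("wrong hand used")        # S115-S116
                first_start = time.monotonic()          # S117
                continue
            second_start = time.monotonic()             # S109: second elapsed time
            while pallet_status()[0]:                   # S110: hand still inside
                if time.monotonic() - second_start > work.allowable_time_s:
                    ui.show_delay("work taking too long")   # S119
                    first_start = time.monotonic()          # S117
                    break
            else:
                return                                  # S112: hand out, next work
        elif time.monotonic() - first_start > work.start_max_time_s:  # S107
            ui.show_delay("no action started")          # S113
            first_start = time.monotonic()              # S114
```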
  • FIG. 12 is a flowchart illustrating a processing example of expert image superimposition display by the expert image superimposition display unit 13 of the information processing apparatus 1.
  • the expert image superimposition display process may be performed independently, or in parallel with the work management process illustrated in FIG. 7.
  • the expert image superimposition display unit 13 acquires a situation (step S21), detects a work start trigger, and starts substantial processing (step S22). This corresponds to steps S101 to S103 in FIG. 7; however, when there are a plurality of pieces of expert work information and error range information to be superimposed, information specifying which to use for processing (an expert work information ID and an error range information ID) is added to the situation.
  • when performed in parallel with the process of FIG. 7, the process starts from step S23 after step S103 of FIG. 7. In step S103 of FIG. 7, the first work of each process is started, and the next work is started when the previous work is determined to have been completed by motion detection (such as the hand coming out of the pallet).
  • the expert image superimposing display unit 13 acquires expert work information D7 and work error range information D8 (step S23).
  • the expert image superimposition display unit 13 adjusts the trunk axis between the expert work information D7 and the worker's work information (captured image and skeleton information) (step S24).
  • FIG. 13 is a flowchart showing a processing example of adjustment of the trunk axis (step S24 in FIG. 12).
  • the expert image superimposing display unit 13 acquires the trunk axis of the expert on the two-dimensional coordinates from the expert work information D7 (step S241).
  • the expert image overlay display unit 13 acquires the operator's trunk axis on the two-dimensional coordinates (step S242).
  • FIG. 14 is a diagram showing an example of acquiring a trunk axis. For each of the worker H, indicated by a solid line, and the expert SH, indicated by a broken line, the line passing through the head, the shoulder center, the spine center, and the hip center is acquired as the trunk axis.
  • the expert image superimposition display unit 13 calculates the deviation between the two trunk axes (step S243) and rotates the displayed image of the expert so as to correct the deviation (step S244).
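  • a minimal sketch of this adjustment, assuming two-dimensional joint positions from the skeleton information; the line through the head, shoulder center, spine center, and hip center is approximated here by its head-to-hip-center direction, and all names are illustrative.

```python
import math
from typing import Dict, Tuple

Joint2D = Tuple[float, float]

def trunk_axis_angle(joints: Dict[str, Joint2D]) -> float:
    """Angle (radians) of the trunk axis, taken as the hip-center-to-head line."""
    hx, hy = joints["head"]
    bx, by = joints["hip center"]
    return math.atan2(hy - by, hx - bx)

def expert_rotation_deg(worker: Dict[str, Joint2D],
                        expert: Dict[str, Joint2D]) -> float:
    """Rotation to apply to the expert's image so the trunk axes align."""
    return math.degrees(trunk_axis_angle(worker) - trunk_axis_angle(expert))
```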
  • FIG. 15 is a diagram illustrating a screen example in which the image of the expert SH is superimposed on the image of the worker H.
  • the image of the expert SH is displayed with high transparency so as not to hinder the visibility of the image of the worker H. Since the trunk axes are adjusted to coincide as closely as possible, differences in the parts to watch, such as hand movements, are easy to recognize even when there is a difference in standing position or the like.
  • the expert image overlay display unit 13 adjusts the timing with respect to the expert work information D7 and the worker work information (step S25).
  • FIG. 16 is a flowchart showing a processing example of timing adjustment (step S25 in FIG. 12).
  • the expert image overlay display unit 13 compares the current work IDs of the worker and the skilled worker (step S251), and determines whether or not the same work is being performed based on the work ID (step S252).
  • if it is determined that the same work is being performed (YES in step S252), the timing adjustment ends.
  • if not (NO in step S252), the expert image superimposition display unit 13 determines from the order of the work IDs whether the expert is ahead (step S253).
  • if it is determined that the expert is ahead (YES in step S253), the expert image superimposition display unit 13 pauses the playback of the expert work video to delay the timing (step S254), and the process returns to the acquisition of the work IDs (step S251).
  • when it is determined that the expert is not ahead (the worker is ahead) (NO in step S253), the expert image superimposition display unit 13 advances the playback of the expert work video (step S255), and the process returns to the acquisition of the work IDs (step S251).
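  • the timing adjustment amounts to a small synchronization rule over video playback; a hedged sketch follows, in which the player interface (pause, fast_forward) is hypothetical.

```python
from typing import List

def adjust_timing(worker_work_id: str, expert_work_id: str,
                  work_order: List[str], expert_video) -> None:
    """Keep the expert video on the same work as the worker (steps S251-S255).

    `work_order` is the ordered list of work IDs in the scenario; `expert_video`
    is any player object offering pause() and fast_forward() (assumed API).
    """
    if worker_work_id == expert_work_id:
        return  # same work: nothing to adjust (YES in S252)
    if work_order.index(expert_work_id) > work_order.index(worker_work_id):
        expert_video.pause()          # expert ahead: delay the timing (S254)
    else:
        expert_video.fast_forward()   # worker ahead: advance the playback (S255)
```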
  • the expert image overlay display unit 13 evaluates the degree of deviation of the expert work information D7 and the worker work information (step S26).
  • FIG. 17 is a flowchart showing an example of processing for evaluating the deviation (step S26 in FIG. 12).
  • the expert image superimposing display unit 13 acquires the joint positions of the expert and the worker (step S261).
  • FIG. 18 is a diagram showing an example of application of the error range.
  • a sphere of a given radius centered on the position a of the expert's joint is set as the error range, and it is determined whether the corresponding joint position of the worker is included in it. Position b is inside the sphere and is therefore determined to be within the error range; position c is outside the sphere and is therefore determined to exceed the error range.
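  • the error range test of FIG. 18 reduces to a Euclidean distance check; a sketch with an illustrative radius:

```python
import math

def within_error_range(expert_joint, worker_joint, radius: float) -> bool:
    """True if the worker's joint lies inside the sphere of `radius`
    centered on the expert's joint position (x, y, depth)."""
    return math.dist(expert_joint, worker_joint) <= radius  # Python 3.8+

a = (100.0, 200.0, 1000.0)                                   # expert joint
print(within_error_range(a, (104.0, 203.0, 1005.0), 20.0))   # True  (position b)
print(within_error_range(a, (150.0, 260.0, 1100.0), 20.0))   # False (position c)
```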
  • when a joint is determined to exceed the error range, the expert image superimposition display unit 13 marks the frame (moving image frame) as a deviated frame (step S264). Then, the expert image superimposition display unit 13 highlights the range between the deviated joint and the adjacent joint (step S265).
  • FIG. 19 is a diagram showing an example of displaying or reporting a deviation. Since the positions of the right hand and the left hand deviate greatly from those of the expert, the hand joints are highlighted (indicated by bold lines in the figure), and information about the deviation is displayed. The display may be made in real time, and the information may also be saved for later reporting.
  • the expert image superimposition display unit 13 stores the information on the deviation (step S27), and when the work is not finished (NO in step S28), the process returns to the adjustment of the trunk axis (step S24). If the work is finished (YES in step S28), the process ends. Note that when the information on the deviation is stored (step S27), the superimposed moving image for the deviation period may be cut out and stored separately so that it can be played back later.
  • note that instead of being performed in real time, the process may be performed based on already-recorded captured images and skeleton information of the worker.
  • FIG. 20 is a flowchart illustrating an example of work evaluation processing performed by the work evaluation unit 14 of the information processing apparatus 1.
  • the work evaluation process may be performed independently, or may be performed in parallel with the work management process illustrated in FIG. 7 and the expert image superimposed display process illustrated in FIG.
  • when performed independently, the work evaluation unit 14 acquires a situation (step S31), detects a work start trigger, and starts substantial processing (step S32).
  • this corresponds to steps S101 to S103 in FIG. 7; however, when there are a plurality of pieces of model trajectory information and allowable range information serving as evaluation criteria, information specifying which to use for processing (a model trajectory information ID and an allowable range information ID) is added to the situation.
  • when performed in parallel with the process of FIG. 7, the process starts from step S33 after step S103 of FIG. 7.
  • in step S103 of FIG. 7, the first work of each process is started, and the next work is started when the previous work is determined to have been completed by motion detection (such as the hand coming out of the pallet).
  • the work evaluation unit 14 acquires the model trajectory information D10 specified in the situation (step S33), and acquires the trajectory information of the worker (step S34).
  • the model trajectory information is, in principle, hand trajectory information, and the worker's trajectory compared against it is likewise hand trajectory information. If there are other body parts whose motion should be evaluated, trajectory information for those parts may be added; for example, for parts such as the head, shoulders, and back, whether the posture is appropriate can be evaluated from the deviation from the model trajectory.
  • the work evaluation unit 14 determines the degree of coincidence between the trace information of the model and the trace information of the worker (Step S35).
  • FIG. 21 is a flowchart showing a processing example of determination of the degree of coincidence of trajectories (step S35 in FIG. 20).
  • the work evaluation unit 14 compares the model trajectory information with the joint position of the worker (step S351).
  • FIG. 22 is a diagram showing an example of applying the allowable range: a sphere of a given radius centered on the position d in each frame of the model trajectory information is set, and it is determined whether the worker's trajectory position is included in it.
  • when the work is not finished (NO in step S36), the process returns to the determination of the degree of coincidence between the model trajectory information and the worker's trajectory information (step S35).
  • when the work is finished (YES in step S36), the work evaluation unit 14 calculates the degree of trajectory mismatch (step S37).
  • FIG. 23 is a flowchart showing a processing example of the calculation of the degree of discrepancy (step S37 in FIG. 20).
  • the work evaluation unit 14 calculates the degree of mismatch as (number of unmatched frames) / (number of frames of the entire work) (step S371). The number of unmatched frames is the number of frames marked as mismatched.
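  • the calculation of step S371 is a single ratio; a minimal sketch:

```python
from typing import List

def mismatch_degree(unmatched_flags: List[bool]) -> float:
    """Degree of mismatch: unmatched frames / frames of the entire work.

    `unmatched_flags` holds one bool per frame of the work, True when the
    frame was marked as mismatched in the coincidence determination.
    """
    if not unmatched_flags:
        return 0.0
    return sum(unmatched_flags) / len(unmatched_flags)

print(mismatch_degree([False, False, True, True, False]))  # 0.4
```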
  • alternatively, a tube-shaped area reflecting the allowable range may be set around the model trajectory, and the evaluation may be based on whether the worker's trajectory is included in the area, without considering the time required for the movement (see the sketch below). Further, the time axis of either the model trajectory or the worker's trajectory may be stretched or compressed, and the evaluation may consider both positional coincidence and temporal coincidence.
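  • a sketch of the tube-shaped variant: each worker trajectory point is accepted if it lies within the allowable radius of any point on the model trajectory, which ignores the time taken for the movement. Names are illustrative.

```python
import math
from typing import List, Tuple

Point3D = Tuple[float, float, float]

def tube_match_ratio(model: List[Point3D], worker: List[Point3D],
                     radius: float) -> float:
    """Fraction of worker trajectory points inside the tube of `radius`
    around the model trajectory (a time-independent evaluation)."""
    if not worker:
        return 0.0
    def in_tube(p: Point3D) -> bool:
        return any(math.dist(p, q) <= radius for q in model)
    return sum(in_tube(p) for p in worker) / len(worker)
```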
  • the work evaluation unit 14 stores the calculated degree of mismatch (step S38), and ends the process.
  • the stored degree of mismatch is reported as worker evaluation information. Evaluation of posture is useful for guidance on performing the work safely.

Abstract

In order to improve the effectiveness of learning a task, in a method for displaying a task state imaged using an imaging device, a computer executes processing in which a completed task among the tasks included in a prescribed task scenario is identified in accordance with a motion detected on the basis of a captured image, and a stored image associated with the task specified in the task scenario as the task following the completed task is displayed superimposed on an image captured after the motion was detected.

Description

Display method, monitor result output method, information processing apparatus, and display program
The present invention relates to a display method, a monitor result output method, an information processing apparatus, and a display program.
Product assembly is partly automated, but it is still often performed by people (workers). From the viewpoint of maintaining product quality and ensuring worker safety, monitoring that work is performed accurately and enabling workers to learn the work efficiently are important themes.
Conventionally, a technique for making a person aware of the correct motion by displaying a model image and an actual image side by side has been known (see, for example, Non-Patent Documents 1 and 2).
Because the conventional technique described above displays the model image and the actual image side by side, it is difficult to recognize subtle differences such as in hand movements and body posture, and it has not been sufficient for learning the work.
Therefore, in one aspect, an object is to enhance the effectiveness of learning the work.
In one embodiment, in a method for displaying a work situation imaged using an imaging device, a computer executes a process of identifying a completed work among the works included in a predetermined work scenario according to a motion detected based on a captured image, and displaying an image stored in association with the work specified in the work scenario as the work following the completed work, superimposed on the captured image after the motion is detected.
The effectiveness of learning the work can thereby be enhanced.
FIG. 1 is a diagram illustrating an example of the appearance of a system according to an embodiment.
FIG. 2 is a diagram illustrating an example of the appearance of a pallet.
FIG. 3 is a diagram illustrating a software configuration example of an information processing apparatus.
FIG. 4 is a diagram (part 1) illustrating data structure examples of various information.
FIG. 5 is a diagram (part 2) illustrating data structure examples of various information.
FIG. 6 is a diagram illustrating a hardware configuration example of the information processing apparatus.
FIG. 7 is a flowchart illustrating a processing example of work management.
FIG. 8 is a diagram (part 1) illustrating an example of a display screen.
FIG. 9 is a flowchart illustrating a processing example of determining hand entry to and exit from a pallet.
FIG. 10 is a diagram (part 2) illustrating an example of a display screen.
FIG. 11 is a diagram (part 3) illustrating an example of a display screen.
FIG. 12 is a flowchart illustrating a processing example of expert image superimposition display.
FIG. 13 is a flowchart illustrating a processing example of trunk axis adjustment.
FIG. 14 is a diagram illustrating an example of acquiring a trunk axis.
FIG. 15 is a diagram illustrating a screen example in which an expert's image is superimposed on a worker's image.
FIG. 16 is a flowchart illustrating a processing example of timing adjustment.
FIG. 17 is a flowchart illustrating a processing example of deviation evaluation.
FIG. 18 is a diagram illustrating an example of applying an error range.
FIG. 19 is a diagram illustrating an example of displaying or reporting a deviation.
FIG. 20 is a flowchart illustrating a processing example of work evaluation.
FIG. 21 is a flowchart illustrating a processing example of determining the degree of coincidence of trajectories.
FIG. 22 is a diagram illustrating an example of applying an allowable range.
FIG. 23 is a flowchart illustrating a processing example of calculating the degree of trajectory mismatch.
Hereinafter, preferred embodiments of the present invention will be described.
<Configuration>
FIG. 1 is a diagram illustrating an example of the appearance of a system according to an embodiment. In FIG. 1, a motion sensing device 2 and a display/speaker 3 are connected to an information processing device 1 such as a PC (Personal Computer). The motion sensing device 2 captures images with a plurality of pallets P arranged on a workbench and the worker H in its field of view, and detects the movement (motion) of the worker H's hands and body parts (skeletal elements such as joints). The pallets P accommodate parts and tools used for product assembly. In the illustrated example, the motion sensing device 2 is arranged above and in front of the worker H, but its placement is not limited as long as the pallets P and the worker H can be kept in view. The display/speaker 3 displays work instructions and alerts to the worker H and outputs sound as necessary. The display/speaker 3 may be provided with a touch panel.
FIG. 2 is a diagram illustrating an example of the appearance of the pallet P. Markers (reference objects) M containing code information are affixed at positions, such as the four corners, that are easily imaged by the motion sensing device 2. Each marker M is associated with information designating an image area (a three-dimensional image area including depth information) used to determine whether the hand of the worker H has entered the corresponding pallet P, and an image area (pallet area), indicated by the broken line, is set with the marker M as a reference. When the four corners of the pallet P are used as the reference, at least two markers M per pallet P need to be captured by the motion sensing device 2; however, if the orientation can be recognized from a marker M, the image area can also be set from a single marker M.
Because the pallet area is automatically set based on the markers M, the pallet P need not be placed at an exact position, and a slight positional shift does not affect motion detection.
FIG. 3 is a diagram illustrating a software configuration example of the information processing apparatus 1. In FIG. 3, the information processing apparatus 1 includes a motion sensing processing unit 11, a work management unit 12, an expert image superimposition display unit 13, and a work evaluation unit 14. The motion sensing processing unit 11 has a function of computing, from the data stream of the captured image (color video) and the depth image (distance-from-viewpoint information associated with each pixel of the captured image) output by the motion sensing device 2, skeleton information (joint information) that includes the positions of human body parts such as joints, in addition to the captured image itself. As skeleton information, position information for parts such as "head", "shoulder center", "left shoulder", "right shoulder", "left elbow", "left wrist", "left palm", "right elbow", "right wrist", "right palm", "spine center", "hip center", "base of the left leg", "base of the right leg", "left knee", "left ankle", "right knee", "right ankle", "left toe", and "right toe" can be obtained in real time. Note that when the function of the motion sensing processing unit 11 is included in the motion sensing device 2, the motion sensing processing unit 11 need not be provided in the information processing device 1.
The work management unit 12 has a function of managing the work performed by the worker H based on the captured image and the skeleton information output from the motion sensing processing unit 11; details of the processing are described later. The expert image superimposition display unit 13 has a function of superimposing the expert's image on the work performed by the worker H, based on the captured image and the skeleton information output from the motion sensing processing unit 11, thereby encouraging the worker H to learn the expert's skills; details of the processing are described later. The work evaluation unit 14 has a function of numerically evaluating the work of the worker H based on the captured image and the skeleton information output from the motion sensing processing unit 11; details of the processing are described later.
Information used for processing by the work management unit 12 and the other units is held as work information D1, pallet information D2, marker information D3, worker information D4, delay log information D5, error log information D6, expert work information D7, error range information D8, deviation information D9, model trajectory information D10, allowable range information D11, and evaluation information D12.
FIGS. 4 and 5 are diagrams illustrating data structure examples of various information. In FIG. 4, the work information D1 has items such as "product ID / product name", "process ID / process name", "work ID", "work title", "instruction image", "work instruction text", "corresponding pallet ID", "hand entry/exit to pallet", "right hand / left hand / both hands", "maximum/minimum start time", and "standard time / allowable time". "Product ID / product name" is information for identifying a product to be assembled. "Process ID / process name" is information for identifying a process into which product assembly or the like is divided in several stages. "Work ID" is information for identifying a work (procedure) obtained by subdividing a process. Works include picking up an object, placing an object, assembling an object, and the like.
"Work title" is a title indicating an outline of the work. The "instruction image" is an image showing details of the work. The "work instruction text" is a character string indicating details of the work. The "corresponding pallet ID" is information indicating the pallet used for the work; when grouped by process, the pallets required in that process are specified. "Hand entry/exit to pallet" is information indicating whether the entry/exit of the worker's hand to the pallet is used to judge the work. "Right hand / left hand / both hands" is information indicating the worker's hand (right hand, left hand, or both hands) designated to enter and exit in the work. Note that position information for placing each pallet on the workbench can also be included in association with the work or process. The position of a pallet on the workbench is associated with the hand with which the worker enters and exits it, and is chosen so that the work can proceed optimally. This position information can be displayed on the screen when the pallets are arranged on the workbench before product assembly starts, as a reference for their placement.
"Maximum/minimum start time" is information indicating the maximum and minimum times allowed from the start of the work until the hand enters the pallet, for work that involves hand entry/exit to the pallet. "Standard time / allowable time" is information indicating, for work that involves hand entry/exit to the pallet, the standard time until the work is completed and the allowable times (one for running long and one for running short). Note that a series of works by the worker H is performed in units of processes, and the work information for one process can be called a "work scenario".
The pallet information D2 has items such as "pallet ID" and "corresponding marker ID". "Pallet ID" is information for identifying a pallet. The "corresponding marker ID" is information for identifying the markers attached to the pallet.
The marker information D3 has items such as "marker ID" and "pallet area designation". "Marker ID" is information for identifying a marker. "Pallet area designation" is information for designating an image area for detecting hand entry/exit with the marker as a reference.
The worker information D4 has items such as "worker ID" and "attribute". "Worker ID" is information for identifying a worker. "Attribute" is information such as the name and affiliation of the worker.
The delay log information D5 has items such as "log ID", "date and time", "worker ID", "product ID / process ID / work ID", and "delay content". "Log ID" is information for identifying delay log information. "Date and time" is information indicating the occurrence date and time of an event recorded as delay log information. "Worker ID" is information for identifying the worker who performed the work that is the target of the delay log information. "Product ID / process ID / work ID" is information for identifying the product, process, and work of the work that is the target of the delay log information. The "delay content" is information indicating the content of the delay (for example, that the hand entry into the pallet was delayed after the start of the work, and by how long).
The error log information D6 has items such as "log ID", "date and time", "worker ID", "product ID / process ID / work ID", and "error content". "Log ID" is information for identifying error log information. "Date and time" is information indicating the occurrence date and time of an event recorded as error log information. "Worker ID" is information for identifying the worker who performed the work that is the target of the error log information. "Product ID / process ID / work ID" is information for identifying the product, process, and work of the work that is the target of the error log information. "Error content" is information indicating the content of an error (such as putting a hand different from the designated hand into the pallet).
 図5において、熟練者作業情報D7は、「熟練者作業情報ID」「映像」「深度」「骨格情報」「フレーム・作業対応情報」等の項目を有している。「熟練者作業情報ID」は、熟練者作業情報を識別する情報である。「映像」は、熟練者の作業の撮像画像の情報である。「深度」は、熟練者の作業の深度画像の情報である。「骨格情報」は、熟練者の作業の骨格情報である。「フレーム・作業対応情報」は、熟練者の作業の撮像画像のフレームと対象となる製品・工程・作業を対応付ける情報である。 5, the expert work information D7 includes items such as “expert work information ID”, “video”, “depth”, “skeleton information”, and “frame / work correspondence information”. The “skilled worker work information ID” is information for identifying skilled worker work information. “Video” is information of a captured image of work of an expert. “Depth” is information on the depth image of the work of an expert. “Skeleton information” is skeleton information of the work of an expert. The “frame / work correspondence information” is information for associating a frame of a captured image of work of an expert with a target product / process / work.
 誤差範囲情報D8は、「誤差範囲情報ID」「誤差範囲」等の項目を有している。「誤差範囲情報ID」は、誤差範囲情報を識別する情報である。「誤差範囲」は、誤差範囲(例えば、基準となる点を中心とする半径)を示す情報である。 The error range information D8 has items such as “error range information ID” and “error range”. “Error range information ID” is information for identifying error range information. “Error range” is information indicating an error range (for example, a radius centered on a reference point).
 ずれ情報D9は、「ずれ情報ID」「日時」「作業者ID」「製品ID/工程ID/作業ID」「ずれ内容」等の項目を有している。「ずれ情報ID」は、ずれ情報を識別する情報である。「日時」は、ずれ情報として記録される事象の発生日時を示す情報である。「作業者ID」は、ずれ情報の対象となる作業を行った作業者を識別する情報である。「製品ID/工程ID/作業ID」は、ずれ情報の対象となる作業の製品・工程・作業を識別する情報である。「ずれ内容」は、ずれの内容を示す情報である。 The deviation information D9 includes items such as “deviation information ID”, “date and time”, “worker ID”, “product ID / process ID / work ID”, and “deviation content”. The “deviation information ID” is information for identifying deviation information. “Date and time” is information indicating the date and time of occurrence of an event recorded as deviation information. The “worker ID” is information for identifying a worker who has performed a work that is a target of deviation information. “Product ID / Process ID / Work ID” is information for identifying the product / process / work of the work that is the target of the deviation information. The “deviation content” is information indicating the content of the deviation.
 手本軌跡情報D10は、「手本情報ID」「フレーム番号」「座標」等の項目を有している。「手本情報ID」は、評価の基準となる手本軌跡情報を識別する情報である。手本軌跡情報は、原則として手の軌跡情報であるが、他に動作を評価すべき体の部位がある場合には、その部位についての軌跡情報を加えてもよい。「フレーム番号」は、軌跡の記録された画像(動画像)のフレーム番号を示す情報である。「座標」は、軌跡の座標(画面上の2次元座標と深度)を示す情報である。 The model locus information D10 has items such as “example information ID”, “frame number”, and “coordinate”. The “example information ID” is information for identifying example trajectory information as a reference for evaluation. The model trajectory information is in principle hand trajectory information, but if there is another body part whose motion is to be evaluated, the trajectory information about the part may be added. “Frame number” is information indicating the frame number of an image (moving image) in which a locus is recorded. “Coordinates” is information indicating the coordinates of the locus (two-dimensional coordinates and depth on the screen).
 許容範囲情報D11は、「許容範囲情報ID」「許容範囲」等の項目を有している。「許容範囲情報ID」は、許容範囲情報を識別する情報である。「許容範囲」は、許容範囲(例えば、基準となる点を中心とする半径)を示す情報である。 The allowable range information D11 includes items such as “allowable range information ID” and “allowable range”. “Allowable range information ID” is information for identifying allowable range information. “Allowable range” is information indicating an allowable range (for example, a radius centered on a reference point).
 評価情報D12は、「評価情報ID」「日時」「作業者ID」「製品ID/工程ID/作業ID」「評価値」等の項目を有している。「評価情報ID」は、評価情報を識別する情報である。「日時」は、評価を行った日時を示す情報である。「作業者ID」は、評価情報として記録される作業を行った作業者を識別する情報である。「製品ID/工程ID/作業ID」は、評価情報の対象となる作業の製品・工程・作業を識別する情報である。「評価値」は、評価の内容を示す情報である。 The evaluation information D12 includes items such as “evaluation information ID”, “date and time”, “worker ID”, “product ID / process ID / work ID”, and “evaluation value”. “Evaluation information ID” is information for identifying evaluation information. “Date and time” is information indicating the date and time of evaluation. “Worker ID” is information for identifying a worker who has performed a work recorded as evaluation information. “Product ID / Process ID / Work ID” is information for identifying the product / process / work of the work to be evaluated. “Evaluation value” is information indicating the contents of evaluation.
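 As one way to picture these records in code, the delay log information D5 and the error log information D6 can be modeled as simple typed records. The following is a minimal sketch; the field types and class names are assumptions for illustration, since the embodiment does not prescribe a storage format.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class DelayLogEntry:            # delay log information D5
        log_id: str                 # identifies the log entry
        timestamp: datetime         # when the logged event occurred
        worker_id: str              # worker who performed the work
        product_id: str             # product / process / work concerned
        process_id: str
        work_id: str
        delay_content: str          # e.g. "hand entered the pallet 12 s late"

    @dataclass
    class ErrorLogEntry:            # error log information D6 (same keys)
        log_id: str
        timestamp: datetime
        worker_id: str
        product_id: str
        process_id: str
        work_id: str
        error_content: str          # e.g. "right hand used instead of the left"

 The other records (D7 to D12) follow the same pattern, with their respective items as fields.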
 FIG. 6 is a diagram illustrating a hardware configuration example of the information processing apparatus 1. In FIG. 6, the information processing apparatus 1 includes a CPU (Central Processing Unit) 102, a ROM (Read Only Memory) 103, a RAM (Random Access Memory) 104, and an NVRAM (Non-Volatile Random Access Memory) 105 connected to a system bus 101. The information processing apparatus 1 further includes an I/F (Interface) 106; an I/O (Input/Output Device) 107, an HDD (Hard Disk Drive)/flash memory 108, and a NIC (Network Interface Card) 109 connected to the I/F 106; and a monitor 110, a keyboard 111, a mouse 112, and the like connected to the I/O 107. A CD/DVD (Compact Disk/Digital Versatile Disk) drive or the like can also be connected to the I/O 107. When the display/speaker 3 also serves as a monitor, the monitor 110 may be omitted.
 The functions of the information processing apparatus 1 described with reference to FIG. 3 are realized by the CPU 102 executing a predetermined program. The program may be acquired via a recording medium or via a network.
 <Example of work management processing>
 FIG. 7 is a flowchart illustrating an example of work management processing performed by the work management unit 12 of the information processing apparatus 1. In FIG. 7, when the processing starts, the work management unit 12 acquires a situation (product ID, process ID, pallet ID, worker ID) set in advance by an administrator or the like (step S101). The situation is set for each workbench by the administrator or the worker, based on the work scenario (work information D1), via a user interface screen provided by the information processing apparatus 1. At this time, the product to be assembled (product ID), the assembly process (process ID), and the worker (worker ID) are set. Once the process is determined, the pallet (pallet ID) used in the assembly is uniquely determined. The set situation data is held in a storage area in the information processing apparatus 1.
 Next, the work management unit 12 waits for a work start trigger (step S102). The work start trigger is, for example, that the administrator sends a start signal based on the time or the like, that the worker H operates the information processing apparatus 1 to send a start signal, or that the worker H is detected as having stood still in front of the workbench for a predetermined time.
 When the work start trigger is detected, the work management unit 12 starts the substantive processing (step S103).
 The work management unit 12 acquires the next work ID from the work information D1 and updates the screen display (step S104). For a work that does not involve the worker H's hand entering or leaving the pallet P, the processing moves on to the next work after a predetermined time that takes the standard time of the work into account. Alternatively, even for such a work, the work content may be monitored by image processing or the like, and the processing may move on to the next work when the work is determined to be complete.
 FIG. 8 is a diagram illustrating an example of the display screen. The product number and the process number are displayed in a display area A1, the work title is displayed in a display area A2, and the work titles included in the current process are listed in a display area A3, with the current work highlighted as indicated by C1. An instruction image is displayed in a display area A4, and a work instruction text is displayed in a display area A5. A display area A6 is an area in which error information is displayed; in the illustrated example, no error information is displayed. B1 and B2 are buttons for moving the work forward and backward.
 Returning to FIG. 7, the work management unit 12 starts measuring a first elapsed time from an initial value of 0 (step S105). The first elapsed time is used to judge how long it takes the worker H to start some operation after the screen instructs the start of the work.
 Next, the work management unit 12 determines whether the hand of the worker H has entered the pallet associated with the work in the work scenario (work information D1) (step S106).
 FIG. 9 is a flowchart illustrating an example of processing for determining the entry and exit of a hand into and out of the pallet P; this processing is executed in parallel with each work. In FIG. 9, the work management unit 12 detects the marker M corresponding to the current work ID from the captured image (step S121) and specifies the pallet area (step S122). The pallet area is the area indicated by the broken line in FIG. 2.
 Next, in FIG. 9, the work management unit 12 compares the three-dimensional position (the two-dimensional coordinates in the captured image and the depth) of the skeleton information of each hand (right hand and left hand) with the three-dimensional position of the pallet area, and determines from the change in detection status (detected to not detected, not detected to detected) whether a hand has entered or left the pallet P (step S123).
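 The determination in step S123 reduces to testing whether a three-dimensional point lies inside a box-shaped region and watching how that test result changes between frames. A minimal sketch follows, assuming the pallet area has been resolved to axis-aligned bounds in image coordinates plus depth; the class and function names are illustrative, not part of the embodiment.

    from dataclasses import dataclass

    @dataclass
    class PalletArea:
        # Assumed axis-aligned bounds of the pallet region:
        # x/y in captured-image pixels, z as sensor depth.
        x_min: float; x_max: float
        y_min: float; y_max: float
        z_min: float; z_max: float

        def contains(self, x: float, y: float, z: float) -> bool:
            return (self.x_min <= x <= self.x_max and
                    self.y_min <= y <= self.y_max and
                    self.z_min <= z <= self.z_max)

    def update_hand_status(area: PalletArea, hand_xyz, was_inside: bool):
        """Return (is_inside, event); event is 'entered', 'left', or None."""
        is_inside = area.contains(*hand_xyz)
        if is_inside and not was_inside:
            return True, "entered"     # not detected -> detected
        if was_inside and not is_inside:
            return False, "left"       # detected -> not detected
        return is_inside, None

 Running this per frame for each hand yields exactly the detected/not-detected transitions used in the subsequent steps.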
 Returning to FIG. 7, when the work management unit 12 determines that the hand of the worker H has not entered the predetermined pallet (NO in step S106), it determines whether the first elapsed time has exceeded a predetermined value (step S107). When it determines that the time has not been exceeded (NO in step S107), the processing returns to the determination of whether a hand has entered the pallet P (step S106). Entry and exit of hands may be detected for all pallets on the workbench, and when a hand enters a pallet that is not associated with the current work, an error may be displayed on the screen and recorded.
 When the work management unit 12 determines that the hand of the worker H has entered the pallet associated with the work (YES in step S106), it determines whether the hand matches the hand associated with the work (step S108). When it is determined that the hand of the worker H has entered the predetermined pallet (YES in step S106) but the first elapsed time is shorter than a minimum time, an alert indicating that the work has started too early may also be displayed and recorded.
 When the work management unit 12 determines that the hand matches the hand associated with the work (YES in step S108), it starts measuring a second elapsed time from an initial value of 0 (step S109). The second elapsed time is used to judge whether the time during which the hand stays in the pallet P is appropriate.
 Next, the work management unit 12 determines whether the hand of the worker H has left the pallet P (step S110). The hand detection status described with reference to FIG. 9 is used for this determination.
 In FIG. 7, when the work management unit 12 determines that the hand of the worker H has not left the pallet P (NO in step S110), it determines whether the second elapsed time is within an allowable range (step S111). When it determines that the time is within the allowable range (YES in step S111), the processing returns to the determination of whether the hand of the worker H has left the pallet P (step S110).
 When the work management unit 12 determines that the hand of the worker H has left the pallet P (YES in step S110), it stops measuring the second elapsed time (step S112) and returns to the screen display for the next work (step S104). When it is determined that the hand of the worker H has left the pallet P (YES in step S110) but the second elapsed time is shorter than the allowable time relative to the standard time of the work, an alert indicating that the work was performed too quickly may also be displayed and recorded.
 When the hand of the worker H has not entered the predetermined pallet (NO in step S106) and the work management unit 12 determines that the first elapsed time has exceeded the predetermined value (YES in step S107), it determines that the worker H is hesitating. In this case, the work management unit 12 records delay log information D5 as a delay, displays the delay information on the screen (step S113), resets the first elapsed time (step S114), and returns to the determination of whether a hand has entered the pallet P (step S106). FIG. 10 is a diagram illustrating an example of the display screen, showing a state in which the alert "Error: this work has not been performed" is displayed in the display area A6.
 Returning to FIG. 7, when the work management unit 12 determines that the hand of the worker H has entered the predetermined pallet (YES in step S106) but that the hand does not match the hand associated with the work (NO in step S108), it determines that a work error (work mistake) has occurred. In this case, the work management unit 12 records error log information D6 as an error (step S115), displays the error information on the screen (step S116), resets the first elapsed time (step S117), and returns to the determination of whether a hand has entered the pallet P (step S106). FIG. 11 is a diagram illustrating an example of the display screen, showing a state in which the alert "Error: you are using the right hand" is displayed in the display area A6.
 Returning to FIG. 7, when the work management unit 12 determines that the hand of the worker H has not left the pallet P (NO in step S110) and that the second elapsed time is not within the allowable range (NO in step S111), it records delay log information D5 as a delay (step S118), displays the delay information on the screen (step S119), and proceeds to the reset of the first elapsed time (step S117).
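 Taken together, steps S104 to S119 form a per-work control loop driven by two timers and the hand entry/exit events. The following sketch condenses that flow; the names (run_work, get_hand_event, the logging callbacks) and the threshold values are stand-ins for illustration, and the work object is assumed to carry the hand designated for it.

    import time

    def run_work(work, get_hand_event, log_delay, log_error, show_alert,
                 start_limit=10.0, stay_limit=30.0):
        """One pass of the FIG. 7 loop for a single work.
        get_hand_event() returns ('entered', hand), ('left', hand), or None.
        start_limit and stay_limit stand in for the first and second
        elapsed-time thresholds (assumed values)."""
        t_first = time.monotonic()                  # S105: first elapsed time
        while True:
            event = get_hand_event()
            if event is None:                       # S106 NO
                if time.monotonic() - t_first > start_limit:       # S107 YES
                    log_delay(work, "no hand entered the pallet")  # S113
                    show_alert("Error: this work has not been performed")
                    t_first = time.monotonic()      # S114: reset
                continue
            kind, hand = event
            if kind != "entered":
                continue
            if hand != work.required_hand:          # S108 NO
                log_error(work, "wrong hand: " + hand)             # S115, S116
                show_alert("Error: you are using the " + hand + " hand")
                t_first = time.monotonic()          # S117: reset
                continue
            t_second = time.monotonic()             # S109: second elapsed time
            while True:
                event = get_hand_event()
                if event and event[0] == "left":    # S110 YES
                    return                          # S112: proceed to next work
                if time.monotonic() - t_second > stay_limit:       # S111 NO
                    log_delay(work, "hand stayed in the pallet too long")  # S118, S119
                    t_first = time.monotonic()      # S117: reset
                    break                           # back to S106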
 <Example of expert image superimposed display processing>
 FIG. 12 is a flowchart illustrating an example of expert image superimposed display processing performed by the expert image superimposed display unit 13 of the information processing apparatus 1. The expert image superimposed display processing may be performed on its own or in parallel with the work management processing illustrated in FIG. 7.
 In FIG. 12, when the processing is performed on its own, the expert image superimposed display unit 13 acquires the situation (step S21), detects the work start trigger, and starts the substantive processing (step S22). These steps correspond to steps S101 to S103 in FIG. 7; when there are a plurality of pieces of expert work information or error range information that may be superimposed, information specifying which of them to use for the processing (an expert work information ID and an error range information ID) is added to the situation. When the processing is performed in parallel with the processing in FIG. 7, the processing starts from step S23 after step S103 in FIG. 7. In step S103 in FIG. 7, the first work of each process is started, and the next work is started when the previous work is determined to have been completed by motion detection (such as a hand leaving the pallet).
 In FIG. 12, the expert image superimposed display unit 13 acquires the expert work information D7 and the error range information D8 for the work (step S23).
 Next, the expert image superimposed display unit 13 adjusts the trunk axes of the expert work information D7 and the worker work information (captured image and skeleton information) (step S24).
 FIG. 13 is a flowchart illustrating a processing example of the trunk axis adjustment (step S24 in FIG. 12). In FIG. 13, the expert image superimposed display unit 13 acquires the trunk axis of the expert in two-dimensional coordinates from the expert work information D7 (step S241).
 Next, the expert image superimposed display unit 13 acquires the trunk axis of the worker in two-dimensional coordinates (step S242).
 FIG. 14 is a diagram illustrating an example of acquiring the trunk axis; for each of the worker H, whose outline is drawn with a solid line, and the expert SH, whose outline is drawn with a broken line, the line passing through the head, the center of the shoulders, the center of the spine, and the center of the hips is acquired as the trunk axis.
 Next, returning to FIG. 13, the expert image superimposed display unit 13 calculates the deviation between the two trunk axes (step S243) and rotates the video display of the expert so as to correct the deviation (step S244).
 FIG. 15 is a diagram illustrating an example of a screen in which the image of the expert SH is superimposed on the image of the worker H. Here, the image of the expert SH is displayed with high transparency so as not to obstruct the visibility of the image of the worker H. Because the trunk axes are adjusted to coincide as closely as possible, differences in the parts to be watched, such as hand movements, are easy to recognize even when there are differences in standing position and the like.
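 One way to realize steps S241 to S244 is to fit a line through the head, shoulder-center, spine-center, and hip-center joints of each person and rotate the expert's frame by the angle between the two lines. The sketch below makes those choices concrete; the least-squares fit via SVD and the OpenCV rotation are illustrative assumptions, not the only possible realization.

    import math
    import numpy as np
    import cv2  # assumed available for the image rotation

    def trunk_axis_angle(joints: np.ndarray) -> float:
        """joints: (N, 2) array of the 2D positions of the head, shoulder
        center, spine center, and hip center. Returns the angle (radians)
        of the least-squares line through them, i.e. the trunk axis."""
        centered = joints - joints.mean(axis=0)
        _, _, vt = np.linalg.svd(centered)   # principal direction = axis
        dx, dy = vt[0]
        return math.atan2(dy, dx)

    def align_expert_frame(expert_frame, expert_joints, worker_joints):
        """Rotate the expert's frame so that the two trunk axes coincide."""
        diff = trunk_axis_angle(worker_joints) - trunk_axis_angle(expert_joints)
        h, w = expert_frame.shape[:2]
        m = cv2.getRotationMatrix2D((w / 2, h / 2), math.degrees(diff), 1.0)
        return cv2.warpAffine(expert_frame, m, (w, h))

 The rotated frame can then be blended over the worker's frame with high transparency, as in FIG. 15.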
 Next, returning to FIG. 12, the expert image superimposed display unit 13 adjusts the timing between the expert work information D7 and the worker work information (step S25).
 FIG. 16 is a flowchart illustrating a processing example of the timing adjustment (step S25 in FIG. 12). In FIG. 16, the expert image superimposed display unit 13 compares the current work IDs of the worker and the expert (step S251) and determines from the work IDs whether they are performing the same work (step S252). When it determines that they are performing the same work (YES in step S252), the timing adjustment ends.
 When the expert image superimposed display unit 13 determines that they are not performing the same work (NO in step S252), it determines from the order of the work IDs whether the expert is ahead in the work (step S253).
 When the expert image superimposed display unit 13 determines that the expert is ahead (YES in step S253), it pauses the playback of the expert's work video to delay the timing (step S254) and returns to the comparison of the work IDs (step S251).
 When the expert image superimposed display unit 13 determines that the expert is not ahead (that is, the worker is ahead) (NO in step S253), it advances the expert's work video to the worker's current work (step S255) and returns to the comparison of the work IDs (step S251).
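 In code form, the timing adjustment of FIG. 16 is a small synchronization rule over the two current work IDs. The sketch below assumes a player object for the expert video with pause, resume, and seek operations, and a table mapping each work ID to its first frame (derived from the frame/work correspondence information in D7); all of these names are placeholders.

    def adjust_timing(worker_work_id, expert_player, frame_for_work):
        """Keep the expert video on the same work as the worker.
        expert_player: assumed object with .current_work_id, .pause(),
        .resume(), and .seek(frame).
        frame_for_work: dict mapping a work ID to its first frame number."""
        expert_id = expert_player.current_work_id
        if expert_id == worker_work_id:       # S252 YES: in sync
            expert_player.resume()
            return
        if expert_id > worker_work_id:        # S253 YES: expert is ahead
            expert_player.pause()             # S254: wait for the worker
        else:                                 # S253 NO: worker is ahead
            expert_player.seek(frame_for_work[worker_work_id])   # S255
            expert_player.resume()

 The comparison assumes the work IDs are ordered according to the sequence of the works in the scenario.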
 Next, returning to FIG. 12, the expert image superimposed display unit 13 evaluates the degree of deviation between the expert work information D7 and the worker work information (step S26).
 FIG. 17 is a flowchart illustrating a processing example of the evaluation of the degree of deviation (step S26 in FIG. 12). In FIG. 17, the expert image superimposed display unit 13 acquires the joint positions of the expert and the worker (step S261).
 Next, the expert image superimposed display unit 13 determines whether each joint position of the worker falls within a predefined error range (step S262). FIG. 18 is a diagram illustrating an example of applying the error range: a sphere whose radius is the error range is set around the position a of the expert's joint, and whether the corresponding joint position of the worker falls inside the sphere is determined. The position b is inside the sphere and is therefore determined to be within the error range, while the position c is outside the sphere and is therefore determined to exceed the error range.
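 The test illustrated in FIG. 18 is a Euclidean distance comparison in three dimensions: a worker joint is within the error range when its distance from the corresponding expert joint does not exceed the radius. A minimal sketch, assuming joints are given as (x, y, depth) triples with illustrative values:

    import math

    def within_error_range(expert_joint, worker_joint, radius):
        """True when worker_joint lies inside the sphere of the given
        radius centered on expert_joint (positions a, b, c in FIG. 18)."""
        return math.dist(expert_joint, worker_joint) <= radius

    # Usage with the FIG. 18 notation (coordinates are made up):
    a = (100.0, 200.0, 800.0)   # expert joint
    b = (105.0, 198.0, 810.0)   # inside  -> within the error range
    c = (160.0, 240.0, 900.0)   # outside -> exceeds the error range
    print(within_error_range(a, b, radius=30.0))   # True
    print(within_error_range(a, c, radius=30.0))   # False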
 Next, returning to FIG. 17, when the joint position is within the error range (YES in step S263), the expert image superimposed display unit 13 ends the evaluation.
 When the joint position is not within the error range (NO in step S263), the expert image superimposed display unit 13 marks the frame (a frame of the moving image) as a deviating frame (step S264). The expert image superimposed display unit 13 then highlights the range between the deviating joint and the adjacent joint (step S265).
 FIG. 19 is a diagram illustrating an example of displaying or reporting the deviation. Because the positions of the right hand and the left hand deviate greatly from those of the expert, the hand joints are highlighted (drawn with bold lines in the figure) and information about the deviation is displayed. The display may be performed in real time, or may be saved as a report for later confirmation. When the deviating part is a part other than the hands, such as the head, shoulders, or back, the report may also mention that the posture is not appropriate.
 Next, returning to FIG. 12, the expert image superimposed display unit 13 saves the deviation information (step S27). When the work has not been finished (NO in step S28), the processing returns to the trunk axis adjustment (step S24); when the work has been finished (YES in step S28), the processing ends. When saving the deviation information (step S27), the superimposed moving image of the period in which the deviation occurred may be cut out and saved separately so that it can be played back later.
 Although the case of performing the processing based on the captured image and skeleton information of the worker acquired in real time has been described, the processing may instead be performed based on already recorded captured images and skeleton information of the worker.
 As described above, this processing can operate in parallel with the basic work management processing illustrated in FIG. 7. In that case, instead of always performing the superimposed display with the expert image, the superimposed display may be performed only when an error or the like occurs in the basic work management processing.
 <Example of work evaluation processing>
 FIG. 20 is a flowchart illustrating an example of work evaluation processing performed by the work evaluation unit 14 of the information processing apparatus 1. The work evaluation processing may be performed on its own, or in parallel with the work management processing illustrated in FIG. 7 or the expert image superimposed display processing illustrated in FIG. 12.
 In FIG. 20, when the processing is performed on its own, the work evaluation unit 14 acquires the situation (step S31), detects the work start trigger, and starts the substantive processing (step S32). These steps correspond to steps S101 to S103 in FIG. 7; when there are a plurality of pieces of model trajectory information or allowable range information that may serve as the reference for evaluation, information specifying which of them to use for the processing (a model trajectory information ID and an allowable range information ID) is added to the situation. When the processing is performed in parallel with the processing in FIG. 7, the processing starts from step S33 after step S103 in FIG. 7. In step S103 in FIG. 7, the first work of each process is started, and the next work is started when the previous work is determined to have been completed by motion detection (such as a hand leaving the pallet).
 In FIG. 20, the work evaluation unit 14 acquires the model trajectory information D10 specified in the situation (step S33) and acquires the trajectory information of the worker (step S34). The model trajectory information is, in principle, hand trajectory information, and the worker trajectory compared with it is likewise hand trajectory information; when there is another body part whose motion should be evaluated, trajectory information for that part may be added. For parts such as the head, shoulders, and back, for example, whether the posture is appropriate can be evaluated from the divergence from the model trajectory.
 Next, the work evaluation unit 14 determines the degree of coincidence between the model trajectory information and the worker trajectory information (step S35).
 FIG. 21 is a flowchart illustrating a processing example of the determination of the degree of coincidence of the trajectories (step S35 in FIG. 20). In FIG. 21, the work evaluation unit 14 compares the model trajectory information with the joint positions of the worker (step S351).
 Next, when a joint position of the worker deviates from the allowable range of the model trajectory information, the work evaluation unit 14 marks the frame as a mismatched frame (step S352). FIG. 22 is a diagram illustrating an example of applying the allowable range: a sphere whose radius is the allowable range is set around the position d of the model trajectory in each frame, and whether the trajectory position of the worker falls inside the sphere is determined.
 Next, returning to FIG. 20, when the current work has not been completed (NO in step S36), the work evaluation unit 14 returns to the determination of the degree of coincidence between the model trajectory information and the worker trajectory information (step S35).
 When the current work has been completed (YES in step S36), the work evaluation unit 14 calculates the degree of mismatch of the trajectories (step S37).
 FIG. 23 is a flowchart illustrating a processing example of the calculation of the degree of mismatch of the trajectories (step S37 in FIG. 20). In FIG. 23, the work evaluation unit 14 calculates, as the degree of mismatch,
     (number of mismatched frames) / (number of frames in the entire work)
 (step S371). The number of mismatched frames is the number of frames marked as mismatched frames.
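 Combining the per-frame allowable-range test of FIG. 22 with the ratio of step S371 gives a compact evaluation routine. The sketch below assumes both trajectories are sampled per frame as (x, y, depth) points and aligned frame by frame; the names are illustrative.

    import math

    def mismatch_rate(model_traj, worker_traj, allowable_radius):
        """model_traj, worker_traj: per-frame lists of (x, y, depth) points.
        A frame is mismatched when the worker point falls outside the
        sphere of allowable_radius around the model point (FIG. 22).
        Returns mismatched frames / total frames (step S371)."""
        total = min(len(model_traj), len(worker_traj))
        mismatched = sum(
            1 for d, w in zip(model_traj, worker_traj)
            if math.dist(d, w) > allowable_radius   # step S352: mark frame
        )
        return mismatched / total if total else 0.0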
 Although the case of comparing trajectory positions frame by frame in consideration of the allowable range has been described, a tube-shaped region reflecting the allowable range may instead be set around the model trajectory, and the evaluation may be performed, without considering the time taken by the movement, according to whether the worker trajectory stays within that region. Alternatively, the time axis of either the model trajectory or the worker trajectory may be expanded or contracted, and the evaluation may take both positional coincidence and temporal coincidence into account.
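 The tube-shaped variant can be read as requiring every worker point to lie within the allowable radius of some point on the model polyline, regardless of timing. The following is a minimal sketch of that reading, with the point-to-segment distance computed explicitly; it is one possible interpretation, not a definitive implementation of the embodiment.

    import math

    def point_segment_dist(p, a, b):
        """Distance from 3D point p to the segment from a to b."""
        ab = tuple(bi - ai for ai, bi in zip(a, b))
        ap = tuple(pi - ai for ai, pi in zip(a, p))
        denom = sum(c * c for c in ab)
        t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(u * v for u, v in zip(ap, ab)) / denom))
        closest = tuple(ai + t * ci for ai, ci in zip(a, ab))
        return math.dist(p, closest)

    def inside_tube(worker_traj, model_traj, radius):
        """True when every worker point stays within `radius` of the
        model polyline (the tube-shaped region), ignoring timing."""
        if len(model_traj) < 2:
            return all(math.dist(p, model_traj[0]) <= radius for p in worker_traj)
        segments = list(zip(model_traj, model_traj[1:]))
        return all(
            min(point_segment_dist(p, a, b) for a, b in segments) <= radius
            for p in worker_traj
        )

 For the variant that scales the time axis, a standard choice would be dynamic time warping between the two trajectories, which evaluates positional and temporal coincidence together; the embodiment leaves the concrete method open.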
 Next, returning to FIG. 20, the work evaluation unit 14 saves the calculated degree of mismatch (step S38) and ends the processing. The saved degree of mismatch is used, as a report, as evaluation information about the worker. The evaluation of posture is useful for guidance on performing the work safely, and for similar purposes.
 <Summary>
 As described above, the present embodiment has the following advantages.
 (1) Work can be monitored even when the position serving as the reference for monitoring varies.
 (2) The effect of learning the work can be enhanced.
 (3) The work actually performed can be evaluated appropriately.
 The present invention has been described above by way of preferred embodiments. Although the present invention has been described with reference to specific examples, various modifications and changes can obviously be made to these examples without departing from the broad spirit and scope of the present invention as defined in the claims. In other words, the present invention should not be construed as being limited by the details of the specific examples and the accompanying drawings.
 DESCRIPTION OF SYMBOLS
 1 Information processing apparatus
 11 Motion sensing processing unit
 12 Work management unit
 13 Expert image superimposed display unit
 14 Work evaluation unit
 D1 Work information
 D2 Pallet information
 D3 Marker information
 D4 Worker information
 D5 Delay log information
 D6 Error log information
 D7 Expert work information
 D8 Error range information
 D9 Deviation information
 D10 Model trajectory information
 D11 Allowable range information
 D12 Evaluation information
 101 Bus
 102 CPU
 103 ROM
 104 RAM
 105 NVRAM
 106 I/F
 107 I/O
 108 HDD/flash memory
 109 NIC
 110 Monitor
 111 Keyboard
 112 Mouse
 2 Motion sensing device
 3 Display/speaker
 H Worker
 P Pallet
 M Marker
 SH Expert

Claims (18)

  1. A method of displaying a work situation imaged using an imaging device, wherein a computer executes processing of:
     identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario; and
     displaying, superimposed on each other, an image stored in association with the work defined in the work scenario as the work subsequent to the completed work, and the captured image after the detection of the motion.

  2. The display method according to claim 1, wherein the image and the captured image are moving images.

  3. The display method according to claim 1, wherein the image is displayed with a higher transparency than the captured image.

  4. The display method according to claim 1, wherein the displaying controls a display position of the image stored in association with the work according to a position of a detection target of the motion.

  5. The display method according to claim 4, wherein the position of the detection target of the motion is a position of a trunk axis of a person whose motion is detected.

  6. The display method according to claim 1, wherein the image is a moving image, and wherein a completion timing of a work or a start timing of a new work is identified according to the motion detected based on the captured image, and the image, being a moving image, is played back at the completion timing or the start timing.

  7. The display method according to claim 1, wherein the computer further executes processing of:
     detecting a deviation between a contour, skeleton, or trunk axis of a person who is a detection target of the motion and a contour, skeleton, or trunk axis of a person included in the image; and
     performing a display indicating occurrence of the deviation according to the detected deviation.

  8. A method of displaying a work situation imaged using an imaging device, wherein a computer executes processing of:
     identifying a target work among works included in a predetermined work scenario; and
     displaying, superimposed on each other, an image stored in association with the target work and a captured image.

  9. A work monitoring result output method, wherein a computer executes processing of:
     identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario;
     detecting a deviation between a position of a person who is a detection target of the motion and a position of a person included in an image, based on the image stored in association with the work defined in the work scenario as the work subsequent to the completed work and the captured image after the detection of the motion; and
     outputting timing information of when the deviation was detected, or the captured image after the detection of the motion at the time the deviation was detected.

  10. A work monitoring result output method, wherein a computer executes processing of:
     identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario;
     detecting a deviation between a contour, skeleton, or trunk axis of a person who is a detection target of the motion and a contour, skeleton, or trunk axis of a person included in an image, based on the image stored in association with the work defined in the work scenario as the work subsequent to the completed work and the captured image after the detection of the motion; and
     outputting timing information of when the deviation was detected, or the captured image after the detection of the motion at the time the deviation was detected.

  11. An information processing apparatus for displaying a work situation imaged using an imaging device, the apparatus comprising:
     means for identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario; and
     means for displaying, superimposed on each other, an image stored in association with the work defined in the work scenario as the work subsequent to the completed work, and the captured image after the detection of the motion.

  12. An information processing apparatus for displaying a work situation imaged using an imaging device, the apparatus comprising:
     means for identifying a target work among works included in a predetermined work scenario; and
     means for displaying, superimposed on each other, an image stored in association with the target work and a captured image.

  13. An information processing apparatus comprising:
     means for identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario;
     means for detecting a deviation between a position of a person who is a detection target of the motion and a position of a person included in an image, based on the image stored in association with the work defined in the work scenario as the work subsequent to the completed work and the captured image after the detection of the motion; and
     means for outputting timing information of when the deviation was detected, or the captured image after the detection of the motion at the time the deviation was detected.

  14. An information processing apparatus comprising:
     means for identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario;
     means for detecting a deviation between a contour, skeleton, or trunk axis of a person who is a detection target of the motion and a contour, skeleton, or trunk axis of a person included in an image, based on the image stored in association with the work defined in the work scenario as the work subsequent to the completed work and the captured image after the detection of the motion; and
     means for outputting timing information of when the deviation was detected, or the captured image after the detection of the motion at the time the deviation was detected.

  15. A display program for a work situation imaged using an imaging device, the program causing a computer to execute processing of:
     identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario; and
     displaying, superimposed on each other, an image stored in association with the work defined in the work scenario as the work subsequent to the completed work, and the captured image after the detection of the motion.

  16. A display program for a work situation imaged using an imaging device, the program causing a computer to execute processing of:
     identifying a target work among works included in a predetermined work scenario; and
     displaying, superimposed on each other, an image stored in association with the target work and a captured image.

  17. A display program causing a computer to execute processing of:
     identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario;
     detecting a deviation between a position of a person who is a detection target of the motion and a position of a person included in an image, based on the image stored in association with the work defined in the work scenario as the work subsequent to the completed work and the captured image after the detection of the motion; and
     outputting timing information of when the deviation was detected, or the captured image after the detection of the motion at the time the deviation was detected.

  18. A display program causing a computer to execute processing of:
     identifying, according to a motion detected based on a captured image, a completed work among works included in a predetermined work scenario;
     detecting a deviation between a contour, skeleton, or trunk axis of a person who is a detection target of the motion and a contour, skeleton, or trunk axis of a person included in an image, based on the image stored in association with the work defined in the work scenario as the work subsequent to the completed work and the captured image after the detection of the motion; and
     outputting timing information of when the deviation was detected, or the captured image after the detection of the motion at the time the deviation was detected.
PCT/JP2015/071170 2015-07-24 2015-07-24 Display method, monitor result output method, information processing device, and display program WO2017017737A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/071170 WO2017017737A1 (en) 2015-07-24 2015-07-24 Display method, monitor result output method, information processing device, and display program

Publications (1)

Publication Number Publication Date
WO2017017737A1 true WO2017017737A1 (en) 2017-02-02

Family

ID=57884271

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/071170 WO2017017737A1 (en) 2015-07-24 2015-07-24 Display method, monitor result output method, information processing device, and display program

Country Status (1)

Country Link
WO (1) WO2017017737A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011034234A (en) * 2009-07-30 2011-02-17 Kozo Keikaku Engineering Inc Movement analysis device, movement analysis method and movement analysis program
JP2011164694A (en) * 2010-02-04 2011-08-25 Nec Corp Device and method for support of standard operation execution
WO2012039467A1 (en) * 2010-09-22 2012-03-29 パナソニック株式会社 Exercise assistance system
WO2013105443A1 (en) * 2012-01-13 2013-07-18 ソニー株式会社 Information processing device and information processing method, as well as computer program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MICHIHIKO GOTO: "AR-Based Supporting System by Overlay Display of Instruction Video", THE JOURNAL OF THE INSTITUTE OF IMAGE ELECTRONICS ENGINEERS OF JAPAN, vol. 39, no. 5, 25 September 2010 (2010-09-25), pages 631-643 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15899571

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15899571

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP