WO2019087870A1 - Work assistance device, work assistance method, and program - Google Patents

Work assistance device, work assistance method, and program Download PDF

Info

Publication number
WO2019087870A1
WO2019087870A1 (PCT/JP2018/039376)
Authority
WO
WIPO (PCT)
Prior art keywords
image
work
worker
unit
wearable terminal
Prior art date
Application number
PCT/JP2018/039376
Other languages
French (fr)
Japanese (ja)
Inventor
友昭 牧野
Original Assignee
Necフィールディング株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Necフィールディング株式会社
Priority to CN201880070359.2A (published as CN111328402A)
Priority to JP2019551164A (published as JP6912150B2)
Publication of WO2019087870A1

Links

Images

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Definitions

  • the present invention is based upon and claims the benefit of the priority of Japanese Patent Application No. 2017-209179 (filed on October 30, 2017), the entire contents of which are incorporated herein by reference.
  • the present invention relates to a work support device, a work support method and a program.
  • Patent Document 1 discloses a maintenance support system that can reduce the burden on workers and save labor in the maintenance and inspection of field equipment such as power stations and substations.
  • the document describes providing a server 10 having a plurality of databases holding information on maintenance and inspection work.
  • the portable terminal 6 carried by the worker has a function of photographing, with the camera 2 attached to the helmet 1 worn by the worker, the progress of each predetermined step of the maintenance work and transmitting it to the server 10.
  • the server 10 then extracts the data necessary for the work from the databases according to the progress photographed by the camera 2 and provides the information necessary for the work on the display 5 attached to the helmet 1 worn by the worker.
  • Patent Document 2 discloses a work management system that can analyze and evaluate a work state, a situation of a work site, and the like without relying on a supervisor.
  • the input unit 1 of the work management system images a highly skilled worker performing the work, converts the moving image into a pattern using a known image processing method, and stores it in the storage unit 3.
  • the detection unit 2 of the work management system captures an image of a normal worker performing a work, patterns the moving image using a known image processing method, and then stores the pattern in the storage unit 3.
  • the central processing unit 4 of the work management system compares the two moving images to determine the difference in work speed of a normal worker with respect to a highly skilled worker or an error in the work procedure.
  • Patent Document 1 only states comprehensively that the main control unit 11 of the server 10 plays the role of work quality determination means that determines, from the progress of the maintenance work transmitted from the portable terminal 6 at each predetermined maintenance step, whether the maintenance work of that step is good or not.
  • in the method of Patent Document 2, the work video of a highly skilled worker is captured in advance and compared with the work video of an ordinary worker, so whether the ordinary worker's work is evaluated as good or bad depends on the work video of the highly skilled worker.
  • moreover, although the method of Patent Document 2 can be used for desk work such as that shown in FIG. 2 of the document, it may not be usable for checking visit-type or business-trip-type work, where the environment of the work site can differ each time.
  • An object of the present invention is to provide a work support apparatus, a work support method, and a program that can notify a worker of occurrence of a work process error or the like with high accuracy and contribute to improvement of the work efficiency.
  • according to a first aspect, there is provided a work support apparatus including: a storage unit that holds, in association with each other, a work including at least a first step and a second step performed after the first step, and a representative image for each step representing the work performed in the first step and in the second step; an input unit that inputs an image captured while the worker is performing the work; a confirmation unit that collates the input image with the representative images to confirm whether the work is being performed in a predetermined order; and a notification unit that notifies the worker of the result of the confirmation.
  • according to a second aspect, there is provided a work support method for a computer that can access means for holding, in association with each other, a work including at least a first step and a second step performed after the first step, and a representative image for each step representing the work performed in the first and second steps. The method includes a step of inputting an image while the worker is working, a step of collating the input image with the representative images to confirm whether the work is being performed in a predetermined order, and a step of notifying the worker of the result of the confirmation.
  • this method is tied to a specific machine, namely a computer that functions as a work support apparatus.
  • according to a third aspect, there is provided a program causing a computer, which can access means for holding, in association with each other, a work including at least a first step and a second step performed after the first step, and a representative image for each step representing the work performed in the first and second steps, to execute: a process of inputting an image while the worker is working; a process of collating the input image with the representative images to confirm whether the work is being performed in a predetermined order; and a process of notifying the worker of the result of the confirmation. This program can be recorded on a computer-readable (non-transitory) storage medium; that is, the present invention can also be embodied as a computer program product.
  • according to the present invention, it is possible to accurately notify the worker of the occurrence of a work process error or the like and to improve work efficiency. That is, the present invention transforms the work support device described in the background art into one whose function of checking for work process errors is dramatically improved.
  • connection lines between blocks in the drawings referred to in the following description include both bidirectional and unidirectional lines.
  • unidirectional arrows schematically indicate the flow of main signals (data) and do not exclude bidirectionality.
  • ports or interfaces exist at the input/output connection points of each block in the drawings, but they are not shown.
  • the present invention can be realized by a work support apparatus 10a including a storage unit 11a, an input unit 12a, a confirmation unit 13a, and a notification unit 14a, as shown in FIG.
  • the storage unit 11a holds, in association with each other, a “work” including at least a first step and a second step performed after the first step, and a representative image for each step representing the work performed in the first step and in the second step.
  • the input unit 12a inputs an image in a process in which the worker is performing the selected work.
  • the confirmation unit 13a collates the input image with the representative image to confirm whether the work is performed in a predetermined order.
  • the notification unit 14a notifies the worker of the result of the confirmation.
  • the work support apparatus 10a shown in FIG. 1 can also be realized by a computer 10b as shown in FIG.
  • the memory 11b functions as the storage unit 11a described above, and the input unit 12b functions as the input unit 12a.
  • the memory 11b also includes auxiliary storage devices such as various disk drives.
  • the input unit 12b may be a camera that captures an image in the process of working, an input interface that receives an image from such a camera, or the like.
  • the confirmation unit 13a and the notification unit 14a can be realized by the processor 13b by the above-described confirmation processing of the confirmation unit 13a and a computer program that causes the output unit 14b to output (notify) the result.
  • FIG. 3 shows an example of a representative image stored in the storage unit 11a in this case.
  • as a representative image of the first step, an image showing the subwindow displayed on the monitor during uninstallation is stored.
  • as a representative image of the second step, an image showing the subwindow displayed on the monitor during installation is stored.
  • the confirmation unit 13a collates the input image with these representative images to confirm whether the work is performed in a predetermined order. For example, when the input image matches the representative image of the first step, the confirmation unit 13a determines that the work has been correctly started. Thereafter, when an image matching the representative image of the second step is input, the confirmation unit 13a determines that the work has been correctly performed.
  • in this case, the confirmation unit 13a may cause the notification unit 14a to notify the worker that the work has been correctly performed. By doing this, the worker can confirm that the application installation work has been carried out correctly. Even if a problem later occurs in the operation of the application program, the worker can recognize that the problem lies elsewhere, because it has been confirmed that the installation work itself was performed correctly.
  • on the other hand, when an image matching the representative image of the second step is input before the first step, the confirmation unit 13a determines that the steps have been performed in the wrong order. In this case, it is preferable that the confirmation unit 13a cause the notification unit 14a to notify the worker that the work has not been performed correctly. By doing this, the worker can recognize that the application installation work was wrong, and the program installation can be redone before the next work or step. At that time, the confirmation unit 13a may also cause the notification unit 14a to notify the content of a specific corrective operation. For example, in the case of FIG. 4, it is conceivable that the confirmation unit 13a notifies the worker that, after uninstalling the V1.1 application and the V1.0 application, the installation of V1.1 should be performed.
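  • to make this order check concrete, the following is a minimal Python sketch (an illustration only; the patent prescribes no implementation, and `matches` and `notify` are assumed placeholders):

```python
# Minimal sketch of the confirmation unit 13a (illustrative only).
# `matches(a, b) -> bool` stands in for any image-matching method,
# e.g. the feature-point matching described later in this document.

def confirm_order(input_images, representative_images, matches, notify):
    """Check that the per-step representative images appear in order."""
    expected = 0  # index of the step whose image we expect next
    for image in input_images:
        for step, rep in enumerate(representative_images):
            if matches(image, rep):
                if step == expected:
                    notify(f"Step {step + 1} confirmed: correct order.")
                    expected += 1
                elif step > expected:
                    # A later step's image appeared first, e.g. V1.1
                    # installed before V1.0 was uninstalled (FIG. 4).
                    notify(f"Order error: step {step + 1} detected "
                           f"before step {expected + 1}.")
                    return False
                break
        if expected == len(representative_images):
            notify("All steps completed in the correct order.")
            return True
    return False
```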
  • FIG. 5 is a diagram showing the configuration of the work support apparatus of the first embodiment of the present invention.
  • FIG. 5 shows a configuration in which a glasses-type wearable terminal (worker-attached terminal) 100 worn by a worker 100a and a server 200 are connected via a network 101.
  • in the present embodiment, a wearable terminal is described as the work image input device, but the present invention is not limited to a wearable terminal as the work image input device.
  • a smartphone or another business terminal equipped with a camera may also be used as the input device for work images.
  • alternatively, the camera itself may be an independent unit, and images captured by the camera may be transmitted to the work support apparatus via the wearable terminal, a smartphone, a business terminal, or the like.
  • as the wearable terminal, a glasses-type terminal (what is called a smart glass) or a headset-type terminal can be used.
  • as the connection form (communication system) between the wearable terminal 100 and the network 101, various systems can be used; for example, connection via a mobile communication network or connection by wireless LAN (Local Area Network) is conceivable.
  • the wearable terminal 100 and the server 200 may also be connected to the network 101 via near field wireless communication such as Bluetooth (registered trademark) or infrared communication.
  • FIG. 6 is a diagram showing a configuration of the wearable terminal of the first embodiment of the present invention.
  • FIG. 6 shows a wearable terminal 100 including a control unit 102, a communication unit 103, an image input unit 104, a voice input unit 105, an input unit 106, a display unit 107, a voice output unit 108, a clock unit 109, and a storage unit 110.
  • the communication unit 103 performs data communication between the wearable terminal 100 and the server 200 via the network 101.
  • the image input unit 104 can be configured, for example, by an imaging device (camera). Specifically, in response to an instruction from the control unit 102, the image input unit 104 captures an image of the area visually recognized by the worker (a moving image, or still images captured at predetermined intervals; the imaging time interval may be wider than that of a moving image). The image input unit 104 stores the captured images in the process image DB 111 of the storage unit 110.
  • the image input unit 104 does not necessarily have to be an imaging device (camera), and may be an interface to which an imaging device (camera) or the like can be connected.
  • the voice input unit 105 can be configured by a microphone.
  • a voice instruction may be received from the operator via the voice input unit 105 to control the operation of the wearable terminal 100.
  • a sound emitted from the work target such as a specific device or a sound around the worker may be taken from the voice input unit 105 to confirm whether the work is performed correctly.
  • the input unit 106 includes an information input unit such as a touch panel, a slide pad, a mouse, and a keyboard.
  • An instruction may be received from the worker via the input unit 106, and the operation of the wearable terminal 100 may be controlled.
  • information input by the worker via the input unit 106 (for example, correction data of the process image DB) may be stored in the process image DB 111 or the like of the storage unit 110.
  • the voice input unit 105 and the input unit 106 described above can be omitted as appropriate. For example, when instructions from the worker are received by recognizing the worker's gestures via the image input unit 104, the voice input unit 105 and the input unit 106 can be omitted.
  • the input of information such as an image or voice in the image input unit 104, the voice input unit 105, and the input unit 106 described above is performed at the timing instructed by the control unit 102.
  • the display unit 107 is configured by a liquid crystal display or an organic EL (Electro Luminescence) display that displays an output image (or a moving image) according to an instruction from the control unit 102.
  • the voice output unit 108 outputs voice according to an instruction from the control unit 102. For example, when it is necessary to obtain an operation sound during an operation performed on a device being worked on, the control unit 102 may cause the display unit 107 or the voice output unit 108 to output a message (voice guidance or the like) prompting the worker to perform the required operation. Instead of the voice output unit 108, a vibration motor or other vibration-generating device, or a lamp, can also be used as the output means for alerting the worker.
  • the clock unit 109 measures the required time (elapsed time) from the start of work to each process. By comparing the measured required time with the standard time or the maximum allowable time, it is possible to grasp the presence or absence of a delay in work, the presence or absence of a trouble, and the like.
  • the control unit 102 controls the respective units of the wearable terminal 100 described above, collates the image input from the image input unit 104 with the representative image or the error image, and performs work progress management (see the flowchart of FIG. 10). Further, the control unit 102 according to the present embodiment manages the progress of the work by comparing the required time (elapsed time) of each process with a separately determined standard time or a maximum allowable time. Further, the control unit 102 of the present embodiment has a function of outputting a warning to the worker when a predetermined warning activation condition is satisfied. Such a control unit 102 can also be realized by reading a program 112 for controlling each unit of the wearable terminal 100 from the storage unit 110 and causing the processor to execute the program.
  • the storage unit 110 functions as a process image database (process image DB) 111.
  • the process image information received from the server 200 is registered in the process image DB 111. The process image information will be described in detail later.
  • FIG. 7 is a diagram showing the configuration of the server 200 according to the first embodiment of this invention.
  • a server 200 including a control unit 202, a communication unit 203, an image input unit 204, an audio input unit 205, an input unit 206, a display unit 207, an audio output unit 208, and a storage unit 209 is shown.
  • the communication unit 203 performs data communication between the wearable terminal 100 and the server 200 via the network 101.
  • the image input unit 204 is an interface that receives, from the wearable terminal 100 side, images of the area visually recognized by the worker (a moving image, or still images captured at predetermined intervals; the imaging time interval may be wider than that of a moving image).
  • the voice input unit 205 is an interface that receives voice data from the wearable terminal 100 side.
  • a voice instruction may be received from the worker via the voice input unit 205 to control the operation of the server 200.
  • the sound emitted by the work target such as a specific device or the sound around the worker may be taken in from the voice input unit 205 to check whether the work is properly performed.
  • the input unit 206 is configured by an information input unit such as a touch panel, a slide pad, a mouse, and a keyboard. An instruction may be received from the worker via the input unit 206 to control the operation of the server 200. In addition, information (for example, correction data of the process image DB) input by the worker via the input unit 206 may be stored in the process image DB 211 or the like of the storage unit 209.
  • the display unit 207 is configured by a liquid crystal display or an organic EL display that outputs a message or the like necessary for the operator on the server 200 side, and displays input / output contents.
  • in response to an instruction from the control unit 202, the voice output unit 208 outputs a message (voice guidance or the like) prompting the server-side operator to perform a necessary operation. Instead of the voice output unit 208, a lamp or the like that alerts the operator with light can be used as an output means.
  • the control unit 202 controls each unit of the server 200 described above.
  • the control unit 202 can also be realized by reading a program (not shown) for controlling each unit of the server 200 from the storage unit 209 and causing the processor to execute the program.
  • the control unit 202 may also perform the work progress management processing (see the flowchart of FIG. 10). Specifically, the control unit 202 collates the image received from the wearable terminal 100 with a representative image or an error image and manages the progress of the work.
  • the control unit 202 may record the elapsed time since the start of the work, and determine whether there is a delay in the work by comparison with a separately determined standard required time.
  • work information and process image information are registered in the storage unit 209, which functions as a work information database (work information DB) 210 and a process image database (process image DB) 211, respectively.
  • FIG. 8 shows an example of the information held in the work information DB 210. Referring to FIG. 8, entries are shown in which a work No. is uniquely assigned to each “work” specified by a device name, a work type, and fault content.
  • when a transmission request for a process image is received from the wearable terminal 100, the work information DB 210 is used to identify the work No. from the input work content. For example, when a worker repairs the failure of a personal computer that does not start up, the work is identified as work No. 1.
  • the content of the work is not limited to the example of FIG. 8, and information for specifying various “work” to which the present invention can be applied may be registered.
  • a device removal work, a periodic inspection work, and the like may also be registered in the work information DB 210.
  • for example, in the case of an oil change for an automobile, information indicating the model name and model number of the automobile is set in the device field, and “oil change” is set in the work type field (the fault content field may be blank, as in the server installation of No. 4 in FIG. 8, or the necessity of oil element replacement or the like may be set).
  • FIG. 9 is a diagram showing an example of information held in the process image DB 211.
  • in the process image DB 211, information for managing the progress of each step is registered in association with a representative image of each step of a “work”.
  • the example of FIG. 9 registers the information used when performing the failure repair (paper jam) of the printer identified as work No. 2 in the work information DB of FIG. 8.
  • FIG. 9 shows a state in which, of the six steps constituting work No. 2, the work has proceeded normally up to step 3 and a work error has been detected in step 4.
  • each item of FIG. 9 will be described.
  • the “representative image” field is a field for registering an image for confirming that the work has been correctly performed in each process.
  • this representative image is preferably an image captured of the same area as the image that the worker (for example, a maintenance worker of the apparatus) is assumed to visually recognize (strictly speaking, the same area as the image assumed to be input to the wearable terminal 100), so that the image matching can be easily confirmed on the wearable terminal 100. More preferably, the angle, size, number of pixels, and the like of the representative image are set to be the same as those of the image input to the wearable terminal 100.
  • it is preferable that the representative image be not an image that could also be photographed in another step, but an image obtained only in that step.
  • this representative image may be an image to be collated with a part of the image assumed to be input to the wearable terminal 100.
  • the angle, the size, the number of pixels, and the like of the image assumed to be input to the wearable terminal 100 and the representative image do not necessarily have to match.
  • Examples of representative images include an external appearance image of a hard disk drive in hard disk replacement work, and an external appearance image of a circuit board or mechanical component assuming a mounting state on a target device in a replacement operation of a circuit board or mechanical component. Further, as another example of the representative image, there is an image showing the appearance of the connector exposed to the outside of the work target device in the device connection work and the connection state thereof.
  • the “captured image” field in FIG. 9 is a field in which, among the images input from the image input unit 104 of the wearable terminal 100, an image determined to match a representative image or an error image (described later) is registered.
  • in the example of FIG. 9, the “captured image” field is provided in the process image DB 211 on the server 200 side; however, when the collation with the representative image is performed only on the wearable terminal 100 side, the “captured image” field can be omitted.
  • since steps 1 to 3 have been confirmed, an image determined to match the representative image is registered in the “captured image” field for each of those steps. Since a match with the error image was detected in step 4 of FIG. 9, an image determined to match the error image is registered there.
  • the “execution status” field in FIG. 9 is a field for recording the progress status of the process confirmed by the comparison with the representative image by the control unit 102 of the wearable terminal 100 or the control unit 202 of the server 200.
  • in the example of FIG. 9, the “execution status” field is provided in the process image DB 211 on the server 200 side; however, when the collation with the representative image is performed only on the wearable terminal 100 side, the “execution status” field of the process image DB 211 on the server 200 side can be omitted.
  • since steps 1 to 3 have been confirmed, information such as “work started” and “step confirmed” is registered in their “execution status” fields.
  • in step 4 of FIG. 9, since a match with an error image was detected, “error detected” is registered in the “execution status” field. Since step 5 and the subsequent step have not been performed, the initial value “not performed” is registered for them.
  • the “procedure information” field in FIG. 9 is a field in which the work content and the like of the step following each step are registered.
  • the content of the “procedure information” field is notified to the worker via the display unit 107 or the voice output unit 108 as necessary.
  • the “notification information” field in FIG. 9 is a field in which information to be notified to the worker when each step is reached is registered. For example, when the captured image matches the representative image and it is determined that the work has started, the worker is notified of the content registered in the notification information field. The difference from the procedure information is that, whereas the procedure information conveys the work content, the notification information carries cautions, contact matters, and alerts for preventing work errors when performing the next step.
  • moving images and figures can also be used as the procedure information and the notification information.
  • characters and figures indicating the work information and the notification information may be superimposed and displayed on the glasses-type screen of the wearable terminal 100 to support the work of the worker.
  • the "error image” field is a field for registering an image for detecting that an operation error has occurred in each process.
  • like the representative image, this error image is preferably an image captured of the same area as the image that the worker (for example, a maintenance worker of the apparatus) is assumed to visually recognize (strictly speaking, the same area as the image assumed to be input to the wearable terminal 100), so that the image matching can be easily confirmed on the wearable terminal 100. More preferably, the angle, size, number of pixels, and the like of the error image are set to be the same as those of the image input to the wearable terminal 100.
  • it is preferable that the error image be not an image that could also be photographed in another step, but an image obtained only in that step.
  • this error image may be an image to be collated with a part of the image assumed to be input to the wearable terminal 100.
  • the angle, the size, the number of pixels, and the like of the image assumed to be input to the wearable terminal 100 and the error image do not necessarily have to match.
  • the “standard time” field in FIG. 9 is a field in which the standard working time between each pair of steps is registered. By comparing the standard time with the actual working time, the progress (degree of delay) of the work can be grasped. The “maximum allowable time” field in FIG. 9 is a field in which the allowable required time of each step is registered (that is, maximum allowable time > standard time). The “elapsed time” field of FIG. 9 is a field in which the actual required time between each pair of steps is registered. In the example of FIG. 9, the standard time and the maximum allowable time of the final step, step 6, are blank; this is because the work ends upon the match with the representative image of this step, and no determination is needed.
  • the “delay determination” field in FIG. 9 is a field in which the determination result of whether the progress of each process is advanced or delayed compared to the standard required time is stored.
  • when the elapsed time is shorter than the standard time, an “early” determination is made. For example, in step 3 the elapsed time is 8 minutes against a standard time of 15 minutes, so the determination is “early”.
  • when the elapsed time exceeds the maximum allowable time, a “delayed” determination is made. For example, in steps 2 and 4 the elapsed time is longer than the maximum allowable time, so the determination is “delayed”.
  • in the example of FIG. 9, a “delayed” determination is also made when the elapsed time is shorter than the maximum allowable time but longer than the standard time.
  • in step 1, the elapsed time is 8 minutes against a standard time of 5 minutes and a maximum allowable time of 10 minutes, so the determination is “delayed”.
  • these determination criteria are merely an example and can be changed as appropriate according to the type and nature of the work.
  • the cumulative standard time or the cumulative maximum allowable time may be set, and the determination result may be obtained by comparing with the cumulative elapsed time.
  • the relationship between the standard time and the maximum allowable time can be set appropriately according to the contents of the step. Most simply, the standard time plus a predetermined time, or the standard time multiplied by a predetermined rate, may be set as the maximum allowable time.
  • the determination result or the delay amount may be notified to the operator by voice or image. As a result, the worker can grasp the delay of work and the degree thereof.
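  • as an illustration only (the patent does not define an implementation), the example criteria above can be summarized in a short Python sketch; the function name and the maximum allowable time assumed for step 3 are hypothetical:

```python
def classify_progress(elapsed_min, standard_min, max_allowed_min):
    """Classify a step's progress following the example criteria of FIG. 9."""
    if elapsed_min < standard_min:
        return "early"
    if elapsed_min > max_allowed_min:
        return "delayed (maximum allowable time exceeded)"
    return "delayed"  # over the standard time but within the allowable time

# Values from the FIG. 9 example (step 3's maximum allowable time assumed):
assert classify_progress(8, 15, 20) == "early"   # step 3
assert classify_progress(8, 5, 10) == "delayed"  # step 1
```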
  • the “warning information” field in FIG. 9 is a field in which the warning content notified to the worker is stored when the condition set in the “warning activation condition” field is satisfied.
  • in the example of FIG. 9, warning information and warning activation conditions are set for steps 2 and 4. This makes it possible to warn (notify) the worker when an error or a delay occurs in a step that is particularly important in the work.
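  • to make the structure of a FIG. 9 entry concrete, the following is a hypothetical sketch of one row of the process image DB 211; all field names are assumptions drawn from the description above:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProcessImageEntry:
    """One step's row of the process image DB 211 (hypothetical sketch)."""
    step_no: int
    representative_image: bytes           # confirms the step was done right
    error_image: Optional[bytes] = None   # indicates a work error
    captured_image: Optional[bytes] = None
    execution_status: str = "not performed"
    procedure_info: str = ""              # work content of the next step
    notification_info: str = ""           # cautions / alerts for the worker
    standard_time_min: Optional[int] = None
    max_allowed_time_min: Optional[int] = None
    elapsed_time_min: Optional[int] = None
    delay_determination: str = ""
    warning_condition: str = ""           # e.g. "error or delay detected"
    warning_info: str = ""                # message issued when triggered
```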
  • FIG. 10 is a flowchart showing the operation of the wearable terminal according to the first embodiment of this invention.
  • the worker inputs information (device name, work type, fault content, etc.) for specifying the work shown in FIG. 8 from the input unit 106 of the wearable terminal 100 (step S301).
  • the control unit 102 of the wearable terminal 100 transmits the input information to the server 200, and requests specification of a work No. and transmission of a process image corresponding to the specified work No.
  • the server 200 specifies, from the work information DB 210, a work No. corresponding to the information (apparatus name, work type, fault content, etc.) received from the wearable terminal 100.
  • the server 200 specifies data corresponding to the work No. from the process image DB 211, and transmits the data together with the work No. to the wearable terminal 100.
  • the wearable terminal 100 stores the information received from the server 200 in the process image DB 111 (step S302).
  • the control unit 102 of the wearable terminal 100 sets “1” to the variable indicating the number of steps, and starts the matching operation with the input image (step S303).
  • the control unit 102 of the wearable terminal 100 captures an image of a region visually recognized by the worker from the image input unit 104 (step S304).
  • the captured image is temporarily stored in the captured image field of the process image DB 111.
  • when the captured image does not match the representative image of the first step (step S305), the control unit 102 of the wearable terminal 100 determines that the work has not been started. Therefore, the control unit 102 of the wearable terminal 100 continues to capture images of the area visually recognized by the worker from the image input unit 104 (return to step S304). When the captured image matches the representative image of the first step, the control unit 102 recognizes that the work has started.
  • the control unit 102 of the wearable terminal 100 adds 1 to the variable indicating the number of steps (increment), and starts the matching operation between the input image and the next step (step S307).
  • the control unit 102 of the wearable terminal 100 captures an image of a region visually recognized by the worker from the image input unit 104 (step S308).
  • the captured image is temporarily stored in the captured image field of the process image DB 111.
  • the control unit 102 of the wearable terminal 100 collates the captured image with the representative image of the corresponding step number in the process image DB 111 (step S309).
  • when the captured image does not match the representative image of the corresponding step, the control unit 102 of the wearable terminal 100 collates the captured image with the representative images of all steps ahead of the current step number (step S316).
  • if the captured image matches none of those representative images (“no match” in step S316), the control unit 102 of the wearable terminal 100 returns to step S309 and continues collating the captured image with the representative image of the corresponding step number in the process image DB 111.
  • when any one of the representative images compared in step S316 matches the captured image, the control unit 102 of the wearable terminal 100 recognizes that the work is not being performed in the correct order (step S317). The control unit 102 of the wearable terminal 100 then notifies the worker of a work order error via the display unit 107 or the voice output unit 108 (step S318). At that time, the control unit 102 of the wearable terminal 100 may output the procedure information and notification information of the corresponding step, instructing the worker in the correct work content and the content of the recovery work.
  • when the captured image matches the representative image of the corresponding step in step S309, the control unit 102 of the wearable terminal 100 recognizes that the work is being performed in the correct order (step S310). In this case, the control unit 102 of the wearable terminal 100 notifies the worker, via the display unit 107 or the voice output unit 108, that the work is being performed in the correct order (step S311).
  • control unit 102 of the wearable terminal 100 confirms whether or not the variable indicating the number of steps matches the final number of steps (step S312).
  • if it does not match the final step number (“not the last step” in step S312), the control unit 102 of the wearable terminal 100 returns to step S308 and continues collating the captured image with the representative image of the corresponding step number in the process image DB 111.
  • if it matches the final step number, the control unit 102 of the wearable terminal 100 recognizes that all steps of the corresponding work No. have been correctly performed (step S313).
  • the control unit 102 of the wearable terminal 100 notifies the worker that the work has been completed in the correct order via the display unit 107 or the audio output unit 108 (step S314). This completes the progress management of the series of tasks.
  • in addition, the control unit 102 of the wearable terminal 100 compares the elapsed time, at the moment a match between a new image and the representative image of the next step is confirmed in step S309, with the standard time and the maximum allowable time of the previous step. According to the result of the comparison, the control unit 102 of the wearable terminal 100 confirms whether the previous step was delayed and updates the contents of the delay determination field of the process image DB. When a delay is detected, the control unit 102 of the wearable terminal 100 notifies the worker of the work delay.
  • furthermore, the control unit 102 of the wearable terminal 100 checks at a predetermined cycle whether the condition defined in the warning activation condition field of the process image DB 111 of FIG. 9 is satisfied.
  • when the condition is satisfied, the control unit 102 of the wearable terminal 100 warns the worker using the content set in the warning information field.
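  • gathering the above, the following sketch mirrors the flow of FIG. 10 in simplified form; it is an illustration only, `capture`, `matches`, and `notify` are assumed helpers, and the timing and warning handling described above are omitted:

```python
def manage_progress(entries, capture, matches, notify):
    """Simplified walk through the FIG. 10 flow (steps S303-S318)."""
    # Wait for the work to start: capture until the image matches the
    # representative image of step 1 (S304-S305).
    while not matches(capture(), entries[0].representative_image):
        pass  # work not started yet; keep capturing
    notify("Work started: step 1 confirmed.")
    step = 1  # S307: move on to the next step
    while step < len(entries):
        image = capture()  # S308
        if matches(image, entries[step].representative_image):  # S309
            notify(f"Step {step + 1} done in the correct order.")  # S310-S311
            step += 1  # S312 -> S307, or exit the loop after the last step
        else:
            # S316: collate with the representative images of steps ahead
            # of the current one; a match means steps were skipped.
            for later in range(step + 1, len(entries)):
                if matches(image, entries[later].representative_image):
                    notify("Work order error detected.")  # S317-S318
                    return False
    notify("All steps completed in the correct order.")  # S313-S314
    return True
```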
  • with the wearable terminal 100 and the server 200 of the present embodiment described above, it is possible to observe in real time the work performed by the worker and to confirm whether the work is being performed in the correct order. Furthermore, in the present embodiment, it is possible to provide the worker with the information needed in each step and to warn of work delays. This makes it possible to save labor in, or automate, the management of the progress of the work process.
  • Step 1: Removing the cover
  • as the representative image, an image of the entire cover is used.
  • an image of a cover to which a marker has been added in advance may also be used.
  • alternatively, an image from which the outer shape and size of the cover can be recognized (taking the imaging magnification into consideration) may be prepared. A match with this image detects the start of the work.
  • Step 2: Fixing the cover. The cover is assumed to be fixed with a plurality of removable bolts (that is, male screws).
  • using the image of the cover after the bolts have been fastened as the representative image, the positions of these bolts are stored in advance; the cover portion is detected from the acquired captured image, and the relative position of each bolt is determined from the position of the whole cover.
  • an image around each position is cut out, and it is judged whether the bolt-specific color (for example, the color of zinc plating) occupies an area of a predetermined range corresponding to the size of the bolt head in the cut-out image. In this way, a method of determining whether there is a bolt at each position can also be employed. If the cover is attached with bolts at four places, for example its four corners, the presence of a bolt in each of those areas is detected, and when bolts are judged to be present at all places, it is determined that the cover has been correctly attached and fixed.
  • if there is a prescribed mounting order of the bolts in step 2 (for example, tightening in the order of upper right, lower left (diagonally opposite the upper right), upper left, lower right (diagonally opposite the upper left)), step 2 may be subdivided. For example, for each bolt tightened, a representative image in which the number of imaged bolts increases by one at the position of the bolt attached in that order is prepared in advance, and the captured image can be collated with the representative images in the bolt-tightening order. If the images match in that order, it can be determined that the bolts have been tightened in the defined order (a sketch of the color-based bolt check follows below).
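  • as one possible reading of the bolt check described in step 2 (the color range and region size are placeholders, not values from the patent), a sketch using OpenCV and NumPy follows:

```python
import cv2
import numpy as np

# Placeholder HSV range for a zinc-plated bolt head (an assumption).
BOLT_COLOR_LO = np.array([0, 0, 140], dtype=np.uint8)
BOLT_COLOR_HI = np.array([180, 60, 255], dtype=np.uint8)

def bolts_present(captured_bgr, bolt_positions, half=15, min_ratio=0.4):
    """Return True if a bolt-colored area is found at every stored position.

    bolt_positions: (x, y) pixel coordinates of each bolt relative to the
    detected cover, stored in advance as described above.
    """
    hsv = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2HSV)
    for x, y in bolt_positions:
        patch = hsv[max(y - half, 0):y + half, max(x - half, 0):x + half]
        mask = cv2.inRange(patch, BOLT_COLOR_LO, BOLT_COLOR_HI)
        if mask.size == 0 or np.count_nonzero(mask) / mask.size < min_ratio:
            return False  # no sufficiently large bolt-colored area here
    return True
```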
  • Step 3: Withdrawal of the cable
  • for the terminal, a representative image capturing the state in which the cable end is inserted and a representative image capturing the state without the cable are prepared. Then, when the image obtained from the image input unit 104 changes from the image with the cable end inserted to the image without the cable, it can be determined that the cable has been pulled out in accordance with the work procedure.
  • alternatively, an image of the device side of the removed connector may be used as the representative image, and it may be determined from a match with the image acquired from the image input unit 104 that the cable has been correctly removed.
  • Step 4: Operation of the switch
  • when the operating part of a switch, toggle switch, or the like changes in appearance according to its on/off state, or when the color of part of the switch changes when it is turned on or off, an image of the on state and an image of the off state are prepared as representative images. If the image acquired from the image input unit 104 matches the on-state image after having matched the off-state image, it can be determined that the switch has been correctly operated. Likewise, when a series of work ends with turning the switch off, the completion of the work can be confirmed by this process.
  • FIG. 11 is a diagram showing the configuration of the wearable terminal 100b according to the second embodiment of this invention.
  • the difference in configuration from the wearable terminal 100 of the first embodiment shown in FIG. 6 is that the work information DB 113 is arranged in the storage unit 110a.
  • since the wearable terminal 100b of the present embodiment includes the work information DB 113, the work No. can be identified by the wearable terminal 100b alone. In addition, the data necessary for the work performed by the worker is stored in advance in the process image DB 111 of the wearable terminal 100b. The wearable terminal 100b of the present embodiment then reads and sets the data corresponding to the specified work No. from the process image DB 111. The subsequent operation is the same as step S303 and the subsequent steps of the first embodiment shown in FIG. 10.
  • according to the present embodiment, the work of the worker can be checked by the wearable terminal 100b alone; the embodiment is therefore particularly suitable for work performed in places where communication infrastructure is not available.
  • a sensor that senses the movement of the wearer's eye can be attached to the inside of the smart glass or the headset terminal.
  • a wearable terminal provided with such a sensor detects the eye-narrowing motion that the worker makes when the image displayed on the display unit 107 is difficult to see.
  • upon detecting this, the wearable terminal zooms the input image obtained from the image input unit 104 and displays it enlarged on the display unit 107. Whether the eyes are being narrowed can be determined based on whether the amount of reflected light obtained from the sensor has remained within a predetermined range for a predetermined time or more.
  • the irradiated light is preferably invisible light (preferably near-infrared light), and its intensity is preferably in a range that does not affect the eye or retina.
  • a terminal such as the wearable terminal described above may also be configured to obtain information such as the imaging direction, the size of the object in the image range, and the tilt of the target device.
  • this makes it possible to perform pre-processing such as correcting the tilt of the image or changing its size.
  • [Matching process with representative image] A modification of the process, performed in steps S305, S309, and S316 of FIG. 10, of determining whether the image acquired by the image input unit 104 and the representative image are the same will now be described.
  • the match determination between the image captured from the image input unit 104 and the representative image can be performed using what is called image matching processing, in which feature points in the images are extracted and matched against each other.
  • in this case, it is preferable that the image captured from the image input unit 104 and the representative image be adjusted so as to be imaged at predetermined angles and sizes.
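  • as one common realization of such feature-point matching (the patent does not name a specific algorithm; the detector choice and thresholds below are assumptions), an OpenCV sketch might look like this:

```python
import cv2

def good_match_count(captured_gray, representative_gray, max_dist=40):
    """Count strong ORB feature matches between two grayscale images."""
    orb = cv2.ORB_create()
    _, des1 = orb.detectAndCompute(captured_gray, None)
    _, des2 = orb.detectAndCompute(representative_gray, None)
    if des1 is None or des2 is None:
        return 0  # too few features detected in one of the images
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return sum(1 for m in matcher.match(des1, des2) if m.distance < max_dist)

def images_match(captured_gray, representative_gray, min_good=25):
    """Declare a match when enough feature points agree (threshold assumed)."""
    return good_match_count(captured_gray, representative_gray) >= min_good
```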
  • as the representative image, it is preferable to use an image captured by the camera of the same wearable terminal as the wearable terminal 100 worn by the worker.
  • it is also preferable that the representative image be not an image that could be detected in another step, but an image obtained only in that step.
  • for example, in a circuit board replacement work, the captured image of the circuit board can be used as a representative image.
  • the outer shape may be recognized to perform match detection, or match detection may be performed using a predetermined shape or color (a color different from the background color, so that the object stands out from the background).
  • a plurality of markers (at least three points; with three points, the position, size, and angle of the image can be detected) may be provided in advance on both the work target object and the representative image.
  • these markers may be used as feature points.
  • alternatively, binarization may be performed based on the difference in color from the background of the comparison target image, and the positions of the markers extracted to perform matching.
  • for the matching, the position (in the X and Y directions, along axes perpendicular to each other in the plane perpendicular to the imaging direction), the posture (the rotation angle θz about the axis perpendicular to the imaging plane), and the size may be extracted and compared.
  • in addition, the degree of coincidence of the pattern of each part relative to the reference position may be calculated, and the final identity judgment may be made using this degree of coincidence. More specifically, after the angle and size have been aligned, the whole images are normalized so that their average luminances agree, the per-pixel luminance difference between the image and the comparison image is computed, and identity can be judged by the area of the parts whose luminance difference exceeds a predetermined value.
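  • the luminance-difference method just described might look like the following NumPy sketch, assuming the two images have already been aligned in angle and size; the thresholds are illustrative:

```python
import numpy as np

def same_by_luminance(img_a, img_b, diff_thresh=30, area_ratio=0.05):
    """Judge identity by the area of large per-pixel luminance differences.

    img_a, img_b: grayscale images of identical shape, already aligned
    in angle and size as described above.
    """
    a = img_a.astype(np.float32)
    b = img_b.astype(np.float32)
    b += a.mean() - b.mean()  # normalize so the average luminances agree
    differing = np.count_nonzero(np.abs(a - b) > diff_thresh)
    # The images are judged identical if the strongly differing area is small.
    return differing / a.size <= area_ratio
```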
  • in order to reduce the amount of calculation, matching may be performed between images binarized with a predetermined threshold.
  • alternatively, a partial area of the image captured from the image input unit 104 may be shifted by a predetermined step, with matching processing performed at each shift, and whether the representative image is present in the image captured from the image input unit 104 may be detected according to whether a predetermined degree of match or more is obtained.
  • furthermore, the magnification may be changed in predetermined steps from unity to a predetermined magnification, repeating the judgment of whether there is a region in the acquired image that matches the representative image. Thereby, a match between the images can be detected even if the size of the image changes.
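  • the sliding-window search with stepwise magnification could be sketched as follows using OpenCV template matching; the scale step and match threshold are placeholders:

```python
import cv2

def find_representative(captured_gray, rep_gray,
                        max_scale=2.0, scale_step=0.25, min_score=0.8):
    """Search for the representative image over a range of magnifications."""
    scale = 1.0
    while scale <= max_scale:
        templ = cv2.resize(rep_gray, None, fx=scale, fy=scale)
        if (templ.shape[0] > captured_gray.shape[0]
                or templ.shape[1] > captured_gray.shape[1]):
            break  # the template has grown larger than the captured image
        # matchTemplate slides the template over the whole image internally.
        scores = cv2.matchTemplate(captured_gray, templ, cv2.TM_CCOEFF_NORMED)
        _, best, _, location = cv2.minMaxLoc(scores)
        if best >= min_score:
            return location, scale  # top-left corner and matching scale
        scale += scale_step
    return None  # no region matched the representative image
```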
  • in addition, an image including the part to be removed may be captured at relatively short time intervals, that is, at time intervals at which the detection target does not move significantly within the image.
  • the storage unit of the work support apparatus described above may hold representative images in association with each of a plurality of types of work for which the worker is responsible.
  • furthermore, the confirmation unit may be configured to receive, from the worker, input of information on the work to be started, specify the work to be started based on the input information, and confirm whether the specified work is performed in a predetermined order.
  • the storage unit of the work support apparatus described above may hold, in addition to the representative images, an error image representing the situation when an erroneous work has been performed. Furthermore, the confirmation unit may collate the input image with the error image to check whether an erroneous work has been performed, and the notification unit may notify the worker that an error has occurred when the confirmation unit determines that an erroneous work has been performed.
  • the storage unit of the work support apparatus described above may hold the required time from the first step to the second step in association with the representative images of the first and second steps. Furthermore, the confirmation unit may check whether a delay has occurred in the work from the first step to the second step, and the notification unit may notify the worker of the presence or absence of the work delay.
  • the storage unit of the work support apparatus described above may further hold a representative image of a third step performed after the second step. The confirmation unit may determine that an order error has occurred in the work if the input image matches the representative image of the third step after the first step, and the notification unit may notify the worker that an order error has occurred in the work.
  • the storage unit of the work support apparatus described above may further hold representative images of steps performed after the second step.
  • the confirmation unit may determine that an order error has occurred in the work if, after the first step, the input image matches any of the representative images of the steps performed after the second step.
  • the notification unit may notify the worker that an order error has occurred in the work.
  • An image captured by a camera of a wearable terminal worn by the worker can be used as the image in the process in which the worker is performing work.
  • As the representative image an image captured by a camera of the same wearable terminal as the wearable terminal worn by the worker can be used.
  • the work support apparatus described above can adopt a configuration including a work support apparatus (wearable terminal) that receives input of information on the work to be started from the worker, and a server that specifies the work to be started based on the information input from the worker and provides data corresponding to the specified work to the work support apparatus (wearable terminal).
  • [Tenth embodiment] (refer to the work support method according to the second aspect above)
  • [Eleventh embodiment] (refer to the program according to the third aspect above)
  • the tenth and eleventh embodiments described above can be expanded into the second to ninth embodiments in the same manner as the first embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Quality & Reliability (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Human Computer Interaction (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • General Factory Administration (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The objective of the present invention is to notify a worker of the occurrence of work process errors and the like with high precision, thereby improving the work efficiency of the worker. This work assistance device comprises: a storage unit for storing, in association with each other, work that includes at least a first process and a second process carried out after the first process, and process-specific representative images representing the work carried out respectively in the first process and the second process; an input unit for inputting an image of the worker in the process of carrying out the work; a confirmation unit for comparing the input image and the representative images and confirming whether the work is being carried out in a prescribed sequence; and a reporting unit for reporting the result of the confirmation to the worker.

Description

Work support apparatus, work support method, and program
(Description of related application)
The present invention is based upon and claims the benefit of the priority of Japanese Patent Application No. 2017-209179 (filed on October 30, 2017), the entire contents of which are incorporated herein by reference.
The present invention relates to a work support apparatus, a work support method, and a program.
Patent Document 1 discloses a maintenance support system that can reduce the burden on workers and save labor in the maintenance and inspection of field equipment such as power stations and substations. The document describes providing a server 10 having a plurality of databases holding information on maintenance and inspection work. Meanwhile, the portable terminal 6 carried by the worker has a function of photographing, with the camera 2 attached to the helmet 1 worn by the worker, the progress of each predetermined step of the maintenance work and transmitting it to the server 10. The server 10 then extracts the data necessary for the work from the databases according to the progress photographed by the camera 2 and provides the information necessary for the work on the display 5 attached to the helmet 1 worn by the worker.
Patent Document 2 discloses a work management system said to be able to analyze and evaluate the work state, the situation at the work site, and the like without relying on a supervisor. According to this document, the input unit 1 of the work management system captures video of a highly skilled worker carrying out the work, converts the moving image into a pattern using a known image-processing method, and stores it in a storage unit 3. The detection unit 2 of the work management system likewise captures video of an ordinary worker carrying out the work, converts the moving image into a pattern using a known image-processing method, and then stores it in the storage unit 3 or feeds it to a central processing unit 4. The central processing unit 4 of the work management system then compares the two moving images to determine the difference in work speed between the ordinary worker and the highly skilled worker, and any errors in the work procedure.
Patent Document 1: JP 2005-216137 A
Patent Document 2: JP 2003-167613 A
The following analysis is given by the present invention. When workers are dispatched to a site to carry out maintenance and servicing work, work process errors sometimes occur. According to the inventor's analysis, many of these errors stem from the worker's misrecognition, forgetting, oversight, unfounded assumptions, and the like, and such an error can be judged merely by checking specific scenes within the work. On this point, Patent Document 1 goes no further than the comprehensive statement that the main control unit 11 of the server 10 "serves as work quality judgment means that judges, from the progress of the maintenance work transmitted from the portable terminal 6 for each predetermined maintenance step, whether the maintenance work of that step has been done properly."
The method of Patent Document 2 photographs the work video of a highly skilled worker in advance and compares it with the work video of an ordinary worker, so the evaluation of the ordinary worker's work depends entirely on the skilled worker's video. Moreover, while the method of Patent Document 2 can be used for desk work such as that shown in Fig. 2 of that document, it is considered that it may not be usable for checking visit-type or dispatch-type work, where the work-site environment can differ on every occasion.
An object of the present invention is to provide a work support apparatus, a work support method, and a program that can accurately notify a worker of the occurrence of a work process error or the like, and thereby contribute to improving the worker's efficiency.
According to a first aspect, there is provided a work support apparatus comprising: a storage unit that holds, in association with each other, a work including at least a first step and a second step performed after the first step, and per-step representative images representing the work performed in each of the first and second steps; an input unit that inputs images captured while the worker is performing the work; a confirmation unit that collates the input images with the representative images to confirm whether the work is being performed in a predetermined order; and a notification unit that notifies the worker of the result of the confirmation.
According to a second aspect, there is provided a work support method in which a computer, having access to means for holding, in association with each other, a work including at least a first step and a second step performed after the first step and per-step representative images representing the work performed in the first and second steps, performs: a step of inputting images captured while the worker is performing the work; a step of collating the input images with the representative images to confirm whether the work is being performed in a predetermined order; and a step of notifying the worker of the result of the confirmation. This method is tied to a particular machine, namely a computer that functions as a work support apparatus.
According to a third aspect, there is provided a program that causes a computer, having access to means for holding, in association with each other, a work including at least a first step and a second step performed after the first step and per-step representative images representing the work performed in the first and second steps, to execute: a process of inputting images captured while the worker is performing the work; a process of collating the input images with the representative images to confirm whether the work is being performed in a predetermined order; and a process of notifying the worker of the result of the confirmation. This program can be recorded on a computer-readable (non-transitory) storage medium; that is, the present invention can also be embodied as a computer program product.
According to the present invention, it becomes possible to accurately notify a worker of the occurrence of a work process error or the like and to improve the worker's efficiency. That is, the present invention transforms the work support apparatus described in the background art into one whose capability to check for work process errors is dramatically improved.
Fig. 1 is a diagram showing the configuration of one embodiment of the present invention.
Fig. 2 is a diagram showing another configuration example of one embodiment of the present invention.
Fig. 3 is a diagram for explaining the information prepared in the storage unit of the work support apparatus of one embodiment of the present invention.
Fig. 4 is a diagram for explaining the operation of one embodiment of the present invention.
Fig. 5 is a diagram showing the configuration of the first embodiment of the present invention.
Fig. 6 is a diagram showing the configuration of the wearable terminal used in the first embodiment of the present invention.
Fig. 7 is a diagram showing the configuration of the server of the first embodiment of the present invention.
Fig. 8 is a diagram showing an example of the information held in the work information database (work information DB) of the server of the first embodiment of the present invention.
Fig. 9 is a diagram showing an example of the information held in the process image databases (process image DBs) of the wearable terminal and the server of the first embodiment of the present invention.
Fig. 10 is a flowchart showing the operation of the wearable terminal of the first embodiment of the present invention.
Fig. 11 is a diagram showing the configuration of the second embodiment of the present invention.
First, an outline of one embodiment of the present invention will be described with reference to the drawings. The drawing reference signs appended in this outline are attached to the respective elements for convenience, as an aid to understanding, and are not intended to limit the present invention to the illustrated modes. The connection lines between blocks in the drawings referred to in the following description include both bidirectional and unidirectional lines; one-way arrows schematically indicate the flow of the main signals (data) and do not exclude bidirectionality. Although there are ports or interfaces at the input/output connection points of each block in the drawings, they are not shown.
In one embodiment, as shown in Fig. 1, the present invention can be realized by a work support apparatus 10a comprising a storage unit 11a, an input unit 12a, a confirmation unit 13a, and a notification unit 14a. The storage unit 11a holds, in association with each other, a "work" including at least a first step and a second step performed after the first step, and per-step representative images representing the work performed in each of the first and second steps.
The input unit 12a inputs images captured while the worker is performing the selected work.
The confirmation unit 13a collates the input images with the representative images to confirm whether the work is being performed in a predetermined order.
The notification unit 14a notifies the worker of the result of the confirmation.
The work support apparatus 10a shown in Fig. 1 can also be realized by a computer 10b as shown in Fig. 2. In this case, the memory 11b functions as the storage unit 11a described above, and the input means 12b functions as the input unit 12a. The memory 11b also includes auxiliary storage devices such as various disk drives. The input means 12b may be a camera that captures images while the work is in progress, an input interface that receives images from such a camera, or the like. The confirmation unit 13a and the notification unit 14a can be realized by a computer program that causes the processor 13b to perform the confirmation processing of the confirmation unit 13a described above and to output (notify) the result from the output means 14b.
Here, the operation of this embodiment will be described taking, as an example, support for installing a certain application program on a computer. Assume that this application program requires that an older version of the program be uninstalled (the first step) before the installation step (the second step). Fig. 3 shows an example of the representative images stored in the storage unit 11a in this case. In the example of Fig. 3, an image showing the sub-window displayed on the monitor during uninstallation is stored as the representative image of the first step. Similarly, an image showing the sub-window displayed on the monitor during installation is stored as the representative image of the second step.
When an image captured while the worker is performing the selected work is input to the input unit 12a, for example via a camera, the confirmation unit 13a collates the input image with the representative images to confirm whether the work is being performed in the predetermined order. For example, when an input image matches the representative image of the first step, the confirmation unit 13a determines that the work has been started correctly. When an image matching the representative image of the second step is subsequently input, the confirmation unit 13a determines that the work has been performed correctly. Here, the confirmation unit 13a may cause the notification unit 14a to notify the worker that the work is being performed correctly. In this way, the worker can confirm that the application installation work has been executed correctly. Even if a problem later arises in the operation of the application program, the worker, having the information that the installation itself was done correctly, can recognize that the problem lies elsewhere.
On the other hand, if an input image matches the representative image of the second step before any image has matched the representative image of the first step, the confirmation unit 13a determines that the second step was performed before the first step. In this case, the confirmation unit 13a preferably causes the notification unit 14a to notify the worker that the work is not being performed correctly. In this way, the worker can recognize that the application installation was done incorrectly, and can redo the installation before moving on to the next work or step. At that time, the confirmation unit 13a may also instruct the notification unit 14a to notify the worker of the specific corrective work to perform. For example, in the case of Fig. 4, the confirmation unit 13a could notify the worker that, after uninstalling the V1.1 application and the V1.0 application, V1.1 should be installed.
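To make the order check concrete, the following is a minimal Python sketch of the confirmation logic described above; it is an illustration only, and the helper images_match() (standing in for whatever image-comparison method is used) and the return conventions are assumptions of this sketch, not part of the disclosure.

    def images_match(captured, representative):
        # Stub: in practice an image-similarity test (e.g. template
        # matching) would be used; exact equality keeps the sketch
        # self-contained.
        return captured == representative

    def check_step(captured, representative_images, current_step):
        # Collate one captured image against per-step representative
        # images (index 0 is the first step). Returns ("confirmed", step)
        # when the expected step is seen, ("order_error", step) when a
        # later step's image is seen first, or (None, None) otherwise.
        if images_match(captured, representative_images[current_step]):
            return "confirmed", current_step
        for later in range(current_step + 1, len(representative_images)):
            if images_match(captured, representative_images[later]):
                return "order_error", later
        return None, None

In the Fig. 4 scenario, a frame matching the installation (second-step) image while current_step is still 0 would return ("order_error", 1), triggering the corrective notification described above.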
[First Embodiment]
Next, a first embodiment of the present invention, which uses a wearable terminal as the work support apparatus, will be described in detail with reference to the drawings. Fig. 5 shows the configuration of the work support apparatus of the first embodiment of the present invention. Referring to Fig. 5, a glasses-type wearable terminal (worker-worn terminal) 100 worn by a worker 100a and a server 200 are connected via a network 101. In the following embodiment, a wearable terminal is used as the input device for work images, but the input device for work images is not limited to a wearable terminal; for example, a smartphone or another business terminal with a camera can also be used. The camera itself may also be a separate unit, with the images it transmits sent to the work support apparatus via the wearable terminal, smartphone, business terminal, or the like. As the glasses-type wearable terminal, devices known as smart glasses or headset-type terminals can be used.
Various connection forms (communication methods) can be used to connect the wearable terminal 100 and the server 200 to the network 101. For example, connection via a mobile communication network or via a wireless LAN (Local Area Network) is conceivable. The wearable terminal 100 and the server 200 may also connect to the network 101 via short-range wireless communication such as Bluetooth (registered trademark) or infrared communication. Of course, a method in which the wearable terminal 100 and the server 200 connect to the network 101 via wired communication such as a wired LAN can also be adopted.
Fig. 6 shows the configuration of the wearable terminal of the first embodiment of the present invention. Referring to Fig. 6, the wearable terminal 100 comprises a control unit 102, a communication unit 103, an image input unit 104, a voice input unit 105, an input unit 106, a display unit 107, a voice output unit 108, a clock unit 109, and a storage unit 110.
The communication unit 103 performs data communication between the wearable terminal 100 and the server 200 via the network 101.
The image input unit 104 can be constituted by, for example, an imaging device (camera). Specifically, in response to instructions from the control unit 102, the image input unit 104 captures images of the area the worker is viewing (these may be a moving image, or still images captured at a predetermined interval, which may be longer than the frame interval of a moving image). The image input unit 104 stores the captured images in the process image DB 111 of the storage unit 110. The image input unit 104 need not itself be an imaging device (camera); it may instead be an interface to which an imaging device (camera) or the like can be connected.
The voice input unit 105 can be constituted by a microphone. Voice instructions from the worker may be received via the voice input unit 105 to control the operation of the wearable terminal 100. Sounds emitted by the work target, such as a specific device, and sounds around the worker may also be captured from the voice input unit 105 to confirm whether the work is being performed correctly.
The input unit 106 is constituted by information input means such as a touch panel, slide pad, mouse, or keyboard. Instructions may be received from the worker via the input unit 106 to control the operation of the wearable terminal 100. Information the worker enters via the input unit 106 (for example, correction data for the process image DB) may also be stored in the process image DB 111 or elsewhere in the storage unit 110. The voice input unit 105 and the input unit 106 can be omitted as appropriate; for example, if instructions from the worker are received by having the image input unit 104 recognize the worker's gestures, the voice input unit 105 and the input unit 106 can be omitted.
The input of information such as images and voice through the image input unit 104, the voice input unit 105, and the input unit 106 described above is performed at timings instructed by the control unit 102.
The display unit 107 is constituted by a liquid crystal display or an organic EL (Electro Luminescence) display that displays output images (which may be moving images) in response to instructions from the control unit 102.
The voice output unit 108 outputs voice in response to instructions from the control unit 102. For example, when it is necessary to capture the operating sound produced by an operation performed on the target device, the control unit 102 may cause the display unit 107 or the voice output unit 108 to output a message (voice guidance or the like) prompting the worker to perform the necessary operation. Instead of the voice output unit 108, a vibration motor or other vibration-generating device for alerting the worker, or a lamp, can also be used as the output means.
The clock unit 109 measures the time required (elapsed time) from the start of the work up to each step. By comparing the measured time with the standard time or the maximum allowable time, it is possible to grasp whether the work is delayed, whether trouble has occurred, and so on.
The control unit 102 controls each of the units of the wearable terminal 100 described above, collates images input from the image input unit 104 with the representative images and error images, and manages the progress of the work (see the flowchart of Fig. 10). The control unit 102 of this embodiment also manages the progress of the work by comparing the time required for each step (elapsed time) with separately defined standard times and maximum allowable times. Furthermore, the control unit 102 of this embodiment has a function of outputting a warning to the worker when a predetermined warning activation condition is satisfied. Such a control unit 102 can also be realized by reading a program 112 for controlling each unit of the wearable terminal 100 from the storage unit 110 and having a processor execute it.
The storage unit 110 functions as a process image database (process image DB) 111. Process image information received from the server 200 is registered in the process image DB 111. The details of the process image information will be described later.
Fig. 7 shows the configuration of the server 200 of the first embodiment of the present invention. Referring to Fig. 7, the server 200 comprises a control unit 202, a communication unit 203, an image input unit 204, a voice input unit 205, an input unit 206, a display unit 207, a voice output unit 208, and a storage unit 209.
The communication unit 203 performs data communication between the wearable terminal 100 and the server 200 via the network 101.
The image input unit 204 is an interface that receives, from the wearable terminal 100 side, images of the area the worker is viewing (a moving image, or still images captured at a predetermined interval, which may be longer than the frame interval of a moving image).
The voice input unit 205 is an interface that receives voice data from the wearable terminal 100 side. Voice instructions from the worker may be received via the voice input unit 205 to control the operation of the server 200. Sounds emitted by the work target, such as a specific device, and sounds around the worker may also be captured from the voice input unit 205 to confirm whether the work is being performed correctly.
The input unit 206 is constituted by information input means such as a touch panel, slide pad, mouse, or keyboard. Instructions may be received via the input unit 206 to control the operation of the server 200. Information entered via the input unit 206 (for example, correction data for the process image DB) may also be stored in the process image DB 211 or elsewhere in the storage unit 209.
The display unit 207 is constituted by a liquid crystal display or an organic EL display that outputs messages and the like needed by the operator on the server 200 side and displays input/output contents.
In response to instructions from the control unit 202, the voice output unit 208 outputs messages (voice guidance or the like) prompting the operator on the server side to perform necessary operations. Instead of the voice output unit 208, a lamp or the like that draws the operator's attention with light can also be used as the output means.
The control unit 202 controls each of the units of the server 200 described above. The control unit 202 can also be realized by reading a program (not shown) for controlling each unit of the server 200 from the storage unit 209 and having a processor execute it. When images input from the image input unit 104 are transmitted from the wearable terminal 100 to the server 200 side, the control unit 202 may be made to perform the work progress management (see the flowchart of Fig. 10). Specifically, the control unit 202 collates the images received from the wearable terminal 100 with the representative images and error images, and manages the progress of the work. The control unit 202 may also record the elapsed time from the start of the work and judge whether the work is delayed by comparison with a separately defined standard required time.
Here, the information held in the storage unit 209 of the server 200 will be described. Work information and process image information are registered in the storage unit 209, which functions as a work information database (work information DB) 210 and a process image database (process image DB) 211, respectively.
Fig. 8 shows an example of the information held in the work information DB 210. Referring to Fig. 8, each entry associates a work No., uniquely assigned to a "work" specified by a device name, work type, and fault content, with those items. The work information DB 210 is used to identify the work No. from the entered work details when a transmission request for process images is received from the wearable terminal 100. For example, when a worker repairs a personal computer that does not start up, that work is identified as work #1. The contents of the work are not limited to the example of Fig. 8; information for specifying the various "works" to which the present invention is applicable may be registered. For example, in addition to the examples in Fig. 8, device removal work and periodic inspection work may be registered in the work information DB 210. As another example, for an automobile oil change, the device field would be set to information indicating the vehicle model name and model number, and the work type field would be set to oil change (the fault content field may be left blank, as with the installation of server 4 in #4 of Fig. 8, or may be set to, for example, whether the oil filter also needs replacement).
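As a purely hypothetical illustration of this lookup, the sketch below models the Fig. 8 table as a Python mapping from (device name, work type, fault content) to a work No.; the key strings are invented stand-ins for the entries discussed in the text.

    # Invented stand-ins for Fig. 8 entries; a blank fault content is
    # represented by an empty string, as with the server installation.
    WORK_INFO_DB = {
        ("PC 1", "failure repair", "does not start"): 1,
        ("printer 2", "failure repair", "paper jam"): 2,
        ("server 4", "installation", ""): 4,
    }

    def find_work_no(device, work_type, fault=""):
        # Resolve the work No. for a process-image request, or None.
        return WORK_INFO_DB.get((device, work_type, fault))

    assert find_work_no("printer 2", "failure repair", "paper jam") == 2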
Fig. 9 shows an example of the information held in the process image DB 211. Referring to Fig. 9, the representative image of each step of the "work" of work No. 2 is registered in association with the information used, together with these representative images, to manage the progress of the steps. The example of Fig. 9 registers the information used when performing the failure repair (paper jam) of printer 2, which is identified as work No. 2 in the work information DB of Fig. 8. Fig. 9 also shows a state in which, of the six steps constituting work No. 2, the work has proceeded normally up to step 3 and a work error has been detected at step 4. Each item of Fig. 9 is described below.
The "representative image" field is a field for registering an image used to confirm, for each step, that the work has been performed correctly. To make image matching easy for the wearable terminal 100, this representative image is preferably one that captures the area viewed by the worker (who may be, for example, a maintenance worker of the device), strictly speaking the same area as the images expected to be input to the wearable terminal 100. More desirably, the angle, size, number of pixels, and so on of the representative image are set to be the same as those of the images input to the wearable terminal 100.
To prevent erroneous judgments, the representative image is preferably an image obtainable only in its own step, not one that could also be captured in other steps. The representative image may also be an image matched against only a part of the image expected to be input to the wearable terminal 100. Furthermore, the angle, size, number of pixels, and so on of the expected input image and the representative image do not necessarily have to match: for example, a method can be adopted in which one of the images is tilted by coordinate transformation, or its size or pixel count is converted, before the two are collated.
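One concrete way to realize such partial matching is normalized template matching; the sketch below uses OpenCV as an assumed implementation choice (the disclosure names no library), and the similarity threshold is an arbitrary example value.

    import cv2

    def matches_representative(captured_bgr, representative_bgr,
                               threshold=0.8):
        # Slide the representative image over the captured frame with
        # normalized cross-correlation, so the representative image may
        # correspond to only a part of the input image.
        frame = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
        template = cv2.cvtColor(representative_bgr, cv2.COLOR_BGR2GRAY)
        scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        return scores.max() >= threshold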
Examples of representative images include an external-appearance image of a hard disk drive in hard disk replacement work, and external-appearance images of circuit boards or mechanical parts, as mounted in the target device, in replacement work for such parts. Another example is an image showing the appearance and connection state of the externally exposed connectors of the target device in device connection work.
The "captured image" field of Fig. 9 is a field in which, among the images input from the image input unit 104 of the wearable terminal 100, those that have matched a representative image or an error image (described later) are registered. In the example of Fig. 9, the "captured image" field is provided in the process image DB 211 on the server 200 side, but when matching against the representative images is performed only on the wearable terminal 100 side, the "captured image" field of the server-side process image DB 211 can be omitted. As noted above, in the example of Fig. 9, steps 1 to 3 have been confirmed, so the images judged to match the representative images are registered in the "captured image" field. In step 4 of Fig. 9, a match with the error image was detected, so the image judged to match the error image is registered in the "captured image" field.
The "execution status" field of Fig. 9 is a field that records, for each step, the progress confirmed by the control unit 102 of the wearable terminal 100 or the control unit 202 of the server 200 through matching against the representative images. In the example of Fig. 9, the "execution status" field is provided in the process image DB 211 on the server 200 side, but when matching against the representative images is performed only on the wearable terminal 100 side, the "execution status" field of the server-side process image DB 211 can be omitted. As noted above, in the example of Fig. 9, steps 1 to 3 have been confirmed, so information such as "work started" and "step confirmed" is registered in the "execution status" field. In step 4 of Fig. 9, a match with the error image was detected, so "error detected" is registered. Since step 5 and later have not been performed, the initial-state information "not performed" is registered for them.
The "procedure information" field of Fig. 9 is a field in which the work contents and the like of the step following each step are registered. The contents of the "procedure information" field are conveyed to the worker via the display unit 107 or the voice output unit 108 as needed.
The "notification information" field of Fig. 9 is a field in which information to be conveyed to the worker upon reaching each step is registered. For example, when a captured image matches the representative image and the work is judged to have started, the contents registered in the notification information field are conveyed to the worker. The difference from the procedure information is that while the procedure information is written to convey the work contents, the notification information carries cautions, communications, and reminders for preventing work errors in the next step. The procedure information and notification information described above can take the form of text, voice, motion, graphics, and so on. Text and graphics representing the procedure information and notification information may also be superimposed on the glasses-type screen of the wearable terminal 100 to support the worker's work.
The "error image" field is a field for registering, for each step, an image used to detect that a work error has been made. Like the representative image, this error image is preferably one that captures the area viewed by the worker (who may be, for example, a maintenance worker of the device), strictly speaking the same area as the images expected to be input to the wearable terminal 100, so that image matching is easy for the wearable terminal 100. More desirably, the angle, size, number of pixels, and so on of the error image are set to be the same as those of the images input to the wearable terminal 100.
To prevent erroneous judgments, the error image, too, is preferably an image obtainable only in its own step, not one that could also be captured in other steps. The error image may also be an image matched against only a part of the image expected to be input to the wearable terminal 100. Furthermore, the angle, size, number of pixels, and so on of the expected input image and the error image do not necessarily have to match: for example, a method can be adopted in which one of the images is tilted by coordinate transformation, or its size or pixel count is converted, before the two are collated.
The "standard time" field of Fig. 9 is a field in which the standard working time between steps is registered. Comparing this standard time with the actual working time makes it possible to grasp the progress (degree of delay) of the work. The "maximum allowable time" field of Fig. 9 is a field in which the allowable time between steps is registered (that is, maximum allowable time > standard time). The "elapsed time" field of Fig. 9 is a field in which the actual time taken between steps is registered. In the example of Fig. 9, the standard time and maximum allowable time of step 6, the final step, are blank; this is because the work ends once a match with the representative image of this step is obtained, and no judgment is needed.
The "delay judgment" field of Fig. 9 is a field that stores the result of judging whether the progress of each step is ahead of or behind the standard required time. In the example of Fig. 9, when the elapsed time is shorter than the standard time, the judgment is "early"; for example, step 3 is judged "early" because its elapsed time is 8 minutes against a standard time of 15 minutes. When the elapsed time is longer than the maximum allowable time, the judgment is "delayed"; for example, steps 2 and 4 are judged "delayed" because their elapsed times exceed the maximum allowable time. When the elapsed time is shorter than the maximum allowable time but longer than the standard time, the judgment is "late"; for example, step 1 is judged "late" because its elapsed time is 8 minutes against a standard time of 5 minutes and a maximum allowable time of 10 minutes. These criteria are merely examples and can be changed as appropriate depending on the type and nature of the work being judged. For example, in addition to the per-step judgment shown in Fig. 9, a cumulative standard time and a cumulative maximum allowable time may be set and compared with the cumulative elapsed time to obtain the judgment result.
The relationship between the standard time and the maximum allowable time can also be set as appropriate according to the contents of the step. Most simply, the maximum allowable time may be the standard time plus a predetermined amount, or the standard time multiplied by a predetermined rate.
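The per-step judgment of Fig. 9 thus reduces to a three-way comparison; a minimal sketch follows, with the labels and example values taken from the figure and the function name invented for illustration.

    def judge_delay(elapsed_min, standard_min, max_allowed_min):
        if elapsed_min > max_allowed_min:
            return "delayed"   # exceeded the maximum allowable time
        if elapsed_min > standard_min:
            return "late"      # over standard, but within the allowance
        return "early"         # at or under the standard time

    # Step 1 of Fig. 9: standard 5 min, maximum 10 min, elapsed 8 min.
    assert judge_delay(8, 5, 10) == "late"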
At the timing when the "delay judgment" field of Fig. 9 is updated, the worker may be notified of the judgment result, the amount of delay, and so on by voice or image. This allows the worker to grasp whether, and by how much, the work is delayed.
The "warning information" field of Fig. 9 is a field that stores the warning contents conveyed to the worker when the condition set in the "warning activation condition" field is satisfied. For example, in Fig. 9, warning information and warning activation conditions are set for steps 2 and 4. This makes it possible to warn (notify) the worker when an error or delay occurs in a particularly important step of the work.
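A minimal sketch of this warning mechanism, assuming (hypothetically) that each step record pairs an optional condition predicate with a message:

    def check_warnings(step_records, notify):
        # Hypothetical record layout: warning_condition is a predicate
        # over the step's current state, warning_info is the message.
        for record in step_records:
            condition = record.get("warning_condition")
            if condition is not None and condition(record):
                notify(record["warning_info"])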
Next, the operation of this embodiment will be described in detail with reference to the drawings. Fig. 10 is a flowchart showing the operation of the wearable terminal of the first embodiment of the present invention. Referring to Fig. 10, the worker first enters, from the input unit 106 of the wearable terminal 100, the information for specifying the work shown in Fig. 8 (device name, work type, fault content, etc.) (step S301).
The control unit 102 of the wearable terminal 100 transmits the entered information to the server 200 and requests identification of the work No. and transmission of the process images corresponding to the identified work No. The server 200 identifies, from the work information DB 210, the work No. corresponding to the information received from the wearable terminal 100 (device name, work type, fault content, etc.). Next, the server 200 identifies the data corresponding to that work No. in the process image DB 211 and transmits it, together with the work No., to the wearable terminal 100. The wearable terminal 100 stores the information received from the server 200 in the process image DB 111 (step S302).
The control unit 102 of the wearable terminal 100 sets the variable indicating the step number to 1 and starts the matching operation against the input images (step S303).
The control unit 102 of the wearable terminal 100 captures an image of the area the worker is viewing from the image input unit 104 (step S304). The captured image is temporarily stored in the captured image field of the process image DB 111. The control unit 102 compares the captured image with the representative image of step 1 of the relevant work No. in the process image DB 111 (step S305). If the captured image does not match that representative image ("no match" in step S305), the control unit 102 judges that the work has not yet started, and accordingly continues capturing images of the area the worker is viewing from the image input unit 104 (back to step S304).
Various methods can be used for the image matching in step S305 and the subsequent steps; specific examples will be described in detail later.
On the other hand, when the captured image matches the representative image of step 1 of the relevant work No. in the process image DB 111 ("match" in step S305), the control unit 102 of the wearable terminal 100 recognizes that the work has started (step S306). At this point, for example, the execution status of step No. 1 in Fig. 9 becomes "work started", and the clock unit 109 starts counting the elapsed time.
The control unit 102 of the wearable terminal 100 increments the variable indicating the step number by 1 and starts the matching operation between the input images and the next step (step S307).
The control unit 102 of the wearable terminal 100 captures an image of the area the worker is viewing from the image input unit 104 (step S308). The captured image is temporarily stored in the captured image field of the process image DB 111. The control unit 102 compares the captured image with the representative image of the current step number in the process image DB 111 (step S309). If the captured image does not match the representative image of the current step number ("no match" in step S309), the control unit 102 collates the captured image with the representative images of all steps beyond the current step number (step S316). If the captured image matches none of those later representative images ("no match" in step S316), the control unit 102 returns to step S309 and continues collating captured images with the representative image of the current step number.
On the other hand, if the captured image matches any one of the representative images compared in step S316, the control unit 102 of the wearable terminal 100 recognizes that the work is not being performed in the correct order (step S317). The control unit 102 then notifies the worker of the order error via the display unit 107 or the voice output unit 108 (step S318). At that time, the control unit 102 may output the procedure information and notification information of the relevant step to instruct the worker on the correct work contents or the rework to be performed.
On the other hand, if in step S309 the captured image matches the representative image of the current step number in the process image DB 111 ("match" in step S309), the control unit 102 of the wearable terminal 100 recognizes that the work is being performed in the correct order (step S310). In this case, the control unit 102 notifies the worker, via the display unit 107 or the voice output unit 108, that the work is being performed in the correct order (step S311).
Next, the control unit 102 of the wearable terminal 100 checks whether the variable indicating the step number equals the final step number (step S312). If the variable is less than the final step number, the work is not finished, so the control unit 102 returns to step S308 and continues comparing captured images with the representative image of the current step number ("not the final step" in step S312).
On the other hand, when the variable indicating the step number equals the final step number, the control unit 102 of the wearable terminal 100 recognizes that all steps of the relevant work No. have been performed correctly (step S313). The control unit 102 notifies the worker, via the display unit 107 or the voice output unit 108, that the work has been completed in the correct order (step S314). This completes the progress management of the series of work.
Although omitted from the flowchart of Fig. 10, at the timing when a match between a new image and the representative image of the next step is confirmed in step S309, the control unit 102 of the wearable terminal 100 compares the elapsed time at that point with the standard time and maximum allowable time of the preceding step. Depending on the result of this comparison, the control unit 102 confirms any delay of the preceding step and updates the contents of the delay judgment field of the process image DB. When a delay is detected, the control unit 102 notifies the worker of the work delay and the like. Also omitted from the flowchart of Fig. 10, the control unit 102 checks at a predetermined cycle whether the conditions defined in the warning activation condition field of the process image DB 111 of Fig. 9 are satisfied. When it judges that such a condition is satisfied, the control unit 102 warns the worker using the contents set in the warning information field.
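Putting the flowchart together, the following condensed Python sketch mirrors steps S303 to S318 of Fig. 10; the callables capture_image, matches, and notify stand in for the image input unit, the image matching, and the notification output, and the return values are assumptions of this sketch.

    def run_progress_management(rep_images, capture_image, matches, notify):
        # S303-S306: wait until the step-1 representative image is seen.
        while not matches(capture_image(), rep_images[0]):
            pass                                  # work not yet started
        notify("work started")                    # elapsed-time count starts

        step = 1                                  # S307: advance to next step
        while step < len(rep_images):             # S312: stop after final step
            frame = capture_image()               # S308
            if matches(frame, rep_images[step]):  # S309 "match"
                notify("step %d confirmed" % (step + 1))  # S310-S311
                step += 1
                continue
            # S316: no match with the current step; try all later steps.
            for later in range(step + 1, len(rep_images)):
                if matches(frame, rep_images[later]):
                    # S317-S318: a later step seen first -> order error.
                    notify("order error: step %d seen before step %d"
                           % (later + 1, step + 1))
                    return False
        notify("work completed in the correct order")    # S313-S314
        return True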
According to the wearable terminal 100 and the server 200 of this embodiment described above, the work a worker is performing can be observed in real time and it can be confirmed whether the work is being performed in the correct order. In this embodiment it is also possible to provide the worker with the information needed at each step, to report whether the work is delayed, and to issue warnings. The management of work-step progress can therefore be made labor-saving or automated.
 An example of progress management of a process to which this embodiment is applied will be described.
(Step 1) Removing the cover
 As the representative image, an image of the entire cover is used. An image of a cover to which a marker has been added in advance may be used for this purpose. Alternatively, instead of the marker, an image from which the cover can be recognized by the shape and size of its outline (taking the imaging magnification into account) may be prepared. Matching against this image detects the start of the work.
(Step 2) Fixing the cover
 The cover is assumed to be fixed with screws, using a plurality of removable bolts (male screws). In this case, by using an image of the cover after the screws have been fastened as the representative image, it can be detected that the cover has been fixed correctly. Alternatively, instead of collation with this representative image, the positions of the bolts may be stored in advance; the cover is detected in the acquired captured image, the image at each bolt's position relative to the cover as a whole is cut out, and it is determined whether a region of the bolt-specific color (for example, the color of zinc plating) roughly the size of a bolt head is present in each cut-out image. In this way it can be judged whether a bolt is present at each position. If, for example, the cover is attached with bolts at its four corners, bolts are detected in those four regions, and when bolts are judged to be present at all of them, the cover is judged to be correctly attached and fixed.
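 As a rough sketch of the color-based bolt check described above: the HSV range standing in for the zinc-plating color, the minimum head area, and the ROI format are all illustrative assumptions rather than values taken from the patent.

```python
import cv2
import numpy as np

BOLT_HSV_LO = np.array([0, 0, 120])     # assumed band for a bright, low-saturation
BOLT_HSV_HI = np.array([179, 60, 255])  # zinc-plated finish
MIN_HEAD_AREA = 300                     # pixels; assumed size of a bolt head

def bolts_present(frame_bgr, rois):
    """rois: list of (x, y, w, h) regions, one at each bolt position
    determined relative to the detected cover."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    for (x, y, w, h) in rois:
        patch = hsv[y:y + h, x:x + w]
        mask = cv2.inRange(patch, BOLT_HSV_LO, BOLT_HSV_HI)
        if cv2.countNonZero(mask) < MIN_HEAD_AREA:
            return False                # no bolt-colored head in this region
    return True                         # bolts found at every position
```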
 Furthermore, when the bolts must be attached in a specific order (for example, upper right of the cover, lower left (diagonal to the upper right), upper left, then lower right (diagonal to the upper left)), step 2 may be subdivided. For example, representative images can be prepared in advance in which, each time a bolt is tightened, the image of a bolt is added at the position attached next in the sequence; the captured images are then matched against these representative images in the tightening order. If the images match in that order, it can be judged that the bolts were tightened in the prescribed order.
(Step 3) Removing the cable
 A representative image showing the end of the cable connected to, that is, plugged into, the prescribed location and terminal of the device, and a representative image showing the state without the cable, are prepared. When the image acquired from the image input unit 104 changes from the image with the cable end plugged in to the image without the cable, it can be judged that the cable was removed in accordance with the work procedure.
 Alternatively, an image of the device side of the removed connector may be used as the representative image, and the cable may be judged to have been correctly removed when the image acquired from the image input unit 104 matches it.
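 A minimal sketch of this before/after state check, assuming OpenCV template matching; the same pattern applies to the switch operation of step 4 below. The score function and threshold are our assumptions.

```python
import cv2

def state_score(frame, template):
    return float(cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED).max())

def detect_transition(frames, before_tpl, after_tpl, threshold=0.8):
    """Return True once the view changes from the 'before' state (cable plugged
    in / switch on) to the 'after' state (cable absent / switch off), in that order."""
    seen_before = False
    for frame in frames:
        if not seen_before and state_score(frame, before_tpl) >= threshold:
            seen_before = True            # initial state confirmed first
        elif seen_before and state_score(frame, after_tpl) >= threshold:
            return True                   # transition completed in the right order
    return False
```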
(Step 4) Operating a switch
 When the operating part of a switch, toggle switch, or the like changes in appearance depending on whether the switch is on or off, or when part of the switch changes color as it is switched on and off, an image of the switch in the on state and an image in the off state are each prepared as representative images. If the image acquired from the image input unit 104 matches the on-state image and subsequently matches the off-state image, it can be judged that the switch was operated correctly. When a series of work is completed by turning the switch off, completion of the work can be confirmed by this step.
[Second Embodiment]
 Next, a second embodiment of the present invention will be described. FIG. 11 shows the configuration of a wearable terminal 100b according to the second embodiment of the present invention. The difference in configuration from the wearable terminal 100 of the first embodiment shown in FIG. 6 is that a work information DB 113 is arranged in the storage unit 110a.
 Specifically, since the wearable terminal 100b of this embodiment includes the work information DB 113, the wearable terminal 100b alone can identify the work No. In addition, the data needed for the work performed by the worker is stored in advance in the step image DB 111 of the wearable terminal 100b. The wearable terminal 100b of this embodiment then reads out from the step image DB 111 and sets the data corresponding to the identified work No. The subsequent operation is the same as step S303 onward of the first embodiment shown in FIG. 10, so its description is omitted.
 As described above, according to this embodiment, the worker's work can be checked by the wearable terminal 100b alone, which makes it particularly suitable for work performed in locations without communication infrastructure.
 Finally, several modified embodiments of the present invention will be described.
[Wearable terminal]
 As the wearable terminal described above, a head-mounted terminal called smart glasses or a headset terminal, capable of displaying information in the wearer's field of view, can be used. A device of the type that irradiates the wearer's eye directly with a light flux that forms an image on the retina, causing the wearer to perceive it as an image (virtual image) in the visual field, can also be used.
 More desirably, a sensor that senses the movement of the wearer's eyes can be attached to the inside of such smart glasses or headset terminals. For example, a wearable terminal equipped with such a sensor detects the narrowing of the worker's eyes that occurs when the image shown on the display unit 107 is hard to see. When this movement is detected, the wearable terminal zooms the input image obtained from the image input unit 104 and displays it enlarged on the display unit 107. Whether the eyes are being narrowed can be judged by, for example, whether the amount of reflected light obtained from the sensor has stayed within a predetermined range for a predetermined time or longer. To improve the accuracy of the sensor, it is desirable to irradiate auxiliary light toward the eyeball so that reflected light can be received; the irradiated light should be invisible (preferably near-infrared), and its intensity should be within a range that does not affect the eyeball or the retina.
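 A sketch of this squint detection under the stated rule (reflected light within a predetermined range for a predetermined time or longer); the band, hold time, polling interval, and the sensor-reading callback read_reflection are all hypothetical.

```python
import time

SQUINT_LO, SQUINT_HI = 0.30, 0.55   # assumed normalized reflected-light band
HOLD_SECONDS = 1.5                  # assumed "predetermined time"

def squint_watch(read_reflection, on_squint, poll_s=0.05):
    """Poll the eye sensor and call on_squint() (e.g. zoom and enlarge the
    display) once the reading has stayed in band long enough."""
    entered = None
    while True:
        level = read_reflection()   # normalized reflected-light amount, 0..1
        if SQUINT_LO <= level <= SQUINT_HI:
            if entered is None:
                entered = time.monotonic()
            elif time.monotonic() - entered >= HOLD_SECONDS:
                on_squint()
                entered = None      # re-arm after triggering
        else:
            entered = None
        time.sleep(poll_s)
```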
 It is also preferable for a terminal such as the wearable terminal described above to incorporate a three-axis acceleration sensor that detects the attitude of the device and thereby the direction of gravity. This makes it possible to obtain information such as the imaging direction, the size of the object within the image, and the tilt of the target device, which in turn allows preprocessing for the matching described below, such as correcting the tilt of the image or changing its size, to be performed appropriately.
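 For instance, the in-plane tilt of the camera could be derived from the gravity vector and removed before matching, along the following lines; the accelerometer axis convention assumed here differs between devices.

```python
import math
import cv2

def deroll(image, accel_xyz):
    """Rotate the captured image so that gravity points straight down in it.
    accel_xyz: (ax, ay, az) from the 3-axis acceleration sensor, with x/y
    assumed to lie in the image plane (a device-dependent assumption)."""
    ax, ay, _ = accel_xyz
    roll_deg = math.degrees(math.atan2(ax, ay))   # in-plane tilt from gravity
    h, w = image.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), roll_deg, 1.0)
    return cv2.warpAffine(image, m, (w, h))
```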
[Matching with the representative image]
 A modification of the match determination between the image captured from the image input unit 104 and the representative image, performed in steps S305, S309, and S316 of FIG. 10 above, will now be described. This match determination can be performed using a method called image matching, in which feature points are extracted from the images and the feature points are matched against one another.
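 One possible realization of this feature-point matching, using ORB features and a ratio test; ORB, the feature count, and the thresholds are our choices, not something the patent specifies.

```python
import cv2

def images_match(captured_gray, representative_gray, min_good=20):
    orb = cv2.ORB_create(nfeatures=1000)
    _, des1 = orb.detectAndCompute(captured_gray, None)
    _, des2 = orb.detectAndCompute(representative_gray, None)
    if des1 is None or des2 is None:
        return False                      # no feature points found
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = matcher.knnMatch(des1, des2, k=2)
    # Lowe's ratio test keeps only distinctive correspondences
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    return len(good) >= min_good
```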
 For confirming the step of a specific work, it is preferable that the captured image and the representative image be adjusted so that they are taken at a predetermined angle and size, making it easy to match the image captured from the image input unit 104 against the representative image. For example, it is preferable to use, as the representative image, an image captured by the camera of the same wearable terminal as the wearable terminal 100 worn by the worker. It is also preferable that the representative image be an image obtained only in that step, rather than one that could also be detected in other steps.
 When at least one of the captured image and the representative image is tilted or differs in size, it is preferable to correct the tilt by coordinate transformation or to resize the image before matching. For confirming the step of a specific work, a captured image of a circuit board can be used as the representative image. In matching a circuit board, the outline may be recognized for match detection, or match detection may be performed on a predetermined shape or color (a color different from the background color, so that the board stands out from the background). Alternatively, a plurality of markers may be attached in advance to both the work target and the representative image (at least three points; with three points, the position, size, and angle of the image can be detected, which further enables size adjustment and attitude alignment for the match determination), and matching may be performed using these markers as feature points. When markers are used, binarization based on the color difference from the background of the compared image may be performed and the position of each marker extracted for matching. Furthermore, the position (along the two mutually perpendicular axes X and Y in the imaging plane, perpendicular to the imaging direction), the attitude (the angle θz, i.e., rotation about the axis perpendicular to the imaging plane), and the size may be extracted from these markers for matching.
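 A sketch of the marker-based alignment: given the centers of three or more markers in both images, a similarity transform (translation, rotation θz, and scale) can be estimated and applied before the comparison. Detecting the marker centers themselves, e.g. by binarizing on the marker color, is assumed to be done beforehand.

```python
import cv2
import numpy as np

def align_by_markers(captured, pts_captured, pts_representative, out_size):
    """pts_*: lists of (x, y) marker centers in corresponding order;
    out_size: (width, height) of the representative image's frame."""
    src = np.float32(pts_captured)
    dst = np.float32(pts_representative)
    m, _ = cv2.estimateAffinePartial2D(src, dst)  # rotation + scale + translation
    if m is None:
        raise ValueError("could not estimate transform from markers")
    return cv2.warpAffine(captured, m, out_size)
```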
 In addition to the match determination using the markers, the degree of coincidence of the pattern of each part relative to the reference position may be calculated, and the final identity judgment made using this degree of coincidence. More specifically, after aligning the angle and size of the images, a method can be used that judges by the difference from the comparison image: for example, the per-pixel luminance is normalized over the whole image so that the average luminances agree, and the judgment is made from the area of the regions whose luminance difference exceeds a predetermined value.
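 The luminance-normalized difference-area judgment might look like the following; the difference threshold and the allowed area fraction are illustrative values.

```python
import cv2
import numpy as np

DIFF_THRESHOLD = 30      # gray levels; stands in for the "predetermined" difference
MAX_DIFF_FRACTION = 0.02 # fraction of pixels allowed to differ

def same_pattern(aligned_bgr, reference_bgr):
    a = cv2.cvtColor(aligned_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    b = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    a += b.mean() - a.mean()   # normalize so the average luminances agree
    diff_area = np.count_nonzero(np.abs(a - b) > DIFF_THRESHOLD)
    return diff_area / a.size <= MAX_DIFF_FRACTION
```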
 In the matching process, to reduce the amount of computation, matching may be performed between images binarized at a predetermined threshold. In addition, matching may be performed on an extracted part of the image rather than the whole. For example, a matching process can be performed to judge whether a portion of the captured image occupying a predetermined fraction of it matches the representative image to a predetermined degree or more. In this case, the partial region of the captured image may be shifted by a predetermined fraction at a time, with matching performed at each shift, and whether the representative image is present within the captured image detected according to whether a degree of coincidence above the predetermined level is obtained.
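 A compact sketch of the binarized partial matching; cv2.matchTemplate already performs the shift-and-compare at every position that the passage describes, so the sliding of the partial region does not have to be written out by hand. The binarization threshold and the match threshold are assumptions.

```python
import cv2

def find_representative(captured_gray, representative_gray,
                        bin_thresh=128, min_score=0.9):
    _, cap_bin = cv2.threshold(captured_gray, bin_thresh, 255, cv2.THRESH_BINARY)
    _, rep_bin = cv2.threshold(representative_gray, bin_thresh, 255, cv2.THRESH_BINARY)
    # Slide the representative image over the captured one and score each position
    scores = cv2.matchTemplate(cap_bin, rep_bin, cv2.TM_CCORR_NORMED)
    return float(scores.max()) >= min_score
```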
 In addition, the predetermined fraction used when cutting regions out of the captured image may be varied in predetermined increments, from unity up to a predetermined magnification, repeating the judgment of whether a region matching the representative image exists in the acquired image. This makes it possible to detect a match even when the size of the image changes.
 A method may also be adopted in which the image is rotated by a predetermined angle each time the image region is shifted, performing matching at each rotation. By checking whether a degree of coincidence above the predetermined level is obtained before a full 360-degree rotation is completed, a correct judgment can be made even when the target image is rotated.
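 Combining the size sweep of the previous paragraph with the rotation sweep described here gives a brute-force search along the following lines; the step widths and thresholds are illustrative.

```python
import cv2
import numpy as np

def search_scale_and_rotation(captured_gray, rep_gray,
                              scales=np.arange(1.0, 2.01, 0.25),
                              angles=range(0, 360, 15), min_score=0.85):
    """Repeat template matching over a grid of sizes and rotations; returns
    (found, scale, angle) for the first combination that matches."""
    for s in scales:
        rep_s = cv2.resize(rep_gray, None, fx=s, fy=s)
        h, w = rep_s.shape[:2]
        for a in angles:
            m = cv2.getRotationMatrix2D((w / 2, h / 2), a, 1.0)
            rep_sr = cv2.warpAffine(rep_s, m, (w, h))
            if (rep_sr.shape[0] <= captured_gray.shape[0]
                    and rep_sr.shape[1] <= captured_gray.shape[1]):
                score = cv2.matchTemplate(captured_gray, rep_sr,
                                          cv2.TM_CCOEFF_NORMED).max()
                if score >= min_score:
                    return True, float(s), a
    return False, None, None
```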
[Guidance to the worker during part-replacement work]
 When the work is part-replacement work and an image of the replacement is stored, image collation of the part to be replaced may be performed at replacement time, using the image of the part before replacement as the representative image. In that case, it is preferable to define a work procedure such that, while the replacement work is being filmed, images are obtained in which the part does not overlap other articles. Such a work procedure can be conveyed to the worker through notification information or the like.
 It is also preferable that the work procedure instruct the worker to grip the part to be replaced so that its tilt about the two axes perpendicular to the central axis of the imaging direction does not exceed a predetermined angle. In this way the old and new parts do not overlap in the image and can each be detected individually. Furthermore, the movement of each part can then be tracked, and it can be judged from the images that no incorrect attachment has been made. Proceeding in this way also makes it possible, when comparing two images, to omit the preprocessing that aligns the attitude and size of the images to be matched. Moreover, since the attitudes of the parts appearing in the two images are then similar and their positions do not differ greatly, the match determination becomes easier and its accuracy can be improved.
 Furthermore, by establishing such a work procedure, images including the removed part can be captured at comparatively short intervals (intervals short enough that the detection target does not move significantly between frames). Then, by calculating the change in the position of the target part, it also becomes possible to detect the removed part being confused with the replacement part and reattached.
 It is also preferable to issue a warning when the removed part is reattached (in the images, this may be judged by tracking the image of the removed part and detecting when it has approached within a predetermined distance of the attachment position). Such a notification can prevent the replacement mistake in which the part is never actually exchanged.
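 A sketch of this reattachment warning: the removed part is tracked across closely spaced frames, and a warning is raised when its center comes within a predetermined distance of the original attachment position. The distance, the score threshold, and the frame source are assumptions.

```python
import cv2
import numpy as np

REATTACH_RADIUS = 40   # pixels; stands in for the "predetermined distance"

def watch_removed_part(frames_gray, part_template, mount_xy, warn, min_score=0.7):
    th, tw = part_template.shape[:2]
    for frame in frames_gray:
        scores = cv2.matchTemplate(frame, part_template, cv2.TM_CCOEFF_NORMED)
        _, best, _, (x, y) = cv2.minMaxLoc(scores)
        if best < min_score:
            continue                       # part not visible in this frame
        cx, cy = x + tw / 2, y + th / 2    # current center of the removed part
        if np.hypot(cx - mount_xy[0], cy - mount_xy[1]) <= REATTACH_RADIUS:
            warn("removed part is back at its attachment position")
            return
```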
 While embodiments of the present invention have been described above, the present invention is not limited to the embodiments described, and further modifications, substitutions, and adjustments can be made without departing from the basic technical idea of the present invention. For example, the network configuration, the configuration of each element, and the form of expression of messages shown in the drawings are examples to aid understanding of the present invention, and the invention is not limited to the configurations shown in these drawings.
 Finally, preferred modes of the present invention are summarized.
[First mode]
 (See the work assistance device according to the first aspect above.)
[Second mode]
 In the work assistance device described above, the storage unit may hold the representative images in association with each of a plurality of types of work for which the worker is responsible, and the confirmation unit may further receive, from the worker, input of information on the work to be started, identify the work to be started based on the information input by the worker, and confirm whether the identified work is being performed in the predetermined order.
[Third mode]
 The storage unit of the work assistance device described above may store, in addition to the representative images, error images representing situations in which work has been performed incorrectly; the confirmation unit may further collate the input image with the error images to confirm whether incorrect work has been performed; and the notification unit may notify the worker that an error has occurred when the confirmation unit determines that incorrect work has been performed.
[Fourth mode]
 The storage unit of the work assistance device described above may hold the required time from the first step to the second step in association with the representative images of the first and second steps; the confirmation unit may further confirm whether a delay has occurred in the work from the first step to the second step; and the notification unit may notify the worker of the presence or absence of the delay.
[Fifth mode]
 The storage unit of the work assistance device described above may further hold a representative image of a third step performed after the second step; the confirmation unit may determine that an order error has occurred in the work when, after the first step, the input image matches the representative image of the third step; and the notification unit may notify the worker that an order error has occurred in the work.
[Sixth mode]
 The storage unit of the work assistance device described above may further hold representative images of steps performed after the second step; the confirmation unit may determine that an order error has occurred in the work when, after the first step, the input image matches any of the representative images of the steps to be performed after the second step; and the notification unit may notify the worker that an order error has occurred in the work.
[Seventh mode]
 In the work assistance device described above, an image captured by the camera of a wearable terminal worn by the worker can be used as the image of the process in which the worker is performing the work.
[Eighth mode]
 In the work assistance device described above, an image captured by the camera of the same wearable terminal as the wearable terminal worn by the worker can be used as the representative image.
[Ninth mode]
 The work assistance device described above can be realized as a configuration including a work assistance device (wearable terminal) that receives, from the worker, input of information on the work to be started, and a server that identifies the work to be started based on the information input by the worker and provides the work assistance device (wearable terminal) with data corresponding to the identified work.
[Tenth mode]
 (See the work assistance method according to the second aspect above.)
[Eleventh mode]
 (See the program according to the third aspect above.)
 Like the first mode, the tenth and eleventh modes above can be expanded into the second to ninth modes.
 The disclosures of the above patent documents are incorporated herein by reference. Within the scope of the entire disclosure of the present invention (including the claims), and based on its basic technical concept, modifications and adjustments of the embodiments and examples are possible. In addition, within the scope of the disclosure of the present invention, various combinations and selections (including partial deletion) of the various disclosed elements (including each element of each claim, each element of each embodiment and example, and each element of each drawing) are possible. That is, the present invention naturally includes various variations and modifications that could be made by those skilled in the art according to the entire disclosure, including the claims, and the technical concept. In particular, as for the numerical ranges described herein, any numerical value or subrange falling within such a range should be construed as being specifically described even in the absence of a particular statement.
10a work assistance device
11a storage unit
12a input unit
13a confirmation unit
14a notification unit
10b computer
11b memory
12b input means
13b processor
14b output means
100, 100b wearable terminal
100a worker
101 network
102, 202 control unit
103, 203 communication unit
104, 204 image input unit
105, 205 voice input unit
106, 206 input unit
107, 207 display unit
108, 208 voice output unit
109 timekeeping unit
110, 110a, 209 storage unit
111, 211 step image database (step image DB)
112 program
113, 210 work information database (work information DB)
200 server

Claims (10)

  1.  A work assistance device comprising:
     a storage unit that holds, in association with one another, a work including at least a first step and a second step performed after the first step, and representative images, one per step, representing the work performed in each of the first step and the second step;
     an input unit that inputs images of the process in which a worker is performing the work;
     a confirmation unit that collates the input images with the representative images to confirm whether the work is being performed in a predetermined order; and
     a notification unit that notifies the worker of the result of the confirmation.
  2.  The work assistance device according to claim 1, wherein
     the storage unit holds the representative images in association with each of a plurality of types of work for which the worker is responsible, and
     the confirmation unit further receives, from the worker, input of information on the work to be started, identifies the work to be started based on the information input by the worker, and confirms whether the identified work is being performed in the predetermined order.
  3.  The work assistance device according to claim 1 or 2, wherein
     the storage unit stores, in addition to the representative images, error images representing situations in which work has been performed incorrectly,
     the confirmation unit further collates the input images with the error images to confirm whether incorrect work has been performed, and
     the notification unit notifies the worker that an error has occurred when the confirmation unit determines that incorrect work has been performed.
  4.  The work assistance device according to any one of claims 1 to 3, wherein
     the storage unit holds the required time from the first step to the second step in association with the representative images of the first and second steps,
     the confirmation unit further confirms whether a delay has occurred in the work from the first step to the second step, and
     the notification unit notifies the worker of the presence or absence of the delay.
  5.  The work assistance device according to any one of claims 1 to 4, wherein
     the storage unit further holds a representative image of a third step performed after the second step,
     the confirmation unit determines that an order error has occurred in the work when, after the first step, an input image matches the representative image of the third step, and
     the notification unit notifies the worker that an order error has occurred in the work.
  6.  The work assistance device according to any one of claims 1 to 4, wherein
     the storage unit further holds representative images of steps performed after the second step,
     the confirmation unit determines that an order error has occurred in the work when, after the first step, an input image matches any of the representative images of the steps to be performed after the second step, and
     the notification unit notifies the worker that an order error has occurred in the work.
  7.  The work assistance device according to any one of claims 1 to 6, wherein the images of the process in which the worker is performing the work are images captured by a camera of a wearable terminal worn by the worker.
  8.  The work assistance device according to claim 7, wherein an image captured by a camera of the same wearable terminal as the wearable terminal worn by the worker is used as the representative image.
  9.  A work assistance method in which a computer, having access to means for holding, in association with one another, a work including at least a first step and a second step performed after the first step, and representative images, one per step, representing the work performed in the first and second steps, performs:
     a step of inputting images of the process in which a worker is performing the work;
     a step of collating the input images with the representative images to confirm whether the work is being performed in a predetermined order; and
     a step of notifying the worker of the result of the confirmation.
  10.  A program that causes a computer, having access to means for holding, in association with one another, a work including at least a first step and a second step performed after the first step, and representative images, one per step, representing the work performed in the first and second steps, to execute:
     a process of inputting images of the process in which a worker is performing the work;
     a process of collating the input images with the representative images to confirm whether the work is being performed in a predetermined order; and
     a process of notifying the worker of the result of the confirmation.
PCT/JP2018/039376 2017-10-30 2018-10-23 Work assistance device, work assistance method, and program WO2019087870A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880070359.2A CN111328402A (en) 2017-10-30 2018-10-23 Work support device, work support method, and program
JP2019551164A JP6912150B2 (en) 2017-10-30 2018-10-23 Work support equipment, work support methods and programs

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-209179 2017-10-30
JP2017209179 2017-10-30

Publications (1)

Publication Number Publication Date
WO2019087870A1 true WO2019087870A1 (en) 2019-05-09

Family

ID=66331946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/039376 WO2019087870A1 (en) 2017-10-30 2018-10-23 Work assistance device, work assistance method, and program

Country Status (3)

Country Link
JP (2) JP6912150B2 (en)
CN (1) CN111328402A (en)
WO (1) WO2019087870A1 (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3156220B2 (en) * 1996-01-17 2001-04-16 明光産業株式会社 Gas container bottom inspection device
JP4784752B2 (en) 2006-06-30 2011-10-05 サクサ株式会社 Image processing device
JP6113631B2 (en) 2013-11-18 2017-04-12 東芝三菱電機産業システム株式会社 Work confirmation system
JP6399437B2 (en) * 2014-06-04 2018-10-03 パナソニックIpマネジメント株式会社 Control device and work management system using the same
JP6451133B2 (en) * 2014-08-01 2019-01-16 株式会社リコー Anomaly detection device, anomaly detection method, anomaly detection system, and program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06162034A (en) * 1992-11-20 1994-06-10 Kumahira Safe Co Inc Working hour managing device
JP2009279193A (en) * 2008-05-22 2009-12-03 Fujifilm Corp Medical apparatus management system

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021036945A (en) * 2019-08-30 2021-03-11 Hoya株式会社 Endoscope reprocessing support device, endoscope reprocessing support method, and program
JP7329392B2 (en) 2019-08-30 2023-08-18 Hoya株式会社 Endoscope reprocess support device, endoscope reprocess support method and program
WO2021125294A1 (en) * 2019-12-18 2021-06-24 Arithmer株式会社 Rental target management system, rental target management program, and rental target management method
JPWO2021125294A1 (en) * 2019-12-18 2021-12-16 Arithmer株式会社 Lending object management system, lending object management program and lending object management method.
JP7015503B2 (en) 2019-12-18 2022-02-03 Arithmer株式会社 Lending object management system, lending object management program and lending object management method.
CN113497917A (en) * 2020-03-18 2021-10-12 东芝泰格有限公司 Image processing device
JP7471878B2 (en) 2020-03-18 2024-04-22 東芝テック株式会社 Image Processing Device
JP7433126B2 (en) 2020-04-21 2024-02-19 三菱電機株式会社 Image display system, image display device, server, image display method and program
JP6999840B1 (en) 2020-12-21 2022-01-19 クーパン コーポレイション Electronic devices for managing work information and their methods
JP2022098393A (en) * 2020-12-21 2022-07-01 クーパン コーポレイション Electronic apparatus for managing working information and method thereof
JP2022098500A (en) * 2020-12-21 2022-07-01 クーパン コーポレイション Electronic apparatus for managing working information and method thereof
JP2022100073A (en) * 2020-12-23 2022-07-05 横河電機株式会社 Apparatus, system, method, and program
JP7415912B2 (en) 2020-12-23 2024-01-17 横河電機株式会社 Apparatus, system, method and program
JP2023529786A (en) * 2021-05-07 2023-07-12 テンセント・アメリカ・エルエルシー A method for estimating inter-camera pose graphs and transformation matrices by recognizing markers on the ground in panoramic images
WO2023176030A1 (en) * 2022-03-18 2023-09-21 株式会社島津製作所 Error reporting system and control device
JP7406038B1 (en) * 2023-09-19 2023-12-26 株式会社日立パワーソリューションズ Work support system and work support method

Also Published As

Publication number Publication date
CN111328402A (en) 2020-06-23
JPWO2019087870A1 (en) 2020-12-03
JP7156731B2 (en) 2022-10-19
JP2021152979A (en) 2021-09-30
JP6912150B2 (en) 2021-07-28

Similar Documents

Publication Publication Date Title
WO2019087870A1 (en) Work assistance device, work assistance method, and program
CN106340217B (en) Manufacturing equipment intelligence system and its implementation based on augmented reality
US20130010068A1 (en) Augmented reality system
JP2005250990A (en) Operation support apparatus
JP4785583B2 (en) Work support system, work support device, and computer program
US20210224752A1 (en) Work support system and work support method
US20210335148A1 (en) Content presentation system
AU2018405401A1 (en) Blasting plan logger, related methods and computer program products
JP2008235504A (en) Assembly inspection device
JP2015061339A (en) Wire connection work support system
US11291387B2 (en) System for recognizing abnormal activity of human body using wearable electronic device and mixed reality technology
JP7191560B2 (en) content creation system
US10133900B2 (en) Controlling the output of contextual information using a computing device
JP2017068849A (en) Article assembling status information display device
JPWO2020194413A1 (en) Maintenance support system, maintenance support method and program
JP5891191B2 (en) Operation result acquisition system and operation result acquisition method
KR102597228B1 (en) System for guiding information anc checking mobile facility based on artificial intelligence
CN114567535B (en) Product interaction and fault diagnosis method based on augmented reality
JP2020181531A (en) Construction method of installation, video display device, and inspection method of installation position
WO2017195646A1 (en) Work assistance device
WO2022046227A1 (en) Mixed reality image capture and smart inspection
CN207259864U (en) The massaging device of fiber web machine
JP2020197971A (en) Intrusion detection device, program, and intrusion detection system
KR101483956B1 (en) Apparatus for verifying assembly quality of parts and method thereof
JP2007156838A (en) Assembling operation confirmation system and assembling operation confirmation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18872404

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019551164

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18872404

Country of ref document: EP

Kind code of ref document: A1