WO2017013899A1 - Activity recording device, activity recording program, and activity recording method - Google Patents
- Publication number: WO2017013899A1 (application PCT/JP2016/058433)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- activity
- worker
- specifying
- time
- recording
- Prior art date
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C1/00—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
- G07C1/10—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people together with the recording, indicating or registering of other data, e.g. of signs of identity
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/406—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by monitoring or safety
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06311—Scheduling, planning or task assignment for a person or group
- G06Q10/063114—Status monitoring or status determination for a person or group
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0639—Performance analysis of employees; Performance analysis of enterprise or organisation operations
- G06Q10/06398—Performance of employee with respect to a job function
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/109—Time management, e.g. calendars, reminders, meetings or time accounting
- G06Q10/1091—Recording time for administrative or management purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/24—Pc safety
- G05B2219/24055—Trace, store a working, operation history
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35291—Record history, log, journal, audit of machine operation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Definitions
- The present invention relates to an activity recording device, an activity recording program, and an activity recording method for recording a worker's activity in an area where the worker is active, such as a production site, and in particular to easily recording the entire activity of the worker.
- Production facilities (machines) and workers (people) exist at the production site, and each carries out production activities in its respective role.
- The role played by workers at the production site is significant. It is therefore important to grasp the efficiency of the workers' production activities (hereinafter, worker activities) when evaluating the productivity of the entire production site.
- The "work sampling method" is a method that instantaneously observes the operating state of humans and machines, the types of work, and so on, and analyzes the time structure of each observation item (for example, Non-Patent Documents 1 and 2).
- Work performance acquisition techniques in which the worker himself/herself inputs his/her own activity record (for example, Patent Document 1), as well as techniques for automatically acquiring a worker's operation using a sensor or the like (for example, Patent Document 2), have also been developed.
- The present invention has been made to solve the above-described problems, and an object thereof is to provide an activity recording device, an activity recording program, and an activity recording method that can easily record the entire activity of a worker.
- The activity recording device according to the present invention is an activity recording device that records the activity of a worker as activity data, and includes: a first specifying unit for specifying the worker; a second specifying unit for specifying the position of the worker; a third specifying unit for specifying the object of the worker; a fourth specifying unit for specifying the mode of the worker; and a recording unit that records, as the activity data, the worker, the position, the object, and the mode specified by the first specifying unit, the second specifying unit, the third specifying unit, and the fourth specifying unit, in association with the time at which they were specified as the activity time.
- The activity recording program according to the present invention is an activity recording program that records the activity of a worker as activity data, and causes a computer to execute: a first specifying step of specifying the worker; a second specifying step of specifying the position of the worker; a third specifying step of specifying the object of the worker; a fourth specifying step of specifying the mode of the worker; and a recording step of recording, as the activity data, the worker, the position, the object, and the mode specified by the first specifying step, the second specifying step, the third specifying step, and the fourth specifying step, in association with the time at which they were specified as the activity time.
- The activity recording method according to the present invention is an activity recording method for recording the activity of a worker as activity data, and includes: a first specifying step of specifying the worker; a second specifying step of specifying the position of the worker; a third specifying step of specifying the object of the worker; a fourth specifying step of specifying the mode of the worker; and a recording step of recording, as the activity data, the worker, the position, the object, and the mode specified by the first specifying step, the second specifying step, the third specifying step, and the fourth specifying step, in association with the time at which they were specified as the activity time.
- The activity recording device, the activity recording program, and the activity recording method according to the present invention make it possible to easily record the entire activities of a worker.
- FIG. 1 is a diagram showing the hardware configuration of the activity recording device according to Embodiment 1 of the present invention.
- FIG. 2 is a functional block diagram of the activity recording device according to Embodiment 1.
- FIG. 3 is a flowchart for recording activity data in the activity recording device according to Embodiment 1.
- FIG. 4 is a diagram illustrating a recording example of activity data according to Embodiment 1.
- FIG. 5 is a functional block diagram of the activity recording device according to Embodiment 2 of the present invention.
- FIG. 6 is a diagram showing the relevance database of Embodiment 2.
- FIG. 7 is a diagram showing an example of the display screen of the activity recording device in Embodiment 2.
- FIG. 8 is a flowchart illustrating processing in the first specifying unit using the worker ID list of Embodiment 2.
- FIG. 9 is a flowchart illustrating processing in the second specifying unit using the position list of Embodiment 2.
- FIG. 10 is a flowchart showing processing in the third specifying unit using the object list of Embodiment 2.
- FIG. 11 is a flowchart illustrating processing in the fourth specifying unit using the mode list of Embodiment 2.
- A flowchart illustrating processing of the activity recording device according to Embodiment 3.
- A flowchart showing processing of another activity recording device according to Embodiment 3.
- A flowchart showing processing of yet another activity recording device according to Embodiment 3.
- A diagram showing the relevance database of Embodiment 4.
- A diagram showing a screen display example of the display in Embodiment 5.
- A diagram showing the screen display on the display when work is interrupted in Embodiment 5.
- Flowcharts (four drawings) showing the operation of the activity recording device according to Embodiment 5.
- A flowchart illustrating processing in the first specifying unit using an IC card according to Embodiment 6.
- A flowchart illustrating processing in the first specifying unit using the face recognition camera of Embodiment 7.
- A flowchart illustrating processing in the first specifying unit using the fingerprint recognition sensor of Embodiment 8.
- A flowchart illustrating processing in the second specifying unit using the GPS sensor of Embodiment 9.
- A data table used in the second specifying unit of Embodiment 9.
- A flowchart showing processing of the second specifying unit using the radio wave intensity sensor of Embodiment 10.
- A diagram illustrating the distance characteristics of the radio field intensity of a beacon used in the second specifying unit of Embodiment 10.
- A diagram illustrating an example of a time-series change in the radio wave intensity of a beacon used in the second specifying unit of Embodiment 10.
- A data table used in the second specifying unit of Embodiment 10.
- A flowchart illustrating processing of the third specifying unit using information from the production facility according to Embodiment 11.
- A flowchart illustrating processing of the fourth specifying unit using motion capture according to Embodiment 12.
- A flowchart showing processing of the fourth specifying unit using the acceleration sensor of Embodiment 13.
- A flowchart showing the processing of the second specifying unit
- Embodiment 1.
- In an area where workers are active, such as a production site, the present invention predefines the appearance patterns of worker activities and the types of acquired information necessary for productivity analysis of workers, and thereby presents an activity recording device that can easily record the activity data of the worker.
- In the present invention, the record data of the worker's activity (hereinafter "activity data") is data configured as a set of five elements: the "worker", the "worker's position" (hereinafter "position"), the "worker's object" (hereinafter "object"), the "worker's mode" (hereinafter "mode"), and the "time when the worker was active" (hereinafter "activity time").
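As an illustration only, and not part of the invention as claimed, the five-element activity data described above could be sketched as a simple record type; all names and sample values below are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ActivityData:
    """One activity record: the set of five elements defined above."""
    worker: str         # "worker" (e.g. a worker ID)
    position: str       # "position" (e.g. a warehouse or an assembly line)
    obj: str            # "object" (e.g. "model A" or "manufacturing lot B")
    mode: str           # "mode" (the worker's mode of activity)
    activity_time: str  # "activity time" (when the other four elements were specified)

record = ActivityData("AB12345", "facility A", "model b",
                      "operation b1", "2016-03-10 09:15:00")
print(record.worker)  # -> AB12345
```

The set is immutable (`frozen=True`) because each record captures one specified state at one activity time.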
- FIG. 1 is a diagram showing the hardware configuration of the activity recording device according to Embodiment 1 of the present invention; this hardware configuration is common to all of the following embodiments.
- The activity recording device according to Embodiment 1 of the present invention includes a CPU (central processing unit) 1, a program memory 2 that stores programs executed by the CPU 1, a work memory 3 that temporarily holds data used by the CPU 1 for arithmetic processing, a main memory 4 (including a storage in which various databases and activity data are recorded), an interface 6, and the like, all connected to a data bus 5.
- The program memory 2 stores an activity recording program including a first specifying step of specifying the worker, a second specifying step of specifying the position of the worker, a third specifying step of specifying the activity object of the worker, a fourth specifying step of specifying the activity mode of the worker, and a recording step of recording, as activity data, the worker, the position, the object, and the mode specified by the first specifying step, the second specifying step, the third specifying step, and the fourth specifying step, in association with the time at which they were specified as the activity time.
- An input unit 7 constituted by a touch panel, a display 8, and a timer 10 for managing time are connected via the interface 6.
- The timer 10 also serves as a time adding unit that supplies the time.
- A communication module 9 for communicating with the outside is connected as needed.
- The display screen of the display 8 may also serve as a touch panel or keyboard.
- FIG. 2 is a functional configuration diagram of the activity recording apparatus according to Embodiment 1 of the present invention.
- The activity recording device 11 includes a first specifying unit 15, a second specifying unit 16, a third specifying unit 17, a fourth specifying unit 18, a recording unit 13, an input unit 7, a timer 10, a communication module 9, a display 8, and a power source 14.
- Each of the first specifying unit 15, the second specifying unit 16, the third specifying unit 17, and the fourth specifying unit 18 receives data for specifying each element of the activity data from the worker via the touch panel of the input unit 7.
- The first specifying unit 15 specifies the worker.
- The second specifying unit 16 specifies the position of the worker.
- The third specifying unit 17 specifies the object of the worker.
- The fourth specifying unit 18 specifies the mode of the worker.
- The recording unit 13 takes the worker, the position, the object, and the mode specified by the first specifying unit 15, the second specifying unit 16, the third specifying unit 17, and the fourth specifying unit 18, adds the time at which they were specified from the timer 10, associates it as the activity time, and records the whole as activity data.
- the worker activity data is stored in a storage included in the main memory 4.
- The activity data stored in the storage can be transferred to an external PC (personal computer) or the like to analyze the activity data after recording.
- The communication module 9 can be connected to the recording unit 13 to exchange the contents of the activity data with external devices over a network.
- The activity data sent to the display 8 connected to the recording unit 13 is used to present work performance data to the worker.
- The activity recording device 11 has an independent power source 14 inside. Therefore, the activity recording device 11 is not fixed at a certain place but is portable so that a worker can carry it around. As a result, the activity can always be recorded regardless of where the worker is working.
- a touch panel is used as the input unit 7 of the first specifying unit 15.
- an IC card reading unit, a face recognition camera, a fingerprint recognition sensor, and the like can be used as necessary. These will be described in other embodiments.
- a touch panel is used as the input unit 7 of the second specifying unit 16.
- As other input units 7, a GPS sensor, a radio wave intensity sensor, an acceleration sensor, a geomagnetic sensor, or the like can be used as necessary. These will be described in other embodiments.
- a touch panel is used as the input unit 7 of the third specifying unit 17.
- a sensor linked to the production facility can be used as necessary. This will be described in another embodiment.
- a touch panel is used as the input unit 7 of the fourth specifying unit 18.
- a motion capture, an acceleration sensor, or the like can be used as necessary. This will be described in another embodiment.
- each of the specifying units 15, 16, 17, 18 may be provided with an input unit.
- In Embodiment 1 of the present invention, the worker to be observed is first determined.
- The worker is specified by having each worker carry the device individually.
- The worker himself/herself directly inputs the worker's ID (ID is an abbreviation of identification) into the first specifying unit 15 of the activity recording device 11 of Embodiment 1 via the input unit 7, thereby performing the first specifying step of specifying the worker (step ST32 in FIG. 3).
- Next, the position of the worker is determined: for example, whether the worker is in a warehouse or on an assembly line.
- The worker himself/herself directly inputs his/her position information into the second specifying unit 16 of the activity recording device 11 via the input unit 7, thereby performing the second specifying step of specifying the position (step ST33 in FIG. 3).
- Then the object at the worker's position is determined; for example, an object such as "model A" or "manufacturing lot B" is selected.
- The worker himself/herself directly inputs the object information into the third specifying unit 17 of the activity recording device 11 via the input unit 7, thereby performing the third specifying step of specifying the object (step ST34 in FIG. 3).
- The worker himself/herself directly inputs the mode information into the fourth specifying unit 18 of the activity recording device 11 via the input unit 7, thereby performing the fourth specifying step of specifying the mode (step ST35 in FIG. 3).
- The time at which the mode is fixed, that is, the time at which all the elements are fixed, is added from the timer 10 and associated with them, thereby specifying the activity time (step ST36 in FIG. 3).
- the recording unit 13 performs a recording process of recording the activity data in the storage of the main memory 4 as shown in FIG. 4 (step ST37 in FIG. 3).
- Steps ST31 to ST37 shown in the column of the activity recording device 11 in FIG. 3 form the processing flow of the activity recording method, and are executed in common in the other embodiments.
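As a non-limiting sketch of the flow of steps ST32 to ST37 described above, each specifying step can be represented by a callable supplied by the caller (in Embodiment 1 these would read the touch panel); the function and parameter names are hypothetical:

```python
import time

def record_activity(storage, specify_worker, specify_position,
                    specify_object, specify_mode, now=None):
    """One pass through steps ST32 to ST37 of FIG. 3 (illustrative sketch)."""
    worker = specify_worker()      # ST32: first specifying step
    position = specify_position()  # ST33: second specifying step
    obj = specify_object()         # ST34: third specifying step
    mode = specify_mode()          # ST35: fourth specifying step
    # ST36: add the time at which all elements were fixed, as the activity time
    activity_time = now or time.strftime("%Y-%m-%d %H:%M:%S")
    # ST37: record the five elements together as one activity data record
    storage.append((worker, position, obj, mode, activity_time))
    return storage[-1]

storage = []
record_activity(storage,
                lambda: "AB12345", lambda: "facility A",
                lambda: "model b", lambda: "operation b1")
```

Passing the specifying steps as callables mirrors the fact that later embodiments replace the touch panel input with IC cards, cameras, or sensors without changing the recording step.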
- In FIG. 4, each column represents one element of the activity data, and each row represents one activity data record.
- The first column contains the worker information, where a worker ID is recorded. The position is written in the second column.
- The object is written in the third column. For a worker on a production line, the production lot is recorded, on the assumption that the worker performs work on items flowing along the line. The mode is written in the fourth column.
- One activity data record is added every time this situation changes.
- The activity time is recorded in the fifth column.
- The activity time is defined as the time at which the state changed.
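The rule that one record is added whenever the situation changes, with the activity time defined as the change time, can be sketched as follows (a hypothetical illustration; the class and names are not from the patent):

```python
class ActivityRecorder:
    """Appends a record only when worker, position, object, or mode changes."""

    def __init__(self):
        self.records = []   # rows of the table in FIG. 4
        self._last = None   # last observed (worker, position, object, mode)

    def update(self, worker, position, obj, mode, t):
        state = (worker, position, obj, mode)
        if state != self._last:                # one of the four elements changed
            self.records.append(state + (t,))  # activity time = change time
            self._last = state

rec = ActivityRecorder()
rec.update("AB12345", "facility A", "model b", "operation b1", "09:00")
rec.update("AB12345", "facility A", "model b", "operation b1", "09:01")  # unchanged, not recorded
rec.update("AB12345", "facility A", "model b", "operation b2", "09:05")  # mode changed
print(len(rec.records))  # -> 2
```

Recording only on change keeps the table compact while preserving the full time structure of the activity.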
- A first characteristic of production activities performed at production sites is that they are not carried out by a large number of unspecified persons.
- Production sites have the character that specific persons are engaged on the basis of employment contracts. This means that it is necessary to identify "who" the person is.
- As a second feature of the production site, there are operations that add value to the product, for example material processing and product assembly. There are also operations on the production site that do not add value, such as transporting products and monitoring the operating state of machines. Thus, there are two types of operations at the production site. Work that adds value is performed at predetermined places within the production site.
- Therefore, the location "where" work should be performed can be narrowed down in a limited way. For example, a "warehouse" is a location where goods are stored. For this reason, the production activity performed in the "warehouse" is one of the several kinds of work that occur in association with the carrying in and carrying out of articles.
- As described above, the activity data of the activity recording device 11 has at least the five elements "worker", "position", "object", "mode", and "activity time"; specifically, the activity data is configured with these five elements as a set. Then, every time any one of "worker", "position", "object", or "mode" changes, the recording unit 13 records the activity data together with the activity time.
- In other words, data consisting of "worker", "position", "object", "mode", and "activity time" is defined as activity data, and the time at which the worker, the position, the object, and the mode were specified is associated as the activity time and recorded as the activity data.
- As a result, the activity history of each worker can be recorded as needed for productivity analysis, and the man-hours required for the productivity analysis can be reduced.
- Activity data necessary for productivity analysis, such as the video motion analysis method and the work sampling method of the IE method, can be obtained easily, in detail, accurately, and in large quantities.
- Since the time at which the elements were specified is added as the activity time and recorded in the activity data, the record is highly accurate.
- In this way, the work data can be recorded.
- Embodiment 2. FIG. 5 is a diagram showing the configuration of the activity recording device 11 according to Embodiment 2 of the present invention.
- FIG. 6 is a diagram showing the configuration of the relevance database of the activity recording device 11 shown in FIG. 5. (Hereinafter, "database" is abbreviated "DB".)
- Each specifying unit 15, 16, 17, 18 is connected to a relevance DB 19.
- Each specifying unit 15, 16, 17, 18 collates the received data with the relevance DB 19 and specifies each element of the activity data.
- The relevance DB 19 is a set of data consisting of four dimensions: worker, position, object, and mode. Each piece of data expresses one mode. That is, the relevance DB 19 is a mode DB, and a search in the DB is performed to narrow down the mode.
- A set of worker, position, and object, which are higher-level items than the mode, is specified first. For example, as shown in FIG. 6, when the worker is specified as AB12345, the position is facility A or facility B. Subsequently, when the position is specified as facility A, the object is model a or model b. When model b is selected, the set of corresponding modes {operation b1, operation b2, operation b3} is obtained. In this way, the relevance DB 19 can be realized by assigning modes to all possible combinations of workers, positions, and objects.
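The narrowing procedure described above, from worker to positions to objects to modes, can be illustrated with a nested mapping. The worker, facility, model, and operation names below follow the FIG. 6 example where the text states them; the entries under model a and facility B are invented placeholders:

```python
# Hypothetical relevance DB: worker -> position -> object -> list of modes.
RELEVANCE_DB = {
    "AB12345": {
        "facility A": {
            "model a": ["operation a1", "operation a2"],  # placeholder modes
            "model b": ["operation b1", "operation b2", "operation b3"],
        },
        "facility B": {
            "model c": ["operation c1"],                  # placeholder entry
        },
    },
}

def narrow_positions(db, worker):
    """Positions available once the worker is specified."""
    return list(db[worker])

def narrow_objects(db, worker, position):
    """Objects available once worker and position are specified."""
    return list(db[worker][position])

def narrow_modes(db, worker, position, obj):
    """Modes available once worker, position, and object are specified."""
    return list(db[worker][position][obj])

print(narrow_positions(RELEVANCE_DB, "AB12345"))
# -> ['facility A', 'facility B']
print(narrow_modes(RELEVANCE_DB, "AB12345", "facility A", "model b"))
# -> ['operation b1', 'operation b2', 'operation b3']
```

Each level of the mapping corresponds to one selection step, so every choice strictly reduces the options presented next.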
- A worker ID DB 76 is connected to the first specifying unit 15.
- the worker ID DB 76 stores worker IDs.
- FIG. 7 shows an example in which, in the present embodiment, the inputs to the first specifying unit 15, the second specifying unit 16, the third specifying unit 17, and the fourth specifying unit 18 are selected from lists displayed on the touch panel screen of the display 8. In Embodiment 2, therefore, the display 8 also serves as the input unit 7 shown in Embodiment 1.
- On the screen of the display 8 of the activity recording device 11, a worker list 53, a position list 54, an object list 55, and a mode list 56 are displayed.
- On the display screen, the lists are labeled with terms that are easy to understand in operation, such as IDs, lines, and lots.
- The worker first selects the corresponding worker, position, and object one by one. Every time one element is selected, the options are narrowed down using the relevance DB 19 shown in FIG. 6, so that only the necessary options are presented on the list.
- Then the mode list is read from the relevance DB 19 and displayed on the display 8.
- When a mode is selected, the activity data including the current time is stored in the storage, and the timer for the elapsed time is started.
- The worker then selects the next mode to be engaged in from the mode list. Activity data including, as the activity time, the current time displayed on the first display section 51 is stored in the storage. At the same time, the elapsed time on the second display section 52 is reset, and counting of the elapsed time of the next work is started.
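The behavior of the elapsed-time display (the second display section 52), which counts from the last mode selection and resets when the next mode is chosen, might be sketched as follows (class and method names are hypothetical):

```python
import time

class ElapsedTimeDisplay:
    """Mimics the second display section 52: counts time since the last mode change."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock        # injectable clock, standing in for timer 10
        self._start = clock()

    def elapsed(self):
        """Elapsed time of the current work, in seconds."""
        return self._clock() - self._start

    def reset(self):
        """Called when the next mode is selected; restarts the count."""
        self._start = self._clock()

d = ElapsedTimeDisplay()
# ... the worker performs the current work ...
d.reset()  # the next work starts; elapsed counting restarts from zero
```

A monotonic clock is used for the elapsed count so that wall-clock adjustments do not distort the per-work durations.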
- FIG. 8 shows the processing, corresponding to the first specifying unit 15, when selecting from the worker ID list.
- The worker specifying process is started (step ST71 in FIG. 8).
- The first specifying unit 15 displays, on the display 8, a worker ID list presented to the worker from the worker ID DB 76 (step ST72 in FIG. 8; worker list 53 in FIG. 7).
- The worker selects his/her own worker ID and inputs it to the first specifying unit 15 from the touch panel display 8, whereby the worker is specified (step ST73 in FIG. 8).
- The first specifying unit 15 narrows down the corresponding positions, objects, and modes in the relevance DB 19 according to the specified worker, extracts the corresponding relevance data, and creates the first data table 77 (step ST74 in FIG. 8). With the above, the worker specifying process ends (step ST75 in FIG. 8).
- According to this embodiment, worker input can be realized with only a general-purpose input interface such as a touch panel or keyboard.
- FIG. 9 shows processing when selecting from a list of positions as corresponding to the second specifying unit 16.
- the position specifying process is started (step ST81 in FIG. 9).
- the second specifying unit 16 displays a position list where the worker can be active on the display 8 from the first data table 77 created in the worker specifying process (step ST82 in FIG. 9, position list in FIG. 7). 54 is not hatched).
- the operator selects a position corresponding to himself / herself and inputs the position to the second specifying unit 16 from the display 8 of the touch panel, whereby the position is specified (step ST83 in FIG. 9).
- the second specifying unit 16 narrows down the corresponding object and state using the first data table 77 according to the specified position, and creates the second data table 85 (step ST84 in FIG. 9). Through the above, the position specifying process is completed (step ST86 in FIG. 9).
- According to this embodiment, position input can be realized with only a general-purpose input interface such as a touch panel or a keyboard.
- FIG. 10 shows a processing flow when selecting from a list of objects as corresponding to the third specifying unit 17.
- the object specifying process is started (step ST91 in FIG. 10).
- The third specifying unit 17 displays, on the display 8, a list of the objects on which the worker can act at the position, taken from the second data table 85 created through the worker specifying process and the position specifying process (step ST92 in FIG. 10; the non-hatched portion of the object list in FIG. 7).
- The worker selects the object corresponding to his or her own activity and inputs it to the third specifying unit 17 from the touch panel display 8, whereby the object is specified (step ST93 in FIG. 10).
- The third specifying unit 17 narrows down the corresponding modes using the second data table 85 according to the specified object, and creates the third data table 95 (step ST94 in FIG. 10). The object specifying process is thereby completed (step ST96 in FIG. 10). According to this embodiment, object input can be realized with only a general-purpose input interface such as a touch panel or a keyboard.
- FIG. 11 shows a processing flow in the case of selecting from the state list as corresponding to the fourth specifying unit 18.
- The fourth specifying unit 18 displays, on the display 8, the state list of the object at the position, taken from the third data table 95 created through the worker specifying process, the position specifying process, and the object specifying process (step ST102 in FIG. 11; state list 56 in FIG. 7).
- The worker selects the mode corresponding to his or her current activity and inputs it from the touch panel display 8 to the fourth specifying unit 18, whereby the mode is specified (step ST103 in FIG. 11).
- The recording unit 13 stores, in the storage, activity data obtained by adding the activity time acquired from the timer 10 to the specified state and the worker, position, and object specified so far (step ST104 in FIG. 11). The mode specifying process is thereby completed (step ST105 in FIG. 11). In the present embodiment, state input can be realized with only a general-purpose input interface such as a touch panel or a keyboard.
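The successive narrowing performed by the first through fourth specifying units can be sketched in a few lines. The table contents, field names, and IDs below are hypothetical placeholders (not values from this application); this is only a minimal sketch of the relevance-DB narrowing described above:

```python
# Hypothetical relevance DB 19: each row relates a worker, a position,
# an object, and a mode (state) in which the worker can be engaged.
RELEVANCE_DB = [
    {"worker": "W01", "position": "line-A", "object": "model-X", "mode": "assemble"},
    {"worker": "W01", "position": "line-A", "object": "model-X", "mode": "inspect"},
    {"worker": "W01", "position": "line-B", "object": "model-Y", "mode": "assemble"},
    {"worker": "W02", "position": "line-B", "object": "model-Y", "mode": "pack"},
]

def narrow(rows, key, value):
    """Keep only the rows whose `key` column equals `value`."""
    return [r for r in rows if r[key] == value]

# First specifying unit: worker specified -> first data table 77.
table77 = narrow(RELEVANCE_DB, "worker", "W01")
# Second specifying unit: position specified -> second data table 85.
table85 = narrow(table77, "position", "line-A")
# Third specifying unit: object specified -> third data table 95.
table95 = narrow(table85, "object", "model-X")

# Fourth specifying unit: only the modes remaining in table 95 are listed.
modes = sorted({r["mode"] for r in table95})
print(modes)  # -> ['assemble', 'inspect']
```

Each step only ever shows the worker choices that are still consistent with what has already been specified, which is why the lists on the display shrink at every stage.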
- The activity recording apparatus of the second embodiment configured as described above provides the same effects as the first embodiment and, in addition, by using the relevance DB, the appearance pattern of the activity data for production activities performed at the production site can be narrowed down to some extent once "who", "where", and "what" have been specified.
- Embodiment 3. Whereas the second embodiment specified the worker, the position, the object, and the state each time, Embodiment 3 describes the recording of activity data when the worker, the position, and the object remain constant and only the state changes.
- FIG. 12 is a flowchart showing the operation of the activity recording apparatus 11 according to the third embodiment of the present invention.
- The fourth specifying unit 18 reads the worker, the position, and the object from the third data table 95 created through the worker specifying process, the position specifying process, and the object specifying process (steps ST111 to ST113 in FIG. 12).
- the mode list is displayed in the same manner as in the second embodiment. The worker selects one mode from the list, and the mode is specified (step ST114 in FIG. 12).
- The recording unit 13 associates the activity time acquired from the timer 10 with the specified mode and the previously specified and read worker, position, and object (step ST115 in FIG. 12). The activity data is then stored in the storage (step ST116 in FIG. 12).
- In the above, the worker, the position, and the object are constant and only the state changes. The case where the worker and the position are constant and the object and the state change can be handled in the same manner as in the third embodiment, as shown in FIG. 13.
- The third specifying unit 17 reads the worker and the position from the second data table 85 created through the worker specifying process and the position specifying process (steps ST111 and ST112 in FIG. 13). Next, a list of objects is displayed as in the second embodiment. The worker selects one object from the list, whereby the object is specified (step ST117 in FIG. 13).
- the third specifying unit 17 creates a third data table 95 by narrowing down the corresponding mode using the second data table 85 according to the specified object.
- the fourth specifying unit 18 displays a list of modes. The worker selects one mode from the list, and the mode is specified (step ST114 in FIG. 13).
- The recording unit 13 associates the activity time acquired from the timer 10 with the specified mode and the previously specified and read worker, position, and object (step ST115 in FIG. 13). The activity data is then stored in the storage (step ST116 in FIG. 13).
- the case where the worker is constant and the position, the object, and the state are changed can be performed in the same manner as in the third embodiment as shown in FIG.
- The second specifying unit 16 reads the worker from the first data table 77 created through the worker specifying process (step ST111 in FIG. 14). Next, a list of positions is displayed as in the second embodiment. The worker selects one position from the list, whereby the position is specified (step ST118 in FIG. 14).
- The second specifying unit 16 creates the second data table 85 by narrowing down the corresponding objects and modes using the first data table 77 according to the specified position.
- the third specifying unit 17 displays a list of objects from the second data table 85 as in the second embodiment. The operator selects one object from the list, and the object is specified (step ST117 in FIG. 14).
- the third specifying unit 17 creates a third data table 95 by narrowing down the corresponding mode using the second data table 85 according to the specified object.
- the fourth specifying unit 18 displays a list of modes. The worker selects one mode from the list, and the mode is specified (step ST114 in FIG. 14).
- The recording unit 13 associates the activity time acquired from the timer 10 with the specified mode and the previously specified and read worker, position, and object (step ST115 in FIG. 14). The activity data is then stored in the storage (step ST116 in FIG. 14).
- According to the third embodiment configured as described above, the same effects as the above-described embodiments are obtained, and by reusing a relevance DB that has already been narrowed down once, activity data can be recorded accurately with fewer steps.
- Embodiment 4.
- The relevance DB 19 (see FIG. 6) of the second embodiment has the problem that the entire single DB must be modified whenever an element of any column changes.
- In the present embodiment, several internal variables are therefore prepared so that the tables are separated, which facilitates their management and construction.
- FIG. 15 shows the relevance DB 19 of this embodiment.
- The worker column holds groups, which differs from the worker IDs in FIG. 6. Likewise, the position column holds assembly lines and parts lines, which differs from the equipment in FIG. 6.
- The object column holds models and the mode column holds works, the same as in FIG. 6.
- FIG. 16 shows a procedure for recording activity data using the relevance DB 19.
- A group-line table showing the correspondence between groups and lines is held as an internal table. This is a list of the positions where the members of each group can be engaged, and it can easily be created from a production management system already existing in the factory.
- A line-model master showing the correspondence between lines and models is likewise held as an internal table. This is a list of the models that can be produced on each line, and it too can be created from an existing production management system in the factory.
- In addition, a lot table is held as an internal table. This shows which model each lot actually produced on each line each day corresponds to, and it can be created from, for example, the factory's short-term schedule plan.
- FIG. 17 shows the correspondence between internal variables and screen display.
- First, the assembly lines related to the worker specified by the ID are narrowed down using the group as a key. The narrowed-down assembly lines are displayed on the screen. When an assembly line is specified from among them, the lots related to the specified assembly line are narrowed down in the DB using the model as a key, and the narrowed-down lots are displayed on the screen.
- Note that the elements actually displayed on the worker's screen may differ from the reference columns used as DB keys. Variables that are easy to manage are used internally as DB keys, while variables that are easy for the worker to select and understand are used for the screen display.
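The chain of lookups across the separated internal tables (worker → group → lines → models → lots) can be sketched as follows. All table names, keys, and values here are hypothetical placeholders used only to illustrate the join chain of this embodiment:

```python
# Hypothetical internal tables of Embodiment 4: each relation is kept
# separately, so a single column can be maintained without rebuilding
# one large relevance DB.
WORKER_GROUP = {"W01": "group-1", "W02": "group-2"}
GROUP_LINE = {"group-1": ["assembly-1", "assembly-2"], "group-2": ["assembly-3"]}
LINE_MODEL = {"assembly-1": ["model-X"], "assembly-2": ["model-Y"], "assembly-3": ["model-Z"]}
MODEL_LOT = {"model-X": ["lot-001", "lot-002"], "model-Y": ["lot-010"], "model-Z": ["lot-020"]}

def candidate_lots(worker_id):
    """Chain the internal tables: worker -> group -> lines -> models -> lots."""
    group = WORKER_GROUP[worker_id]          # internal key, not shown on screen
    lines = GROUP_LINE[group]                # shown on screen for selection
    models = [m for line in lines for m in LINE_MODEL[line]]
    lots = [lot for m in models for lot in MODEL_LOT[m]]
    return lines, lots

lines, lots = candidate_lots("W01")
print(lines)  # -> ['assembly-1', 'assembly-2']
print(lots)   # -> ['lot-001', 'lot-002', 'lot-010']
```

Updating, say, the line-model relation now touches only `LINE_MODEL`, which is the maintenance advantage the embodiment describes.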
- According to the fourth embodiment configured as described above, the same effects as the above-described embodiments are obtained, and the worker, the position, the object, and the possible modes can be narrowed down by using the relevance DB.
- Embodiment 5.
- In the present embodiment, the completion of work and the interruption of work are described; both the completion of work and the interruption of work are treated as modes.
- FIGS. 18 and 19 are diagrams showing examples of the display 8 of the activity recording apparatus 11.
- an execution screen of software for recording activity data is displayed on the display 8.
- an end button 151 for the software and an environment setting button 152 for the operation of the software are displayed at the top.
- the position display unit 153 corresponds to the position information in the activity data. In the present embodiment, the case where the operator selects from a position list that can be engaged is shown.
- the worker display unit 154 corresponds to the worker in the activity data.
- the object display unit 155 corresponds to the object in the activity data. For an operator who works at a predetermined position, the object is “which production lot to produce”.
- The production lots that are the objects that can be worked on at the production line serving as the position are obtained in advance and stored in the storage of the activity recording device 11, and the worker can select from this list of objects.
- This list of objects corresponds to the position-object correspondence table in the first embodiment.
- the interruption button 156 is a button that is pressed when it is desired to interrupt the recording of the activity data.
- When the interruption button 156 is pressed, a plurality of interruption work options is displayed as shown in FIG. 19; this suspended work list is stored in the storage in advance.
- The suspended work list is a list whose elements are only the non-steady works among the modes of the above embodiments.
- When a suspended work is selected, the recording unit 13 takes the suspended work as the mode, adds the selected time as an activity time to the specified worker, position, and object, and records the result as activity data.
- The screen then returns to the screen of FIG. 18. Thus, the time when the interruption button 156 is first pressed is the start time of the interrupting work, and the time when the interrupting work is selected from the suspended work list is its end time.
- this is an example, and other selection methods may be used.
- The work completion button 158 is pressed each time the worker completes one of the steady works among the modes, that is, each time a work is completed.
- Taking advantage of the touch panel, the work contents and notes that the worker should attend to at that time can also be displayed.
- The recording unit 13 adds the time when the work is completed as an activity time to the specified worker, position, object, and state, and records them as activity data. Furthermore, the actual work time is calculated from the completion time in the activity data and the previously specified time, that is, the work start time. The preset standard work time and the actual work time are then compared and displayed on the graph display unit 157 of the display 8.
- the graph display unit 157 compares the standard work time preset for each work with the actual work time, and is displayed as a graph as information indicating the work efficiency of the worker.
- a bar graph is displayed for comparison.
- Since the worker mainly presses the work completion button 158, it is made large and arranged at the bottom of the screen, where mistakes are unlikely, so as not to hinder the worker's workability.
- The graph display unit 157 is arranged immediately above the work completion button 158, so that when the button is pressed the work time can be confirmed with only a slight movement of the line of sight; it can thus serve as a pacemaker.
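The calculation behind the graph display unit 157 is simple: the actual work time is the interval from the work's start time (the previous button press) to the press of the work completion button 158, compared against a preset standard time. A minimal sketch, with hypothetical work names and timestamps (seconds since the start of the shift):

```python
# Hypothetical standard work times, in seconds, set per work in advance.
STANDARD_TIME = {"work 1": 30.0, "work 2": 45.0}

def efficiency(work, start_time, end_time):
    """Actual time = interval between button presses; a ratio above 1.0
    means the worker was faster than the standard time for that work."""
    actual = end_time - start_time
    return actual, STANDARD_TIME[work] / actual

actual, eff = efficiency("work 1", start_time=100.0, end_time=125.0)
print(actual)          # -> 25.0
print(round(eff, 2))   # -> 1.2
```

On the real device the two values would be drawn as the paired bars of the bar graph rather than printed.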
- FIGS. 20 to 22 show process flowcharts of the activity recording apparatus of FIG. 18 according to the present embodiment.
- In this example, a worker is engaged in a mass-production process performing cycle work on an assigned line. The position is the production line on which the worker is engaged, and the object is the production lot. The mode is the smallest defined unit among the works of one cycle applied to each workpiece in the production lot; the modes are specified in advance as work 1 through work 4 and are described below as such. In addition, interrupting work that occurs by interrupting the cycle work is also treated as a mode.
- an application (hereinafter referred to as “app”) as an activity recording program for recording activity data is executed.
- The application first checks whether relevance data exists in the relevance DB 19 (step ST171 in FIG. 20). If it does not exist (NO), relevance data is acquired from the outside and stored in the relevance DB 19 in the storage of the activity recording apparatus 11 (step ST172 in FIG. 20). If the data exists (YES), the application reads in the contents of the relevance DB 19.
- Next, the worker selects each element of the activity data: first, the worker is selected (step ST173 in FIG. 20).
- the activity recording device 11 creates a list of production lines that the worker can engage with and presents the list to the worker on the screen of the display 8.
- the worker selects an engaged production line from the displayed list of production lines (step ST174 in FIG. 20).
- the activity recording apparatus 11 creates a list of production lots produced on the production line and presents it to the operator on the screen of the display 8.
- the activity recording device 11 creates a list of work to be performed on the work (hereinafter simply referred to as work) and interruption work according to the selected object, and stores the list in the internal memory.
- The preparation before the start of production is completed as described above, and the environment setting button is pressed (step ST176 in FIG. 21). Work then starts on the newly switched production lot (step ST177 in FIG. 21). Next, the work is performed (step ST178 in FIG. 21), and it is determined whether the work has been interrupted (step ST179 in FIG. 21).
- If the work is not interrupted, the work completion button is pressed upon completion (step ST180 in FIG. 21). The activity time is determined each time (step ST181 in FIG. 21) and saved as activity data in the storage of the activity recording device 11 (step ST182 in FIG. 21). It is then determined whether the work is the final work in the production lot (step ST185 in FIG. 22).
- The time when the work completion button is pressed becomes the start time of the next work. If there is a next work, it is started (step ST189 in FIG. 21), and the same processing as above is repeated; in this case there is no need to press a separate button to select the next work.
- If the work is to be interrupted partway through, the interruption button 156 is pressed, and the determination in step ST179 of whether the work is interrupted becomes YES.
- In that case, the suspended work list is displayed as a pop-up (step ST183 in FIG. 21). This time is determined as the activity time, that is, the start time of the interrupting work (step ST181 in FIG. 21), and activity data for the start of the interruption is recorded (step ST182 in FIG. 21).
- The worker performs the suspended work and, upon finishing it, selects it from the suspended work list (step ST184 in FIG. 21). With the suspended work as the mode, the activity time at which the suspended work ends is determined (step ST181 in FIG. 21), and activity data indicating the end of the interrupting work is recorded in the storage of the activity recording device (step ST182 in FIG. 21). At this time, the previously determined start time of the interrupting work is recorded together with it. Specifically, activity data associated with the interrupting work as shown in FIG. 23 is recorded.
- If the work is determined in step ST185 to be the final work, the worker selects whether to continue production (step ST186 in FIG. 21). If not continuing, the end button is pressed to end the production activity (step ST187 in FIG. 21), and the activity recording device is shut down (step ST188 in FIG. 21).
- If production is continued, it is determined in this order whether the object, the position, and the worker have changed (steps ST190 to ST192 in FIG. 21). If there is a change, the new element is selected from a list and the same processing as described above is performed again to confirm each element. If nothing has changed, work starts again from work 1 on a new workpiece of the same lot (step ST193 in FIG. 21). With the above flow, each element of the activity data can be recorded at each change point of the mode.
- During the work in each cycle, the actual time taken, derived from past activity data in the cycle, is displayed in a bar graph alongside the standard time.
- As a result, the worker's line of sight at each work change point is directed to the display, and the worker reliably checks the work instructions and work time data shown on the screen, which are necessary for learning the work.
- Since interrupting work can be specified as a mode and added to the activity data, precise activity data can be recorded.
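The interruption handling above amounts to recording two timestamped rows per interruption: one at the press of the interruption button 156 (start) and one at the selection from the suspended work list (end). A minimal sketch, with all names and timestamps hypothetical:

```python
def record_interruption(log, worker, position, obj, t_press, t_select, reason):
    """Append two activity-data rows for one interrupting work (cf. FIG. 23):
    the button press marks its start, the list selection marks its end."""
    log.append({"worker": worker, "position": position, "object": obj,
                "mode": reason + " (start)", "activity_time": t_press})
    log.append({"worker": worker, "position": position, "object": obj,
                "mode": reason + " (end)", "activity_time": t_select})
    return log

log = record_interruption([], "W01", "line-A", "lot-001",
                          t_press="09:12:05", t_select="09:14:30",
                          reason="await parts")
print(len(log))  # -> 2
```

The duration of the interruption then falls out as the difference of the two recorded activity times during later analysis.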
- Embodiment 6. FIG. 24 shows a processing flow when an IC card reading unit is used as the input unit of the first specifying unit 15 in the activity recording apparatus according to the sixth embodiment of the present invention.
- the worker identification process is started (step ST180 in FIG. 24).
- The worker touches the IC card reading unit with the IC card he or she owns.
- the IC card reading unit reads the ID information in the card, and the first specifying unit 15 acquires the worker ID (step ST181 in FIG. 24).
- the first identification unit 15 searches the worker ID DB 76 for the corresponding worker ID and identifies the worker (step ST182 in FIG. 24).
- The first specifying unit 15 narrows down the corresponding position, object, and state using the relevance DB 19, extracts the relevance data, and creates the first data table 77 (step ST183 in FIG. 24).
- the worker identification process is terminated (step ST184 in FIG. 24).
- Embodiment 7. FIG. 25 shows a processing flow when a face recognition camera is used as the input unit of the first specifying unit 15 in the activity recording apparatus according to the seventh embodiment of the present invention.
- the worker identification process is started (step ST190 in FIG. 25).
- The first specifying unit 15 sets the face recognition camera to wait for capturing the worker's face image (step ST191 in FIG. 25). In this state, the face recognition camera captures the worker's face.
- the first specifying unit 15 extracts a feature amount from the photographed face image (step ST192 in FIG. 25).
- the first specifying unit 15 specifies the worker by collating this feature amount with the face feature amount DB 193 previously created for each worker (step ST194 in FIG. 25).
- The first specifying unit 15 narrows down the corresponding position, object, and state using the relevance DB 19, extracts the relevance data, and creates the first data table 77 (step ST195 in FIG. 25).
- the worker identification process is terminated (step ST196 in FIG. 25).
- According to this embodiment, the same effects as those of the above-described embodiments can be obtained, and in addition the worker can be identified without relying on a separate physical device such as an IC card.
- Embodiment 8. FIG. 26 shows a processing flow when a fingerprint sensor is used as the input unit of the first specifying unit 15 in the activity recording apparatus according to the eighth embodiment of the present invention.
- the worker identification process is started (step ST200 in FIG. 26).
- the first specifying unit 15 causes the fingerprint sensor to wait for acquisition of the worker's fingerprint (step ST201 in FIG. 26).
- The worker touches the fingerprint sensor with a finger, and the fingerprint sensor acquires the fingerprint data (step ST202 in FIG. 26).
- the first specifying unit 15 extracts a feature amount from the acquired data of the fingerprint sensor (step ST203 in FIG. 26).
- the first specifying unit 15 specifies the worker by collating this feature amount with the fingerprint feature amount DB 204 created in advance from each worker's fingerprint (step ST205 in FIG. 26).
- The first specifying unit 15 narrows down the corresponding position, object, and state using the relevance DB 19, extracts the relevance data, and creates the first data table 77 (step ST206 in FIG. 26).
- the worker identification process is terminated (step ST207 in FIG. 26).
- According to this embodiment, the same effects as those of the above-described embodiments can be obtained, and in addition the worker can be identified without relying on a separate physical device such as an IC card.
- Embodiment 9. FIG. 27 shows a processing flow when a GPS (Global Positioning System) device is used as the input unit of the second specifying unit 16 in the activity recording device of the ninth embodiment of the present invention.
- the position specifying process is started (step ST210 in FIG. 27).
- the GPS device acquires latitude / longitude information (step ST211 in FIG. 27).
- The second specifying unit 16 acquires the candidate positions from the first data table 77 created in the worker specifying process. As shown in FIG. 28, this table holds data consisting of combinations of the positions where the worker can be active and their latitudes and longitudes.
- The second specifying unit 16 then compares the latitude/longitude information obtained from the GPS device with the first data table 77 and specifies the best-matching position within the manufacturing site (step ST212 in FIG. 27). The second specifying unit 16 narrows down the corresponding objects and modes using the first data table 77 according to the specified position and creates the second data table 85 (step ST213 in FIG. 27). The position specifying process is thereby completed (step ST214 in FIG. 27).
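The best-matching position can be found by a nearest-neighbour comparison between the GPS fix and the coordinates held in the first data table 77. The coordinates below are hypothetical, and the flat-earth (equirectangular) distance is an assumption that is adequate over the span of a single factory site:

```python
import math

# Hypothetical first data table 77 (cf. FIG. 28): candidate positions for
# the specified worker, each with its latitude/longitude in degrees.
TABLE_77 = [
    {"position": "line-A", "lat": 35.6580, "lon": 139.7016},
    {"position": "line-B", "lat": 35.6585, "lon": 139.7030},
]

def nearest_position(lat, lon, table):
    """Pick the candidate position closest to the GPS fix, using a
    flat-earth approximation (longitude scaled by cos(latitude))."""
    def dist2(row):
        dlat = row["lat"] - lat
        dlon = (row["lon"] - lon) * math.cos(math.radians(lat))
        return dlat * dlat + dlon * dlon
    return min(table, key=dist2)["position"]

print(nearest_position(35.6581, 139.7018, TABLE_77))  # -> line-A
```

Restricting the search to the worker's own candidate positions (rather than all positions in the factory) is what makes a coarse GPS fix sufficient here.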
- Embodiment 10. FIG. 29 shows a processing flow when a sensor that measures the radio field intensity of radio wave transmitters (for example, beacons) installed in advance in the factory is used as the input unit of the second specifying unit in the activity recording apparatus of the tenth embodiment of the present invention; the case of a radio field intensity sensor is described as an example.
- a beacon is installed in a place that can be a position in the factory.
- Each beacon is assigned a unique ID, and each beacon always transmits radio waves of equal strength.
- the position specifying process is started (step ST231 in FIG. 29).
- Using the radio field intensity sensor, the received radio field intensity value from each beacon is acquired together with its ID (step ST232 in FIG. 29).
- The radio field intensity measured by the sensor decreases as the distance from the beacon increases (see FIG. 30). Therefore, by comparing the received intensity values from the beacons, if exactly one beacon shows an intensity above a certain threshold, the position can be specified as being in the vicinity of that beacon (see FIG. 31).
- The second specifying unit 16 acquires the candidate positions from the first data table 77 created in the worker specifying process. Here the table holds data consisting of combinations of the positions where the worker can be active and the IDs of the beacons installed there.
- The second specifying unit 16 collates the beacon ID acquired by the radio field intensity sensor with the first data table 77 to specify the best-matching position within the manufacturing site (step ST234 in FIG. 29).
- the second specifying unit 16 narrows down the corresponding objects and modes using the first data table 77 according to the specified position, and creates the second data table 85 (step ST235 in FIG. 29).
- the position specifying process is completed (step ST236 in FIG. 29).
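The "exactly one beacon above a threshold" rule of FIGS. 30 and 31 can be sketched directly. The beacon IDs, RSSI values (in dBm), and the threshold below are hypothetical placeholders:

```python
THRESHOLD = -60.0  # hypothetical: only beacons stronger than this count as "nearby"

def locate(rssi_by_beacon, threshold=THRESHOLD):
    """Specify a position only when exactly one beacon exceeds the
    threshold; otherwise the position is ambiguous and None is returned."""
    strong = [b for b, rssi in rssi_by_beacon.items() if rssi > threshold]
    return strong[0] if len(strong) == 1 else None

print(locate({"beacon-1": -45.0, "beacon-2": -72.0, "beacon-3": -80.0}))  # -> beacon-1
print(locate({"beacon-1": -55.0, "beacon-2": -58.0}))  # -> None (two candidates)
```

In the ambiguous case a real device could wait for the next measurement or keep the previously specified position rather than guessing.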
- Embodiment 11. FIG. 33 shows a processing flow when a sensor connected to production equipment is used as the input unit of the third specifying unit 17 in the activity recording apparatus according to the eleventh embodiment of the present invention, in order to acquire information on the object from the production equipment.
- the object specifying process is started (step ST271 in FIG. 33).
- the sensor acquires the production lot information produced by the equipment as an object from the connected production equipment (step ST272 in FIG. 33).
- The third specifying unit 17 compares the object information acquired from the sensor with the second data table 85 and specifies the object that matches the one acquired from the production equipment (step ST273 in FIG. 33). The third specifying unit 17 narrows down the corresponding modes using the second data table 85 according to the specified object and creates the third data table 95 (step ST274 in FIG. 33). The object specifying process is thereby completed (step ST275 in FIG. 33).
- Embodiment 12. FIG. 34 shows a processing flow when motion capture is used as the input unit of the fourth specifying unit 18 in the activity recording apparatus of the twelfth embodiment of the present invention.
- Motion capture is a device that digitizes and records the locations and angles of major joints (shoulders, elbows, fingers, hips, knees, etc.) in human movements from images taken by a video camera or the like.
- the state specifying process is started (step ST281 in FIG. 34).
- the motion capture measures the worker's motion (step ST282 in FIG. 34).
- the fourth specifying unit 18 extracts a feature amount from the measured motion (step ST283 in FIG. 34).
- A motion feature DB 285 is created in advance by extracting, for each mode, the motion features that distinguish it from the other modes.
- The fourth specifying unit 18 searches the motion feature DB 285 for the entry whose features are closest to those of the measured motion, collates it against the third data table 95, and specifies it as the mode (step ST284 in FIG. 34).
- The recording unit 13 stores, in the storage, the activity data obtained by adding the activity time acquired from the timer 10 to the specified state and the worker, position, and object specified so far (step ST286 in FIG. 34).
- the mode identification process is terminated (step ST287 in FIG. 34).
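The closest-feature search in step ST284 is a nearest-neighbour match between the measured feature vector and the per-mode vectors in the motion feature DB 285. The mode names, vectors, and the squared-Euclidean metric below are hypothetical illustration, not the patent's actual feature definition:

```python
# Hypothetical motion feature DB 285: one representative feature vector
# per mode, built in advance so that the modes are mutually distinguishable.
MOTION_FEATURE_DB = {
    "tighten screw": [0.9, 0.1, 0.0],
    "carry part":    [0.1, 0.8, 0.4],
    "inspect":       [0.0, 0.2, 0.9],
}

def specify_mode(measured, db=MOTION_FEATURE_DB):
    """Return the mode whose stored feature vector is closest to the
    measured one (squared Euclidean distance)."""
    def d2(mode):
        return sum((a - b) ** 2 for a, b in zip(measured, db[mode]))
    return min(db, key=d2)

print(specify_mode([0.85, 0.15, 0.05]))  # -> tighten screw
```

A production system would likely use richer features (joint angles over time) and a proper classifier, but the narrowing against the third data table 95 would restrict the candidate modes in the same way.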
- Embodiment 13. FIG. 36 shows a processing flow when a pedometer using an acceleration sensor or the like is used as the input unit of the fourth specifying unit 18 in the activity recording apparatus of the thirteenth embodiment of the present invention.
- From the worker's accumulated step count and the elapsed time, it is possible to determine whether the worker is walking (the step count increases with the passage of time) or stopped (the step count does not increase even as time passes).
- For example, the modes of a worker who performs special activities such as supplying parts from a warehouse to each line can be broadly divided into two: transporting parts, and loading/unloading parts at the warehouse or line. Either one can be specified by detecting the walking state with the sensor.
- the mode identification process is started (step ST301 in FIG. 36).
- the acceleration sensor detects a walking state or a stopped state (step ST302 in FIG. 36).
- the fourth specifying unit 18 specifies the mode according to the detected walking state or stopped state (step ST303 in FIG. 36).
- the recording unit 13 adds the activity time acquired from the timer 10 to the identified mode and the worker, position, and object identified so far. Data is stored in the storage (step ST304 in FIG. 36). Through the above, the mode identification process is terminated (step ST305 in FIG. 36).
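The walking/stopped classification from the step counter can be sketched as follows. The sampling interval is assumed fixed, and the two mode labels correspond to the transport vs. loading/unloading example above; counts and labels are hypothetical:

```python
def specify_state(step_counts):
    """Classify consecutive step-count samples: if the count increased
    during an interval the worker was walking (transporting parts),
    otherwise stopped (loading/unloading). Assumes samples are taken at
    a fixed interval; a real device would need debouncing/thresholds."""
    states = []
    for prev, cur in zip(step_counts, step_counts[1:]):
        states.append("transport" if cur > prev else "load/unload")
    return states

print(specify_state([100, 104, 110, 110, 110, 115]))
# -> ['transport', 'transport', 'load/unload', 'load/unload', 'transport']
```

Each transition between the two states would be a mode change point at which the recording unit 13 stores a new activity-data row.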
- Embodiment 14. FIG. 37 shows a processing flow when an acceleration sensor and a geomagnetic sensor are used as the input unit of the second specifying unit 16 in the activity recording apparatus according to the fourteenth embodiment of the present invention.
- As described above, the worker's walking state can be detected using the acceleration sensor. Since the position changes with walking, the position must be updated accordingly.
- the acceleration sensor detects the walking state (step ST311 in FIG. 37).
- The second specifying unit 16 then starts position specification (step ST312 in FIG. 37).
- the geomagnetic sensor acquires the azimuth angle when the walking state is detected (step ST313 in FIG. 37).
- By combining the detected steps with the azimuth angle, the position can be updated (step ST314 in FIG. 37).
- the second specifying unit 16 specifies the position (step ST315 in FIG. 37).
- the second specifying unit 16 narrows down the corresponding object and state using the first data table 77 according to the specified position, and creates the second data table 85 (step ST316 in FIG. 37). Through the above, the position specifying process is completed (step ST317 in FIG. 37).
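Combining step detection with the azimuth angle amounts to simple dead reckoning: each detected step advances the estimated position by one step length in the measured direction. The step length and coordinates below are hypothetical assumptions for illustration:

```python
import math

def update_position(x, y, azimuth_deg, step_length_m=0.7):
    """Advance the estimated position by one detected step in the
    direction given by the geomagnetic sensor (azimuth measured
    clockwise from north); the 0.7 m step length is an assumed constant."""
    rad = math.radians(azimuth_deg)
    return x + step_length_m * math.sin(rad), y + step_length_m * math.cos(rad)

x, y = 0.0, 0.0
for az in [90.0, 90.0, 0.0]:      # two steps east, then one step north
    x, y = update_position(x, y, az)
print(round(x, 2), round(y, 2))   # -> 1.4 0.7
```

Dead reckoning drifts over time, so in practice the estimate would be snapped back to the nearest candidate position in the first data table 77, as the flow of FIG. 37 does at step ST315.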
- According to the present embodiment, the same effects as those of the above-described embodiments can be obtained, and in addition the position of a worker performing such special activities can be specified without requiring input by the worker.
- Embodiment 15. FIG. 38 is a diagram showing a usage example of the communication module in the activity recording apparatus according to the fifteenth embodiment of the present invention.
- the activity recording device 11 has a communication module, and can transmit and receive files by connecting to a network.
- each worker in a series of work processes from delivery, parts assembly, product assembly, packing, and shipping possesses the activity recording device 11.
- Each activity recording device 11 is connected to the same network.
- A terminal owned by the supervisor of the workplace is also connected to this network. Viewed from any one activity recording device 11, the other activity recording devices 11 and the terminal therefore correspond to other communication devices.
- the activity recording device 11 of each worker always transmits the recorded activity data to the supervisor's terminal through the network.
- The supervisor receives the activity data of each process and can distribute individual information to the activity recording apparatus 11 carried by each worker. For example, a worker whose process is sufficiently ahead of the preceding and following processes can be instructed to leave his or her station and support another process that is progressing slowly.
- according to the fifteenth embodiment configured as described above, by using the activity recording device provided with the communication module, the same effects as those of the above-described embodiments can be obtained, and beyond the individual worker,
- the supervisor can grasp work performance in real time and can contribute to productivity improvement by distributing information to each worker individually according to that performance.
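The exchange described above, in which each device sends its recorded activity data to the supervisor's terminal and receives an individual instruction back, might be sketched as follows. The JSON payload shape, the field names, and the supervisor's decision rule are illustrative assumptions, not part of the patent.

```python
import json

def pack_records(worker, records):
    """Serialize a worker's activity records for transmission."""
    return json.dumps({"worker": worker, "records": records})

def supervisor_dispatch(payload):
    """Very simplified supervisor logic: a worker whose actual times all
    beat the standard times is redirected to support a slower process."""
    data = json.loads(payload)
    ahead = all(r["actual_min"] < r["standard_min"] for r in data["records"])
    return {"to": data["worker"],
            "instruction": "support slow process" if ahead else "continue"}

msg = pack_records("W001", [{"actual_min": 8, "standard_min": 10}])
print(supervisor_dispatch(msg)["instruction"])  # support slow process
```

In a real deployment the payload would travel over the shared network rather than an in-memory call.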
Landscapes
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Engineering & Computer Science (AREA)
- Strategic Management (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Economics (AREA)
- Entrepreneurship & Innovation (AREA)
- Marketing (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Development Economics (AREA)
- Manufacturing & Machinery (AREA)
- Game Theory and Decision Science (AREA)
- Automation & Control Theory (AREA)
- Data Mining & Analysis (AREA)
- Human Computer Interaction (AREA)
- Primary Health Care (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- General Factory Administration (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Time Recorders, Dirve Recorders, Access Control (AREA)
Abstract
Description
In an activity recording device for recording a worker's activity as activity data, the device comprises:
a first specifying unit for specifying the worker;
a second specifying unit for specifying the position of the worker;
a third specifying unit for specifying the object of the worker;
a fourth specifying unit for specifying the state of the worker; and
a recording unit for recording, as the activity data, the worker, the position, the object, and the state specified by the first specifying unit, the second specifying unit, the third specifying unit, and the fourth specifying unit, in association with the specified time as the activity time.
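As one illustration, the activity data described above, namely the four specified elements associated with the time of specification as the activity time, could be modeled as a simple record type. The class and field names below are assumptions for illustration, not terms from the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class ActivityRecord:
    worker: str            # first specifying unit
    position: str          # second specifying unit
    obj: str               # third specifying unit
    state: str             # fourth specifying unit
    activity_time: datetime  # time at which the elements were specified

class Recorder:
    """Appends one record once all four elements have been specified."""
    def __init__(self):
        self.records = []

    def record(self, worker, position, obj, state):
        # Associate the time of specification with the four elements.
        self.records.append(
            ActivityRecord(worker, position, obj, state, datetime.now()))

rec = Recorder()
rec.record("worker-01", "line-A", "lot-123", "assembling")
print(len(rec.records))  # 1
```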
In an activity recording program for recording a worker's activity as activity data, the program causes a computer to perform:
a first specifying step of specifying the worker;
a second specifying step of specifying the position of the worker;
a third specifying step of specifying the object of the worker;
a fourth specifying step of specifying the state of the worker; and
a recording step of recording, as the activity data, the worker, the position, the object, and the state specified by the first specifying step, the second specifying step, the third specifying step, and the fourth specifying step, in association with the specified time as the activity time.
In an activity recording method for recording a worker's activity as activity data, the method comprises:
a first specifying process of specifying the worker;
a second specifying process of specifying the position of the worker;
a third specifying process of specifying the object of the worker;
a fourth specifying process of specifying the state of the worker; and
a recording process of recording, as the activity data, the worker, the position, the object, and the state specified by the first specifying process, the second specifying process, the third specifying process, and the fourth specifying process, in association with the specified time as the activity time.
The entirety of a worker's activity is recorded in a simple manner.
The present invention presents an activity recording device that can easily record a worker's activity data by predefining the occurrence patterns of worker activities and the types of information to be acquired, which are required for productivity analysis of workers in areas where workers are active, such as production sites.
The activity recording device 11 comprises a first specifying unit 15, a second specifying unit 16, a third specifying unit 17, a fourth specifying unit 18, a recording unit 13, an input unit 7, a timer 10, a communication module 9, a display 8, and a power supply 14.
FIG. 5 is a diagram showing the configuration of the activity recording device 11 according to Embodiment 2 of the present invention. FIG. 6 is a diagram showing the configuration of the relevance database of the activity recording device 11 shown in FIG. 5. In Embodiment 1 above, an example was shown in which each element is specified by its respective specifying unit; the present embodiment describes the case of using a relevance database (hereinafter, database is abbreviated as DB) 19 in which the elements "worker", "position", "object", and "state" are associated with one another in advance.
In this way, the relevance DB 19 can be realized by appending a state to every possible combination of worker, position, and object.
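A minimal sketch of such a relevance DB, holding one row per possible (worker, position, object, state) combination and answering which states are possible for a given combination. The rows and names below are illustrative assumptions.

```python
# One row per (worker, position, object) combination with a state appended.
RELEVANCE_DB = [
    # (worker, position, object, state)
    ("team-1", "line-A", "lot-123", "assembling"),
    ("team-1", "line-A", "lot-123", "inspecting"),
    ("team-1", "line-B", "lot-456", "packing"),
    ("team-2", "line-B", "lot-456", "packing"),
]

def states_for(worker, position, obj):
    """States reachable for a given worker/position/object combination."""
    return [s for w, p, o, s in RELEVANCE_DB
            if (w, p, o) == (worker, position, obj)]

print(states_for("team-1", "line-A", "lot-123"))  # ['assembling', 'inspecting']
```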
FIG. 8 shows the processing, corresponding to the first specifying unit 15, for the case of selecting from a worker ID list. The worker specifying process is started (step ST71 in FIG. 8). From the worker ID DB 76, the first specifying unit 15 displays on the display 8 a worker ID list to be presented to the worker (step ST72 in FIG. 8; worker list 53 in FIG. 7). In response, the worker selects his or her own worker ID and inputs it to the first specifying unit 15 via the touch-panel display 8, whereby the worker is specified (step ST73 in FIG. 8).
According to the present embodiment, worker input can be realized using only general-purpose input interfaces such as a touch panel or a keyboard.
According to the present embodiment, position input can be realized using only general-purpose input interfaces such as a touch panel or a keyboard.
The object specifying process is started (step ST91 in FIG. 10). From the second data table 85 created through the worker specifying process and the position specifying process, the third specifying unit 17 displays on the display 8 a list of the objects on which that worker can act at that position (step ST92 in FIG. 10; the non-hatched portion of the object list in FIG. 7). In response, the worker selects the applicable object and inputs it to the third specifying unit 17 via the touch-panel display 8, whereby the object is specified (step ST93 in FIG. 10).
According to the present embodiment, object input can be realized using only general-purpose input interfaces such as a touch panel or a keyboard.
The state specifying process is started (step ST101 in FIG. 11). From the third data table 95 created through the worker specifying process, the position specifying process, and the object specifying process, the fourth specifying unit 18 displays the list of states of that object for that worker at that position (step ST102 in FIG. 11; state list 56 in FIG. 7). In response, the worker selects the applicable state and inputs it to the fourth specifying unit 18 via the touch-panel display 8, whereby the state is specified (step ST103 in FIG. 11).
In the present embodiment, state input can be realized using only general-purpose input interfaces such as a touch panel or a keyboard.
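The successive narrowing through the data tables described above can be sketched as repeated filtering of a combination table, so that each list presented to the worker shrinks at every step. The table contents and names below are illustrative assumptions.

```python
# Rows remaining after the worker has been specified (illustrative).
FIRST_TABLE = [
    # (position, object, state)
    ("line-A", "lot-123", "assembling"),
    ("line-A", "lot-123", "inspecting"),
    ("line-B", "lot-456", "packing"),
]

def narrow(table, index, value):
    """Keep only rows whose element at `index` equals the chosen value."""
    return [row for row in table if row[index] == value]

second_table = narrow(FIRST_TABLE, 0, "line-A")   # after the position is chosen
third_table = narrow(second_table, 1, "lot-123")  # after the object is chosen
print(sorted({s for _, _, s in third_table}))     # ['assembling', 'inspecting']
```

The worker's final state selection is then made from the states surviving in the last table.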
Embodiment 3 describes the recording of activity data in the case where, after the worker, position, object, and state have once been specified as in Embodiment 2 above, the worker, position, and object remain unchanged and only the state changes.
Next, the third specifying unit 17 displays the list of objects from the second data table 85 in the same manner as in Embodiment 2 above. The worker selects one object from the list, and the object is specified (step ST117 in FIG. 14).
The relevance DB 19 in Embodiment 2 above (see FIG. 6) has the problem that, whenever an element in any column is changed, the entire single DB must be modified. It also has the problem that the burden of building the DB is large, because all information must be registered in a single table. The present embodiment therefore separates the table by introducing several internal variables, making management and construction easier.
First, at production sites in general, workers are not handled as individuals but are managed by the organization to which they belong, such as a team. Therefore, in the relevance DB 19, the item corresponding to the worker is set to the team, and a worker-team master defining which team each worker ID belongs to is held as an internal table. With this table, when a worker selects his or her own ID, the corresponding team can be registered as the worker. This table can be created from an existing personnel management system at the factory.
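The worker-team master described above can be sketched as a simple lookup that resolves a selected worker ID to the team registered as the "worker" element; the IDs and team names below are illustrative assumptions.

```python
# Worker-team master: worker ID -> team (illustrative entries, in practice
# derived from an existing personnel management system).
WORKER_TEAM_MASTER = {
    "W001": "team-1",
    "W002": "team-1",
    "W101": "team-2",
}

def register_worker(worker_id):
    """Resolve the worker ID chosen on screen to the team that is
    registered as the worker element of the activity data."""
    return WORKER_TEAM_MASTER[worker_id]

print(register_worker("W002"))  # team-1
```

Because the relevance DB is keyed by team rather than by individual, adding or moving a worker touches only this master table.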
The activity recording device of Embodiment 5 of the present invention describes the completion of work and the case where work is interrupted. Work completion and work interruption are each set as one of the states.
FIGS. 18 and 19 are diagrams showing display examples on the display 8 of the activity recording device 11. In FIG. 18, an execution screen of the software that records activity data is displayed on the display 8. At the top of the screen, an exit button 151 for the software and an environment settings button 152 for configuring the operation of the software are displayed.
On the work completion button 158, taking advantage of the characteristics of the touch panel, the work contents and precautions that the worker should carry out at that point in time can be displayed.
First, a worker is selected (step ST173 in FIG. 20). According to the selected worker, the activity recording device 11 creates a list of the production lines on which that worker can be engaged, and presents it to the worker on the screen of the display 8.
This point in time is then fixed as the activity time, that is, the start time of the work interruption (step ST181 in FIG. 21). Activity data of the work interruption is recorded at this point (step ST182 in FIG. 21).
If the work is not to be continued, the worker presses the exit button to end the production activity (step ST187 in FIG. 21) and shuts down the activity recording device (step ST188 in FIG. 21).
Through the above flow, each element of the activity data can be recorded at every change point of the state.
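The change-point recording described above, where a record is written only when at least one of the four elements differs from the last recorded combination, can be sketched as follows; the record shape and names are illustrative assumptions.

```python
from datetime import datetime

records = []
last = None  # last recorded (worker, position, object, state) combination

def record_if_changed(worker, position, obj, state):
    """Append a timestamped record only at a change point."""
    global last
    current = (worker, position, obj, state)
    if current != last:  # at least one element changed
        records.append(current + (datetime.now(),))
        last = current

record_if_changed("W001", "line-A", "lot-1", "assembling")
record_if_changed("W001", "line-A", "lot-1", "assembling")  # no change: skipped
record_if_changed("W001", "line-A", "lot-1", "inspecting")  # state changed
print(len(records))  # 2
```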
FIG. 24 shows the processing flow in the activity recording device of Embodiment 6 of the present invention when an IC card reading unit is used as the input unit of the first specifying unit 15.
The worker specifying process is started (step ST180 in FIG. 24). The worker touches his or her own IC card to the IC card reading unit. The IC card reading unit then reads the ID information in the card, and the first specifying unit 15 acquires the worker ID (step ST181 in FIG. 24). The first specifying unit 15 searches the worker ID DB 76 for the corresponding worker ID and specifies the worker (step ST182 in FIG. 24).
FIG. 25 shows the processing flow in the activity recording device of Embodiment 7 of the present invention when a face recognition camera is used as the input unit of the first specifying unit 15.
The worker specifying process is started (step ST190 in FIG. 25). The first specifying unit 15 puts the face recognition camera on standby to capture a face image of the worker (step ST191 in FIG. 25). In this state, the face recognition camera photographs the worker's face. The first specifying unit 15 extracts a feature amount from the captured face image (step ST192 in FIG. 25). Next, the first specifying unit 15 specifies the worker by matching this feature amount against a face feature DB 193 created in advance for each worker (step ST194 in FIG. 25).
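The matching of steps ST192 to ST194 can be sketched as a nearest-neighbor comparison between the extracted feature amount and the per-worker face feature DB. The feature vectors, the Euclidean distance metric, and the acceptance threshold below are illustrative assumptions, not the patent's method.

```python
import math

# Enrolled feature vector per worker ID (illustrative stand-ins).
FACE_FEATURE_DB = {
    "W001": (0.10, 0.80, 0.30),
    "W002": (0.90, 0.20, 0.50),
}

def identify(feature, threshold=0.5):
    """Return the worker whose enrolled features are nearest to the
    captured ones, or None if no enrollment is close enough."""
    best_id, best_dist = None, float("inf")
    for wid, enrolled in FACE_FEATURE_DB.items():
        d = math.dist(feature, enrolled)
        if d < best_dist:
            best_id, best_dist = wid, d
    return best_id if best_dist <= threshold else None

print(identify((0.12, 0.78, 0.28)))  # W001
```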
FIG. 26 shows the processing flow in the activity recording device of Embodiment 8 of the present invention when a fingerprint sensor is used as the input unit of the first specifying unit 15.
The worker specifying process is started (step ST200 in FIG. 26). The first specifying unit 15 puts the fingerprint sensor on standby to acquire the worker's fingerprint (step ST201 in FIG. 26).
FIG. 27 shows the processing flow in the activity recording device of Embodiment 9 of the present invention when a GPS (Global Positioning System) device is used as the input unit of the second specifying unit 16.
The position specifying process is started (step ST210 in FIG. 27). The GPS device acquires latitude and longitude information (step ST211 in FIG. 27).
FIG. 29 shows the processing flow in the activity recording device of Embodiment 10 of the present invention when a sensor (a radio wave intensity sensor, in the example shown) that measures the radio wave intensity of radio transmitters installed in advance in the factory (beacons, in the example shown) is used as the input unit of the second specifying unit.
The position specifying process is started (step ST231 in FIG. 29). Next, using the radio wave intensity sensor, the radio wave intensity value received from each beacon is acquired together with its ID (step ST232 in FIG. 29).
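One common way to turn the per-beacon intensity values of step ST232 into a position, sketched here as an assumption rather than the patent's exact method, is to map each beacon ID to its installed location and take the location of the strongest (nearest) beacon.

```python
# Installed location of each beacon (illustrative mapping).
BEACON_LOCATIONS = {"beacon-1": "line-A", "beacon-2": "line-B"}

def estimate_position(rssi_by_id):
    """rssi_by_id: beacon ID -> received signal strength in dBm.
    Values closer to 0 mean a stronger, i.e. nearer, signal."""
    nearest = max(rssi_by_id, key=rssi_by_id.get)
    return BEACON_LOCATIONS[nearest]

print(estimate_position({"beacon-1": -48, "beacon-2": -77}))  # line-A
```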
FIG. 33 shows the processing flow in the activity recording device of Embodiment 11 of the present invention when the input unit of the third specifying unit 17 is a sensor that is connected to production equipment and acquires information on the object obtained from the production equipment.
The object specifying process is started (step ST271 in FIG. 33). From the connected production equipment, the sensor acquires, as the object, information on the production lot being produced by the equipment (step ST272 in FIG. 33).
According to the specified object, the third specifying unit 17 narrows down the corresponding states using the second data table 85 and creates the third data table 95 (step ST274 in FIG. 33). Through the above, the object specifying process is completed (step ST275 in FIG. 33).
FIG. 34 shows the processing flow in the activity recording device of Embodiment 12 of the present invention when motion capture is used as the input unit of the fourth specifying unit 18. Motion capture is a device that digitizes and records the locations and angles of the major joints of human motion (shoulders, elbows, fingers, hips, knees, and so on) from video captured by a video camera or the like. Examples include an acceleration sensor, a gyro sensor, or a camera built into a tablet terminal carried by the worker.
For the states that the workers included in the third data table 95 can take, a motion feature DB 285 is created in advance by extracting the features of motion that distinguish each state from the others.
The recording unit 13 stores in storage the activity data consisting of the specified state, the worker, position, and object specified so far, and the activity time acquired from the timer 10 (step ST286 in FIG. 34). Through the above, the state specifying process is completed (step ST287 in FIG. 34).
FIG. 36 shows the processing flow in the activity recording device of Embodiment 13 of the present invention when a step counting device using an acceleration sensor (a so-called pedometer, or 万歩計 (registered trademark)) or the like is used as the input unit of the fourth specifying unit 18.
FIG. 37 shows the processing flow in the activity recording device of Embodiment 14 of the present invention when an acceleration sensor and a geomagnetic sensor are used as the input unit of the second specifying unit 16.
As shown in Embodiment 13 above, the walking state of the worker can be detected by using the acceleration sensor. Since the position changes as the worker walks, the position must be updated accordingly.
FIG. 38 is a diagram showing a usage example of the communication module in the activity recording device of Embodiment 15 of the present invention. The activity recording device 11 has a communication module and can transmit and receive files by connecting to a network.
In FIG. 38, each worker in a series of work processes covering delivery, parts assembly, product assembly, packing, and shipping carries an activity recording device 11. Each activity recording device 11 is connected to the same network. A terminal carried by the supervisor of the workplace is also connected to this network. Thus, viewed from any one activity recording device 11, the other activity recording devices 11 and the terminal correspond to other communication devices.
Claims (19)
- An activity recording device for recording a worker's activity as activity data, the device comprising:
a first specifying unit for specifying the worker;
a second specifying unit for specifying a position of the worker;
a third specifying unit for specifying an object of the worker;
a fourth specifying unit for specifying a state of the worker; and
a recording unit for recording, as the activity data, the worker, the position, the object, and the state specified by the first specifying unit, the second specifying unit, the third specifying unit, and the fourth specifying unit, in association with the specified time as an activity time. - The activity recording device according to claim 1, wherein the recording unit records the activity data, appending the specified time as the activity time, each time at least one of the specified elements of the worker, the position, the object, and the state changes.
- The activity recording device according to claim 1 or claim 2, comprising a relevance database storing relevance data of the worker, the position, the object, and the state, wherein
any one of the first specifying unit, the second specifying unit, and the third specifying unit extracts, from the relevance database, the relevance data corresponding to the specified worker, position, and object, and
any one of the first specifying unit, the second specifying unit, the third specifying unit, and the fourth specifying unit specifies the worker, the position, the object, and the state from the extracted relevance data. - The activity recording device according to claim 1, wherein the first specifying unit specifies the worker based on information from at least one input unit among a touch panel, an IC card reading unit, a face recognition camera, and a fingerprint recognition sensor.
- The activity recording device according to claim 1, wherein the second specifying unit specifies the position based on information from at least one input unit among a touch panel, a GPS device, a radio wave intensity sensor, and a combination of an acceleration sensor and a geomagnetic sensor.
- The activity recording device according to claim 1, wherein the third specifying unit specifies the object based on information from at least one input unit among a touch panel and a sensor for the object.
- The activity recording device according to claim 1, wherein the fourth specifying unit specifies the state based on information from at least one input unit among a touch panel, motion capture, and an acceleration sensor.
- The activity recording device according to claim 1, comprising a display having a touch panel.
- The activity recording device according to claim 8, wherein a work completion button is displayed on the display, and
when the work completion button is pressed, the recording unit records, as the activity data, the specified worker, position, object, and state in association with the time of work completion appended as the activity time, calculates an actual work time from the activity data, and displays on the display a comparison between a preset standard work time and the actual work time. - The activity recording device according to claim 8 or claim 9, wherein an interruption button is displayed on the display, and when the interruption button is pressed, a plurality of interruption tasks are displayed as options,
the fourth specifying unit specifies a selected one of the interruption tasks as the state, and
the recording unit records, as the activity data, the specified worker, position, and object in association with the selected interruption task as the state and the specified time appended as the activity time. - The activity recording device according to any one of claims 1 to 10, comprising a communication module that transmits the activity data to another communication device and receives information transmitted from the other communication device.
- An activity recording program for recording a worker's activity as activity data, the program causing a computer to perform:
a first specifying step of specifying the worker;
a second specifying step of specifying a position of the worker;
a third specifying step of specifying an object of the worker;
a fourth specifying step of specifying a state of the worker; and
a recording step of recording, as the activity data, the worker, the position, the object, and the state specified by the first specifying step, the second specifying step, the third specifying step, and the fourth specifying step, in association with the specified time as an activity time. - The activity recording program according to claim 12, wherein the recording step records the activity data, appending the specified time as the activity time, each time at least one of the specified elements of the worker, the position, the object, and the state changes.
- The activity recording program according to claim 12 or claim 13, wherein the computer is provided with a relevance database storing relevance data of the worker, the position, the object, and the state,
any one of the first specifying step, the second specifying step, and the third specifying step extracts, from the relevance database, the relevance data corresponding to the specified worker, position, and object, and
any one of the first specifying step, the second specifying step, the third specifying step, and the fourth specifying step specifies the worker, the position, the object, and the state from the extracted relevance data. - The activity recording program according to claim 12, wherein the computer receives input of information from a display having a touch panel.
- The activity recording program according to claim 15, wherein a work completion button is displayed on the display, and
when the work completion button is pressed, the recording step records, as the activity data, the specified worker, position, object, and state in association with the time of work completion appended as the activity time, calculates an actual work time from the activity data, and displays on the display a comparison between a preset standard work time and the actual work time. - The activity recording program according to claim 15 or claim 16, wherein an interruption button is displayed on the display, and when the interruption button is pressed, a plurality of interruption tasks are displayed as options,
the fourth specifying step specifies a selected one of the interruption tasks as the state, and
the recording step records, as the activity data, the specified worker, position, and object in association with the selected interruption task as the state and the specified time appended as the activity time. - An activity recording method for recording a worker's activity as activity data, the method comprising:
a first specifying process of specifying the worker;
a second specifying process of specifying a position of the worker;
a third specifying process of specifying an object of the worker;
a fourth specifying process of specifying a state of the worker; and
a recording process of recording, as the activity data, the worker, the position, the object, and the state specified by the first specifying process, the second specifying process, the third specifying process, and the fourth specifying process, in association with the specified time as the activity time. - The activity recording method according to claim 18, wherein the recording process records the activity data, appending the specified time as the activity time, each time at least one of the specified elements of the worker, the position, the object, and the state changes.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112016003284.6T DE112016003284T5 (de) | 2015-07-22 | 2016-03-17 | Aktivitäts-aufzeichner, aktivitäts-aufzeichnungsprogramm und aktivitäts-aufzeichnungsverfahren |
JP2017529472A JP6359193B2 (ja) | 2015-07-22 | 2016-03-17 | 活動記録装置、活動記録プログラム、および、活動記録方法 |
CN201680018777.8A CN107430397A (zh) | 2015-07-22 | 2016-03-17 | 活动记录装置、活动记录程序以及活动记录方法 |
US15/563,451 US20180122157A1 (en) | 2015-07-22 | 2016-03-17 | Activity recorder, activity recording program, and activity recording method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-144724 | 2015-07-22 | ||
JP2015144724 | 2015-07-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017013899A1 true WO2017013899A1 (ja) | 2017-01-26 |
Family
ID=57833855
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/058433 WO2017013899A1 (ja) | 2015-07-22 | 2016-03-17 | 活動記録装置、活動記録プログラム、および、活動記録方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20180122157A1 (ja) |
JP (1) | JP6359193B2 (ja) |
CN (1) | CN107430397A (ja) |
DE (1) | DE112016003284T5 (ja) |
WO (1) | WO2017013899A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018147389A (ja) * | 2017-03-08 | 2018-09-20 | 株式会社日立製作所 | 製造管理方法、及び製造管理システム |
JP2018163556A (ja) * | 2017-03-27 | 2018-10-18 | 三菱重工業株式会社 | 作業管理装置、作業管理方法およびプログラム |
WO2019017061A1 (ja) * | 2017-07-18 | 2019-01-24 | 株式会社日立製作所 | 進捗・稼動監視システムおよび方法 |
JP6489562B1 (ja) * | 2017-11-30 | 2019-03-27 | 味の素物流株式会社 | 物流倉庫内作業把握システム |
JP2019053368A (ja) * | 2017-09-13 | 2019-04-04 | 株式会社日立製作所 | 作業判別システム、学習装置、及び学習方法 |
WO2019130479A1 (ja) * | 2017-12-27 | 2019-07-04 | 株式会社シナプスイノベーション | 作業実績管理システム及び方法 |
EP3522084A1 (en) * | 2018-02-05 | 2019-08-07 | Yokogawa Electric Corporation | Operation evaluation device, operation evalutation method, and non-transistory computer readable storage medium |
WO2020044797A1 (ja) * | 2018-08-27 | 2020-03-05 | 三菱電機株式会社 | 情報処理装置、情報処理システム、及び情報処理方法 |
KR20200117357A (ko) * | 2019-04-04 | 2020-10-14 | (사)사단법인한국농식품아이씨티융복합산업협회 | 스마트팜 작업량 측정장치 |
WO2021166402A1 (ja) * | 2020-02-21 | 2021-08-26 | オムロン株式会社 | 行動解析装置及び行動解析方法 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018045395A (ja) * | 2016-09-13 | 2018-03-22 | 株式会社ジェイテクト | 教育支援装置 |
CN108174135B (zh) * | 2018-01-05 | 2021-03-12 | 张家昊 | 一种视频证据的拍摄方法及装置 |
JP2021033336A (ja) * | 2019-08-13 | 2021-03-01 | 株式会社ディスコ | 作業時間集計システム |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005250726A (ja) * | 2004-03-03 | 2005-09-15 | Mori Seiki Co Ltd | 作業進捗管理装置及び作業進捗管理システム |
JP2006178583A (ja) * | 2004-12-21 | 2006-07-06 | Yokogawa Electric Corp | 作業者行動記録システム |
JP2008287665A (ja) * | 2007-05-21 | 2008-11-27 | Nissan Motor Co Ltd | 製品履歴管理システムおよび製品履歴管理方法 |
JP2009294732A (ja) * | 2008-06-03 | 2009-12-17 | Hitachi Ltd | 作業要素時間出力装置 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4735172B2 (ja) * | 2005-10-06 | 2011-07-27 | オムロン株式会社 | 生産管理装置、生産管理方法、生産管理プログラム、生産管理プログラムを記録した記録媒体、および生産システム |
JP6057607B2 (ja) * | 2012-05-23 | 2017-01-11 | 三菱重工業株式会社 | 生産管理システム及び方法 |
-
2016
- 2016-03-17 JP JP2017529472A patent/JP6359193B2/ja active Active
- 2016-03-17 CN CN201680018777.8A patent/CN107430397A/zh active Pending
- 2016-03-17 US US15/563,451 patent/US20180122157A1/en not_active Abandoned
- 2016-03-17 DE DE112016003284.6T patent/DE112016003284T5/de active Pending
- 2016-03-17 WO PCT/JP2016/058433 patent/WO2017013899A1/ja active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005250726A (ja) * | 2004-03-03 | 2005-09-15 | Mori Seiki Co Ltd | 作業進捗管理装置及び作業進捗管理システム |
JP2006178583A (ja) * | 2004-12-21 | 2006-07-06 | Yokogawa Electric Corp | 作業者行動記録システム |
JP2008287665A (ja) * | 2007-05-21 | 2008-11-27 | Nissan Motor Co Ltd | 製品履歴管理システムおよび製品履歴管理方法 |
JP2009294732A (ja) * | 2008-06-03 | 2009-12-17 | Hitachi Ltd | 作業要素時間出力装置 |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10691112B2 (en) | 2017-03-08 | 2020-06-23 | Hitachi, Ltd. | Manufacturing management method and manufacturing management system |
CN108573292A (zh) * | 2017-03-08 | 2018-09-25 | 株式会社日立制作所 | 制造管理方法以及制造管理系统 |
JP2018147389A (ja) * | 2017-03-08 | 2018-09-20 | 株式会社日立製作所 | 製造管理方法、及び製造管理システム |
JP2018163556A (ja) * | 2017-03-27 | 2018-10-18 | 三菱重工業株式会社 | 作業管理装置、作業管理方法およびプログラム |
WO2019017061A1 (ja) * | 2017-07-18 | 2019-01-24 | 株式会社日立製作所 | 進捗・稼動監視システムおよび方法 |
JP2019053368A (ja) * | 2017-09-13 | 2019-04-04 | 株式会社日立製作所 | 作業判別システム、学習装置、及び学習方法 |
JP6489562B1 (ja) * | 2017-11-30 | 2019-03-27 | 味の素物流株式会社 | 物流倉庫内作業把握システム |
JP2019101693A (ja) * | 2017-11-30 | 2019-06-24 | 味の素物流株式会社 | 物流倉庫内作業把握システム |
WO2019130479A1 (ja) * | 2017-12-27 | 2019-07-04 | 株式会社シナプスイノベーション | 作業実績管理システム及び方法 |
EP3522084A1 (en) * | 2018-02-05 | 2019-08-07 | Yokogawa Electric Corporation | Operation evaluation device, operation evalutation method, and non-transistory computer readable storage medium |
US11500350B2 (en) | 2018-02-05 | 2022-11-15 | Yokogawa Electric Corporation | Operation evaluation device, operation evaluation method, and non-transitory computer readable storage medium |
WO2020044797A1 (ja) * | 2018-08-27 | 2020-03-05 | 三菱電機株式会社 | 情報処理装置、情報処理システム、及び情報処理方法 |
JPWO2020044797A1 (ja) * | 2018-08-27 | 2021-08-10 | 三菱電機株式会社 | 情報処理装置、情報処理システム、及び情報処理方法 |
KR20200117357A (ko) * | 2019-04-04 | 2020-10-14 | (사)사단법인한국농식품아이씨티융복합산업협회 | 스마트팜 작업량 측정장치 |
KR102320685B1 (ko) * | 2019-04-04 | 2021-11-01 | (사)사단법인한국농식품아이씨티융복합산업협회 | 스마트팜 작업량 측정방법 |
WO2021166402A1 (ja) * | 2020-02-21 | 2021-08-26 | オムロン株式会社 | 行動解析装置及び行動解析方法 |
Also Published As
Publication number | Publication date |
---|---|
US20180122157A1 (en) | 2018-05-03 |
JPWO2017013899A1 (ja) | 2017-09-14 |
JP6359193B2 (ja) | 2018-07-18 |
CN107430397A (zh) | 2017-12-01 |
DE112016003284T5 (de) | 2018-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6359193B2 (ja) | 活動記録装置、活動記録プログラム、および、活動記録方法 | |
US10679307B2 (en) | Systems and methods for optimizing project efficiency | |
JP5416322B2 (ja) | 作業管理システム、作業管理端末、プログラム及び作業管理方法 | |
CN106020138B (zh) | 针对工业数据的分层图呈现 | |
US20150066550A1 (en) | Flow line data analysis device, system, non-transitory computer readable medium and method | |
CN108369419A (zh) | 使用移动机器人的对象观测来生成时空对象清单并且使用该清单来确定用于移动机器人的监测参数 | |
US20160260046A1 (en) | Tracking worker activity | |
US20080027578A1 (en) | Parts production management system and parts production management method | |
US20150066551A1 (en) | Flow line data analysis device, system, program and method | |
WO2010047150A1 (ja) | 作業情報処理装置、プログラムおよび作業情報処理方法 | |
JP2015043138A (ja) | 資材管理方法および資材管理システム | |
EP2546815B1 (en) | System and method of alarm installation and configuration | |
Kuandee et al. | Asset Supply Chain Management System-based IoT Technology for Higher Education Institutions. | |
US20160026655A1 (en) | Space Equipment Recognition and Control Using Handheld Devices | |
Xie et al. | Implementation of BIM/RFID in computer-aided design-manufacturing-installation process | |
CN108229780B (zh) | 一种基于平板电脑的车间管理系统和方法 | |
JPWO2019039126A1 (ja) | 活動記録装置、活動記録プログラム、および、活動記録方法 | |
WO2017203598A1 (ja) | 作業支援システム、割振りシステム及び割振り方法 | |
JP2013120579A (ja) | ソフトウェア開発に関する作業実績管理方法、作業実績管理プログラムおよび管理サーバ | |
JP2001159916A (ja) | 設備管理支援システム | |
JP2016177591A (ja) | 人員管理システム、情報解析装置、人員管理方法及び人員管理プログラム | |
JP2013037478A (ja) | 作業工数算出装置、作業工数算出方法、およびプログラム | |
JP7090820B1 (ja) | ライン管理支援装置、ライン管理支援方法及びプログラム | |
JP6860058B1 (ja) | 管理システム | |
US20220383233A1 (en) | Work support system, work support method, and work support program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16827464 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017529472 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15563451 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112016003284 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16827464 Country of ref document: EP Kind code of ref document: A1 |