WO2021191941A1 - Plant operation support device and plant operation support method - Google Patents
Plant operation support device and plant operation support method
- Publication number
- WO2021191941A1 (PCT/JP2020/012602)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- evaluation
- teamwork
- support
- index value
- user
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0639—Performance analysis of employees; Performance analysis of enterprise or organisation operations
- G06Q10/06393—Score-carding, benchmarking or key performance indicator [KPI] analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y04—INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
- Y04S—SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
- Y04S10/00—Systems supporting electrical power generation, transmission or distribution
- Y04S10/50—Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications
Definitions
- This application relates to a plant operation support device and a plant operation support method.
- GUI: graphical user interface
- Plant operation management is performed by an operation team consisting of multiple people who monitor and operate the central control panel in the monitoring and control room.
- An evaluation method has been proposed in which teamwork during operation training is evaluated and appropriate advice is presented when a problem occurs (see, for example, Patent Documents 1 and 2).
- Japanese Unexamined Patent Publication No. 2003-271048 (paragraphs 0027 to 0031, FIG. 2)
- Japanese Unexamined Patent Publication No. 2019-36205 (paragraphs 0022 to 0023, 0049 to 0051, FIG. 1)
- The purpose of this application is to disclose technology for solving the above-mentioned problems: to evaluate teamwork and support appropriate plant operation without preparing fixed correct-answer data.
- The plant operation support device disclosed in the present application comprises: output devices, provided for each of the plurality of members constituting the team, that present information for performing plant operation as a team; an index value calculation unit that calculates, for a plurality of evaluation items used to evaluate any of the state of each of the plurality of members, the state of communication between the members, and the workload state of each member, an index value quantifying the degree of each evaluation item; a teamwork evaluation unit that evaluates the teamwork of the team based on the index values; a support content determination unit that selects support target persons from among the members of the team based on the teamwork evaluation result and determines the support content for each; and a presentation information generation unit that generates the presentation information. The teamwork evaluation unit is characterized in that it evaluates the teamwork using an evaluation logic constructed as a combination of logical expressions that compare the index values with thresholds set for each of the plurality of evaluation items.
- The plant operation support method disclosed in the present application includes:
- an index value calculation step that calculates, for a plurality of evaluation items used to evaluate any of the state of each of a plurality of members who operate the plant as a team, the state of communication between the members, and the workload state of each member, an index value quantified for each evaluation item;
- a teamwork evaluation step that evaluates the teamwork of the team based on the calculated index values;
- a support content determination step that selects support target persons from among the members of the team based on the evaluation result and determines the support content according to each selected person; and
- a support content presentation step that presents the determined support content to each selected support target person. The teamwork evaluation step is characterized in that the teamwork is evaluated using an evaluation logic constructed as a combination of logical expressions that compare the calculated index values with thresholds set for each of the plurality of evaluation items.
- Since teamwork is evaluated using an evaluation logic composed of a combination of logical expressions, it is possible to evaluate teamwork and support appropriate plant operation without preparing fixed correct-answer data.
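As a concrete illustration of this claim, the evaluation logic can be sketched as a combination of logical expressions, each comparing one index value against the threshold set for its evaluation item. The evaluation-item names, threshold values, and the particular combination below are illustrative assumptions, not taken from the disclosure.

```python
# Minimal sketch: teamwork is judged not against fixed correct-answer data
# but by combining per-item threshold comparisons into one evaluation logic.
# Item names and thresholds here are invented for illustration.

def evaluate_teamwork(index_values, thresholds):
    """Return True if the combined logic judges teamwork unsound."""
    # One logical expression per evaluation item (index >= threshold).
    exceeded = {item: index_values[item] >= thresholds[item]
                for item in thresholds}
    # The evaluation logic is a combination of such expressions, e.g.
    # "supervisor stress is high AND operators speak very little".
    return exceeded["supervisor_stress"] and exceeded["operator_silence"]

index_values = {"supervisor_stress": 0.8, "operator_silence": 0.7}
thresholds = {"supervisor_stress": 0.6, "operator_silence": 0.5}
print(evaluate_teamwork(index_values, thresholds))  # True
```

Because the logic is an explicit combination of comparisons rather than a reference dataset, thresholds and expressions can later be edited or updated, which is what the evaluation logic update processing described below relies on.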
- FIG. 1 is a block diagram illustrating the configuration of the plant operation support device according to Embodiment 1.
- FIG. 2 is a diagram showing the hardware configuration of the plant operation support device according to Embodiment 1.
- FIG. 4 is a diagram showing the data structure stored in the procedure database in the plant operation support device according to Embodiment 1.
- FIG. 5 is a diagram visualizing a part of the index values calculated for a certain user in the plant operation support device according to Embodiment 1.
- FIG. 6 is a diagram showing the data structure of the workload index prediction values calculated by the index prediction value calculation unit in the plant operation support device according to Embodiment 1.
- FIG. 7 is a flowchart showing the evaluation logic, stored in the evaluation logic database, for determining whether or not the authority gradient is high in the plant operation support device according to Embodiment 1.
- FIG. 8 is a diagram showing the data structure of the teamwork evaluation data produced by the teamwork evaluation unit in the plant operation support device according to Embodiment 1.
- FIG. 9 is a diagram showing the data structure of the support content determination table held by the support content determination unit in the plant operation support device according to Embodiment 1.
- FIG. 10 is a diagram showing a screen display example of the operation support information when team performance is evaluated to have deteriorated in the plant operation support device according to Embodiment 1.
- FIG. 11 is a flowchart illustrating the operation of updating the evaluation logic in the plant operation support device according to Embodiment 1.
- FIG. 12 is a diagram showing a screen display example when the evaluation logic is edited by screen operation in the plant operation support device according to Embodiment 1.
- FIG. 13 is a diagram illustrating the data used to determine the start of a change when the evaluation logic is automatically changed in the plant operation support device according to Embodiment 1.
- Embodiment 1. FIGS. 1 to 13 illustrate the plant operation support device and the plant operation support method according to the first embodiment.
- FIG. 1 is a block diagram illustrating the configuration of the plant operation support device.
- FIG. 2 is a diagram showing its hardware configuration.
- FIG. 3 is a flowchart for explaining the operation of the plant operation support device, that is, the plant operation support method.
- FIG. 4 is a diagram showing a data structure stored in the procedure database constituting the plant operation support device.
- FIG. 5 is a time-series visualization of a part of the index values calculated for a certain user.
- FIG. 6 is a diagram showing the data structure of the workload index prediction values calculated for each user by the index prediction value calculation unit.
- FIG. 7 is a flowchart showing the structure of the evaluation logic, stored in the evaluation logic database, for determining whether or not the authority gradient is high.
- FIG. 8 is a diagram showing the data structure of the teamwork evaluation data evaluated by the teamwork evaluation unit.
- FIG. 9 is a diagram showing a data structure of a support content determination table possessed by the support content determination unit.
- FIG. 10 is a diagram showing a screen display example of the operation support information when team performance is evaluated to have deteriorated from the viewpoint of the supervisor's authority gradient.
- FIG. 11 is a flowchart illustrating the operation of updating the evaluation logic.
- FIG. 12 is a diagram showing a screen display example when the evaluation logic is edited by screen operation.
- FIG. 13 is a diagram illustrating the data used to determine the start of a change when the evaluation logic is automatically changed.
- First, the operation of the plant by the operation team, which is the premise of this embodiment, will be described.
- The operation of a plant is performed by an operation team consisting of a plurality of people who monitor and operate the central control panel in the monitoring and control room.
- The team consists of multiple operators who perform monitoring and operation, and a supervisor who oversees the operators' work instructions and actions. Each operator and the supervisor is provided with their own input/output devices for monitoring and control, and uses the input/output devices assigned to them to monitor and control the plant.
- the supervisor first instructs the operator on the work contents.
- the operator carries out the work using the given input / output device.
- When the work is completed, the operator reports the completion to the supervisor, and the supervisor uses his or her own input/output device to confirm the work performed by the operator.
- As for the operation mode of the plant, in addition to the case where all the members of the operation team are in the monitoring and control room and communicate orally, it is also assumed that the operators in the monitoring and control room communicate remotely with a supervisor at a remote location.
- Each operator executes the tasks assigned to him or her, following the procedure, according to the instructions from the supervisor.
- In a situation where the workload on the operator is high, such as when task execution is required within a limited time, human errors such as reduced work efficiency, operation errors, and situational awareness errors may occur.
- The plant operation support device 10 of the present application includes a plant status determination processing unit 20 that determines the status of a plant (not shown), a user information collection processing unit 30 that collects and processes user information, a teamwork evaluation processing unit 40 that evaluates the teamwork state from information on the plant and user situations based on the evaluation logic L4, and a support information presentation processing unit 50 that presents support information based on the evaluation result (teamwork evaluation data E4). It further includes an evaluation logic update processing unit 60, a characteristic configuration of the present application, that updates the evaluation logic L4 used for evaluating the teamwork state.
- The plant status determination processing unit 20 has a plant information collection unit 21 that collects plant information, a plant information database 22 that stores the collected data, a procedure database 23 that stores operation procedure data, and an implementation procedure determination unit 24 that identifies the current work step.
- Hereinafter, "database" is abbreviated as "DB".
- The user information collection processing unit 30 has a user information collection unit 31 that collects data on the supervisor and operators (collectively referred to as users) forming the operation team, and a user information database 32 that stores the collected user information.
- The teamwork evaluation processing unit 40 includes an index value calculation unit 41 that calculates the index values (index data D41) required for teamwork evaluation, an index prediction value calculation unit 42 that calculates time-series index prediction values P42 for each user, and a teamwork evaluation unit 43 that evaluates teamwork. It also has an evaluation logic database 44 that stores the evaluation logic L4 used for the teamwork evaluation.
- The support information presentation processing unit 50 has a support content determination unit 51 that determines the support content based on the teamwork evaluation result (teamwork evaluation data E4), a support result database 52 that accumulates the determined support content, and a presentation information generation unit 53 that generates presentation data from the support content. It also has a plurality of output devices 54, assigned to each of the operators and the supervisor constituting the team, that present the generated presentation information to the users who need it.
- The evaluation logic update processing unit 60 has an evaluation logic update unit 61 that updates the evaluation logic L4 based on the support content data stored in the support result database 52 and the data stored in the evaluation logic database 44. It also has an input device 62 that receives input of the data required from the user when updating the evaluation logic L4.
- Note that although the plant operation support device 10 is described as a combination of a plurality of functions, this does not mean that each function, such as the implementation procedure determination unit 24 or the index value calculation unit 41, is configured as independent hardware.
- The device may consist of a processor 11, a memory 12, a hard disk 13, an input device 14, an output device 15, and a system bus 16 connecting them, with each function realized by installed software.
- The memory 12 and the hard disk 13 function as storage devices; although not shown, a volatile storage device such as a random access memory and a non-volatile auxiliary storage device such as a flash memory are provided. A hard disk may be provided as the auxiliary storage device instead of the flash memory.
- The processor 11 executes programs input from the storage device. In this case, a program is input from the auxiliary storage device to the processor 11 via the volatile storage device. The processor 11 may output data such as calculation results to the volatile storage device, or may store the data in the auxiliary storage device via the volatile storage device.
- The operation in which the plant operation support device 10 evaluates teamwork based on the plant status and the user status (user information) and, when support is required, presents the support information on the output devices 54 will now be described.
- First, the plant status determination step ST101, in which the plant status determination processing unit 20 determines the procedure and work step being performed by the user based on the plant information and the operation procedure information and transmits the result to the teamwork evaluation processing unit 40, will be described.
- The plant information collection unit 21 periodically collects the plant data D22 from the plant information database 22 and transmits it, together with the acquisition time, to the implementation procedure determination unit 24.
- The plant information database 22 stores, for example, plant state information including alarms of various plant equipment such as pumps and valves, plant state information including parameter values, and operation log information, which is the record of operations by the operators and supervisor. The operation log information consists of work contents (including work identification information D233, described later), user identification information D321 indicating a user ID, and time data.
- the data in the plant information database 22 may be real-time data generated by the plant or data recorded in the past plant operation.
- the procedure database 23 stores information on the response procedure for each event and the work steps constituting the procedure.
- The procedure information D23 defines one or more procedures for each event, and each procedure is composed of one or more work steps.
- Event identification information D231 indicating an event ID is assigned to an event
- procedure identification information D232 indicating a procedure ID is assigned to a procedure
- work identification information D233 indicating a work ID is assigned to each work step.
- standard work time information D235 indicating the standard work time
- workload value information D236 indicating the work load value
- work start condition information D237 indicating the work start condition
- work end condition information D238 indicating the work end condition
- the standard work time is the time normally required to execute each work step (work time including waiting time due to plant behavior).
- the workload value is a load (work load) applied to human work (confirmation of an object, operation, etc.) in performing a work step.
- the workload value includes, for example, the time required for human action.
- A workload value can be calculated, using a method based on a known human information processing model, by adding up the human perception time, cognitive (brain) processing time, and physical movement time when performing the work step.
- The work start condition and the work end condition are the conditions for starting or ending the work step. Examples of work start conditions include an operation log entry (judged that the operation has started), plant equipment parameter values that serve as operation start conditions, and alarm issuance; work end conditions include, for example, plant equipment parameter value conditions and alarm cessation.
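The work-step record described above (fields D233 to D238) might be represented as follows. The field labels come from the description, and the AI-3 values echo the FIG. 4 example cited later (a 2 minute 30 second standard work time and a 75,000 ms workload value); the condition strings are invented placeholders.

```python
from dataclasses import dataclass

# Illustrative sketch of one procedure-database work-step record.
@dataclass
class WorkStep:
    work_id: str              # work identification information D233
    standard_work_time_s: int # standard work time information D235 (seconds)
    workload_value_ms: int    # workload value information D236 (milliseconds)
    start_condition: str      # work start condition information D237
    end_condition: str        # work end condition information D238

step = WorkStep(
    work_id="AI-3",
    standard_work_time_s=150,   # 2 min 30 s, per the FIG. 4 example
    workload_value_ms=75_000,
    start_condition="alarm issued",          # invented placeholder
    end_condition="parameter within limits", # invented placeholder
)
print(step.work_id, step.standard_work_time_s)
```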
- The implementation procedure determination unit 24 receives the plant status information and the operation log information from the plant information collection unit 21, and identifies the event that has occurred, the response procedure associated with the event, and the current work step. Inside the implementation procedure determination unit 24 there is a knowledge base that stores the data necessary for determining an event, such as the relationship between the cause of a fault and how the event propagates. The event is identified using the acquired plant state information, and the corresponding event identification information D231 is obtained by referring to the procedure database 23.
- the procedure ID and the work ID being executed are specified with reference to the work start condition information D237 and the work end condition information D238.
- the work start time and the work end time are recorded as the work start time information D2391 and the work end time information D2392, respectively.
- The identified procedure ID, work ID, and information on the user performing the work (procedure identification information D232, work identification information D233, user identification information D321) are transmitted to the teamwork evaluation processing unit 40 together with the work start time information D2391 and work end time information D2392. This completes step ST101, and the process proceeds to step ST102.
- Next, the user information collection step ST102, in which the user information collection processing unit 30 collects the users' time-series information and transmits it to the teamwork evaluation processing unit 40, will be described.
- the user information database 32 stores time-series sensor data (for example, biometric data) of each user acquired by a contact-type or non-contact-type sensor.
- the user information collecting unit 31 periodically collects biometric data stored in the user information database 32.
- the user information database 32 stores user identification information D321 that identifies each user and time-series data (sensor data D322) of various sensors in a form that is associated with each other.
- the supervisor is User-A
- the operators are User-B and User-C.
- the sensor data D322 is a combination of the acquired time data and the sensor value, and examples thereof include heart rate, voice, respiration, electrocardiographic waveform, blood pressure, and body temperature.
- the user information collecting unit 31 transmits the user identification information D321 and various sensor data D322 to the index value calculation unit 41 of the teamwork evaluation processing unit 40 at a predetermined cycle.
- step ST102 is completed and the process proceeds to step ST103.
- the plant status determination step ST101 and the user information collection step ST102 may be processed in parallel.
- The teamwork evaluation processing unit 40 calculates the index values and index prediction values required for teamwork evaluation based on the input data from the plant status determination processing unit 20 and the user information collection processing unit 30 (step ST103), and uses them to comprehensively evaluate the teamwork status (step ST104). When the team state is evaluated as not sound ("No" in step ST105), the evaluation result is transmitted to the support information presentation processing unit 50.
- the index value calculation unit 41 calculates the index value required for teamwork evaluation.
- The indicators include those calculated for each individual user and teamwork indicators between multiple users, such as between operators and the supervisor, or between operators.
- Examples of individual indicators include those related to speech, tone, emotions, and workload.
- For speech, the volume of the voice, the utterance speed, and the wording (the number of times specific words are spoken, the word content, etc.) are used as indicators; for tone, the number of times the user spoke in a strong commanding tone or in an intimidating tone are used as indicators.
- For emotions, the onset time and duration of states such as normality, excitement, joy, stress, irritation, depression, fatigue, tension, calmness, and relaxation are used as indicators.
- For workload, the increase or decrease of the actual work time relative to the standard work time, the number of work steps performed in parallel, and the like are used as indicators.
- Indicators among multiple users include those related to communication, such as the degree to which conversations match a set pattern (an instruction from the supervisor to the operator, the operator repeating back the instruction, the operator reporting the result, and so on) and the interval (time) between conversations.
- Inside the index value calculation unit 41 there is analysis logic for calculating each index value.
- The individual indices are calculated for each user in the operation team, and the indices among multiple users are calculated by combining the input data of multiple users.
- The speech-related indices, which are individual indices, are calculated by analyzing time-series voice data. For example, the loudness of the voice is calculated from the sound pressure level of the voice data, while the utterance speed, wording, and tone are calculated from the result of speech recognition and text conversion. The increase or decrease of the sound pressure level or the utterance speed relative to a per-user standard value (for example, an index value measured in advance under normal conditions) is calculated.
- Emotional indicators can be calculated based on voice data and biological data such as heart rate and respiration.
- For the workload index, the work information corresponding to the work ID obtained from the implementation procedure determination unit 24 is referenced in the procedure database 23. The time from when the work start condition is satisfied and the work begins to when the work end condition is satisfied is measured as the actual work time, and the difference from the standard work time is calculated. For example, when the obtained work ID is AI-2, if work end time information D2392 is present in the output data of the implementation procedure determination unit 24, the step is considered complete. The difference between the work end time (for example, 12:23:50) and the work start time information D2391 (for example, 12:23:41) is calculated (0 hours 0 minutes 9 seconds). Subtracting the standard work time shown in the standard work time information D235 then gives the difference (-0 hours 0 minutes 1 second).
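The actual-work-time calculation in this AI-2 example can be reproduced directly. The date is an arbitrary assumption, since only clock times are given, and the 10 second standard work time is implied by the stated -1 second result.

```python
from datetime import datetime

# Sketch of the workload-index arithmetic using the AI-2 example values.
start = datetime.fromisoformat("2020-01-01T12:23:41")  # D2391 (date assumed)
end   = datetime.fromisoformat("2020-01-01T12:23:50")  # D2392
actual_s = (end - start).total_seconds()  # actual work time: 9 s
standard_s = 10                           # standard work time D235 (implied)
diff_s = actual_s - standard_s            # increase over standard: -1 s
print(actual_s, diff_s)  # 9.0 -1.0
```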
- The number of work steps being performed in parallel is determined from the user IDs, procedure IDs, and work IDs received from the implementation procedure determination unit 24.
- The degree of conversation-pattern matching, an index among multiple users, is calculated by converting voice data to text through speech recognition, extracting words and phrases indicating instructions, repeat-backs, result reports, and the like, and computing the degree of agreement with the expected conversation pattern. For example, an instruction is extracted as a phrase of the form "Please XX XX", a repeat-back as "XX XX", and a result report as "XX XX".
- the above-mentioned analysis method is an example, and the implementation method is not limited to this.
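One possible sketch of the conversation-pattern matching, with the caveat above that the implementation is not limited to any particular method: since the actual phrases are elided in the text ("XX XX"), the marker words below are invented English stand-ins.

```python
import re

# Sketch: classify utterances and score how well the dialog matches the
# expected instruction -> repeat-back -> result-report pattern.
# The marker regexes are invented stand-ins for the elided phrases.
PATTERN = ["instruction", "repeat", "report"]  # expected order
MARKERS = {
    "instruction": re.compile(r"\bplease\b", re.IGNORECASE),
    "repeat":      re.compile(r"\bunderstood\b", re.IGNORECASE),
    "report":      re.compile(r"\bcompleted\b", re.IGNORECASE),
}

def classify(utterance):
    for role, rx in MARKERS.items():
        if rx.search(utterance):
            return role
    return None

def pattern_match_degree(utterances):
    """Fraction of the expected pattern observed in order."""
    observed = [classify(u) for u in utterances]
    matched = sum(1 for exp, got in zip(PATTERN, observed) if exp == got)
    return matched / len(PATTERN)

dialog = ["Please open valve V-1",
          "Understood, opening valve V-1",
          "Valve V-1 open, completed"]
print(pattern_match_degree(dialog))  # 1.0
```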
- the index value calculation unit 41 stores the index value of each index, the data used for the index calculation, the user ID, and the processing time data in the database held in the index value calculation unit 41 as time series data.
- As the data used for index calculation, voice data, biological data, conversation text data, and so on are accumulated. As the user ID, a single user ID is stored for an individual index, while the multiple related user IDs are stored for an index among multiple users.
- the processing time data for example, the time when the index value is calculated is accumulated.
- the actual work time obtained at the time of calculating the index value of the workload described above is stored as the index data D41 in the database held in the index value calculation unit 41 together with the procedure ID, the work ID, the standard work time, and the user ID.
- the current time data (work record recording time) and data indicating whether or not the work step is completed (work completed / work in progress) (work step execution status information) are also accumulated as index data D41.
- the process of calculating the index value in step ST103 is completed.
- Next, as the index prediction value calculation process in step ST103, the index prediction value calculation unit 42 calculates time-series index prediction values for each user by predicting the future behavior of the index values, and transmits the result to the teamwork evaluation unit 43.
- the index prediction value calculation unit 42 includes a function of simulating future plant operation and a function of predicting the index value, and based on these functions, predicts the workload value of the work step to be implemented in the future for each user.
- First, the procedure ID, work ID, work record time, work step execution status information, and standard work time at the latest work record recording time for a specific user are acquired.
- By referring to the procedure information D23 stored in the procedure database 23, the predicted start and end times of each work step occurring after the current work ID of the procedure ID are calculated.
- the time that satisfies the work start condition is calculated as the start time by performing plant simulation, and the end time is calculated by adding the standard work time to it.
- Another example is a method in which the start time is obtained in the same way, while the end time is set to the time at which the work end condition is satisfied, determined by calculation and plant simulation.
- For the work step currently in progress, its end time is predicted, and the plant simulation is performed taking this into account.
- The predicted end time is calculated, for example, as the actual work record time plus the difference between the actual work time and the standard work time required to complete the work step.
- The processing up to this point is performed up to a predetermined time ahead (for example, 30 minutes), until the end time of the procedure step is exceeded or until all the work steps in the event have been processed. If the user is performing other work steps in parallel, the same processing is performed for those as well.
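The first prediction method described above (start time from plant simulation, end time as start plus standard work time) can be sketched as follows. The plant simulation is replaced here by a precomputed list of predicted start times, and the step IDs and durations are hypothetical.

```python
# Sketch: predict the time window of each upcoming work step as
# (start, start + standard work time). In the device, the start times
# would come from plant simulation; here they are given directly.
def predict_step_times(steps, predicted_starts):
    """steps: list of (work_id, standard_work_time_s).
    predicted_starts: simulated start times in seconds from now.
    Returns a list of (work_id, start_s, end_s) windows."""
    windows = []
    for (work_id, std_s), start_s in zip(steps, predicted_starts):
        windows.append((work_id, start_s, start_s + std_s))
    return windows

steps = [("AI-4", 60), ("AI-5", 120)]  # hypothetical upcoming steps
starts = [0, 90]                       # simulated start times (seconds)
print(predict_step_times(steps, starts))  # [('AI-4', 0, 60), ('AI-5', 90, 210)]
```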
- In FIG. 5, a part of the index values is visualized in chronological order for a certain user.
- the user ID is User-B
- the horizontal axis represents time
- the length of the rectangle indicating the work W421 to W423 indicates the standard work time
- the left end and the right end indicate the start time and the end time, respectively.
- the overlapping parts along the vertical axis indicate the work to be performed in parallel.
- As the workload index value, the ratio of the workload value (the actual time related to human action) to the standard work time is used.
- For example, the work W421 is the work whose work ID is AI-3; its workload value is 75,000 milliseconds against a standard work time of 2 minutes 30 seconds (150 seconds) (see FIG. 4), so its index value is 0.5.
- The index value in time zones where no work occurs is 0, and in time zones where multiple works are performed in parallel, the index value is the sum of the index values of the respective works W421 to W423. In this way, the time-series data of the index value is calculated at predetermined intervals (for example, in units of 1 second).
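The time-series workload index of FIG. 5 can be sketched as below. The AI-3 figures (75,000 ms over a 150 second standard work time, giving 0.5) are from the text; the second, parallel work is invented to show the summation.

```python
# Sketch: each work contributes workload_value / standard_work_time over
# its window, contributions of parallel works are summed, and the index
# is 0 where no work occurs. Sampled at 1 s intervals, as in the text.
def workload_series(works, horizon_s, step_s=1):
    """works: list of (start_s, end_s, workload_ms, standard_s)."""
    series = []
    for t in range(0, horizon_s, step_s):
        v = sum(((wl_ms / 1000.0) / std_s
                 for start, end, wl_ms, std_s in works
                 if start <= t < end), 0.0)
        series.append(v)
    return series

works = [(0, 150, 75_000, 150),   # AI-3: index 0.5 while running
         (100, 160, 30_000, 60)]  # invented parallel work: index 0.5
s = workload_series(works, 200)
print(s[50], s[120], s[180])  # 0.5 1.0 0.0
```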
- step ST103 is completed and the process proceeds to step ST104.
- Next, the teamwork evaluation unit 43 evaluates the teamwork within the team based on the time-series data of the various index values accumulated in the index value calculation unit 41 and the index prediction values P42 output from the index prediction value calculation unit 42 (step ST104). If, as a result of the evaluation, the teamwork is not sound ("No" in step ST105), the evaluation result is transmitted to the support content determination unit 51 of the support information presentation processing unit 50, and the process proceeds to the support steps (steps ST106 to ST107). On the other hand, when the teamwork is sound ("Yes" in step ST105), the process ends without moving to the support steps.
- The teamwork evaluation unit 43 executes the evaluation process based on the evaluation logic L4 stored in the evaluation logic database 44. The evaluation logic L4 contains one or more quantitative criteria for determining whether a state causing poor teamwork exists (for example, the authority gradient between the supervisor and operators being abnormally high or low, or the performance of a specific user in the team deteriorating).
- Here, the evaluation logic L4 for determining whether the teamwork defect "authority gradient: high" applies will be described.
- A high authority gradient means that the supervisor's authority is abnormally high relative to the operators, so the operators become inhibited and find it difficult to communicate appropriately with the supervisor.
- In this example, User-A denotes the supervisor, and User-B and User-C denote the operators. Evaluation steps ST4401 and ST4402 in the evaluation logic L4 are steps for the supervisor, and evaluation steps ST4403 to ST4407 are steps for the operators.
- Finally, it is determined either that the team is sound, that is, "authority gradient: high" is "not applicable" (step ST4410), or that the team is unsound, that is, "authority gradient: high" is "applicable" (step ST4408 or ST4409).
- When "applicable", a level is also assigned so that the degree can be distinguished in two stages. The level is evaluated in evaluation step ST4407 from the viewpoint of how urgently countermeasures against this teamwork defect are needed in light of the upcoming plant operation situation, and is divided into two levels (level 2 being the more urgent).
- In evaluation steps ST4401 to ST4407, the various index values (index data D41) accumulated in the index value calculation unit 41 and the index prediction value P42 calculated by the index prediction value calculation unit 42 are referenced, and each evaluation step determines whether its evaluation criterion is met. For example, in the emotion-index evaluations of steps ST4401 and ST4403, it is determined from the most recent processing-time data whether an emotion that causes a teamwork problem has continued for 30 seconds or more. In step ST4401, which evaluates the supervisor (User-A), irritability and stress are treated as problem emotions; if a problem emotion continues for 30 seconds or more, the process advances to the next evaluation item, and otherwise a screening evaluation determines "not applicable".
- For the supervisor (User-A), evaluation step ST4402 determines whether the tone-related index value includes at least one instance of "spoke in a strong commanding tone". If there is at least one strong commanding tone, it is judged that a high authority gradient may exist; otherwise "not applicable" is judged. That is, a secondary screening evaluation of whether the authority gradient may be high is performed. By screening according to the supervisor's state in this way, the subsequent evaluation steps determine whether the authority gradient is actually high, that is, whether the operator side is affected.
- Evaluation step ST4403 targets the operators (User-B, User-C), with depression, fatigue, and tension as problem emotions; if a problem emotion continues for 30 seconds or more, it is judged that the authority gradient is high, and if not, the process moves on to the next evaluation item.
- Preprocessing, such as removing noise in the time-series index values (for example, brief interruptions in an emotion-expression period), may also be executed.
- The type of emotion a user holds can be determined by whether the combination of waveforms such as brain waves and heartbeat matches a pattern stored in advance, and the duration of a problem emotion can be measured as the duration of that state. That is, "Yes" or "No" can be determined quantitatively using the time-series biometric data collected in the user information database 32.
- Evaluation step ST4404 compares the standard work time of the most recent work step stored in the index value calculation unit 41 with the actual work time. It is judged, for example, by whether the actual work time is 10% or more longer than the standard work time (that is, whether the ratio D235 is 1.1 or more). The method is not limited to this; for example, the judgment may also be calculated using not only the most recent work step but also the work steps before it.
- The response time in conversation (conversation response time) in evaluation step ST4406 is determined from the index value of the utterance interval in conversation in the most recent processing-time data in the index value calculation unit 41. For example, if the conversation response time is 10% or more longer than the reference (reference × 1.1 or more), it is determined that the operator is inhibited and responding slowly, that is, that the authority gradient is high.
- Evaluation step ST4407 determines the level of the authority gradient based on the index prediction value P42 for the time-series workload output from the index prediction value calculation unit 42. For example, if the predicted workload value is 0.8 or more ("Yes"), the level is judged to be level 2, indicating a high degree; if it is less than 0.8 ("No"), it is judged to be level 1, indicating a low degree. That is, each step is a logical expression that compares an index value with a threshold and outputs whether the condition holds, and the evaluation logic L4 can output multiple types of evaluation results by connecting multiple such logical expressions.
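- The chain of threshold comparisons above can be sketched as follows. This is a minimal illustration under stated assumptions: the field names, the volume margin in ST4405 (taken here as reference × 0.9), and the treatment of the operator-side steps as "any indicator triggers" are hypothetical readings of the flow, not the device's actual logic; the thresholds are those quoted in the text.

```python
# Minimal sketch of the "authority gradient: high" evaluation chain.
# Each step is a logical expression comparing an index value with a threshold;
# supervisor-side steps act as a screening evaluation that short-circuits to
# "not applicable". Field names and some margins are assumptions.

def evaluate_authority_gradient_high(sup, op, predicted_workload_max):
    # ST4401: supervisor problem emotion (irritability/stress) >= 30 s
    # ST4402: at least one utterance in a strong commanding tone
    if max(sup["irritability_s"], sup["stress_s"]) < 30:
        return "not applicable"
    if sup["strong_command_tones"] < 1:
        return "not applicable"
    # ST4403..ST4406: operator-side indicators (any one taken as evidence here)
    operator_flags = [
        max(op["depression_s"], op["fatigue_s"], op["tension_s"]) >= 30,  # ST4403
        op["actual_work_s"] >= op["standard_work_s"] * 1.1,               # ST4404
        op["voice_volume"] <= op["volume_reference"] * 0.9,               # ST4405 (assumed margin)
        op["response_s"] >= op["response_reference_s"] * 1.1,             # ST4406
    ]
    if not any(operator_flags):
        return "not applicable"
    # ST4407: level from the predicted workload (>= 0.8 -> level 2)
    return "applicable: level 2" if predicted_workload_max >= 0.8 else "applicable: level 1"

# Worked example from the text: User-A irritable 35 s / stressed 33 s with one
# strong commanding tone; User-B quiet voice, long response time; workload peak 0.71
result = evaluate_authority_gradient_high(
    {"irritability_s": 35, "stress_s": 33, "strong_command_tones": 1},
    {"depression_s": 28, "fatigue_s": 25, "tension_s": 20,
     "actual_work_s": 9, "standard_work_s": 15,
     "voice_volume": 0.83, "volume_reference": 1.0,
     "response_s": 1.5, "response_reference_s": 1.0},
    predicted_workload_max=0.71)
```

- Applied to the worked example, the chain returns "applicable: level 1", matching the outcome described for steps ST4407 and ST4408.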
- The evaluation is performed for each supervisor and operator; after reaching any of steps ST4408 to ST4410, if the authority gradient is determined to be high, the evaluation result and the user IDs of the related users are recorded. The related users are the users who were judged "Yes" in an evaluation item.
- Next, it is determined whether there is another user who has not yet been the target of the evaluation process (step ST4412); if there is such a user, the process returns to step ST4401 for that user, and otherwise the process moves to the next determination logic (step ST4413).
- As a specific example, consider the process of evaluating the teamwork of the supervisor (User-A) and the operator (User-B), where the operator is carrying out the work whose work ID is AI-1. If, as the supervisor's (User-A) emotion index values in step ST4401, irritability has continued for 35 seconds and stress for 33 seconds, the process proceeds to step ST4402. If in step ST4402 the supervisor's (User-A) tone index includes one instance of "spoke in a strong commanding tone", the process proceeds to step ST4403 to evaluate the operator's situation.
- In step ST4403, the durations of the operator's (User-B) depression, fatigue, and tension are 28 seconds, 25 seconds, and 20 seconds, respectively; since none reaches 30 seconds, the process proceeds to step ST4404 to execute the next evaluation item.
- Since the difference between the operator's (User-B) actual work time and the standard work time in step ST4404 is "-0 hours: 0 minutes: 6 seconds", that is, 40% shorter than the standard work time (15 seconds), the process proceeds to step ST4405, which executes the next evaluation item. If the voice volume of User-B in step ST4405 is 17% smaller than the reference (reference × 0.83), the process proceeds to step ST4406, which executes the next evaluation item.
- If in step ST4406 the conversation response time between the supervisor (User-A) and the operator (User-B) is 50% longer than the reference (reference × 1.5), it is determined that the authority gradient is high, and the process proceeds to step ST4407.
- If the workload index prediction value P42 is the data shown in FIG. 6, then in step ST4407 the maximum workload index prediction value P42 for User-B is 0.71, and there is no time at which it reaches the reference value of 0.8 or more. Therefore, although the authority gradient is high, the level is judged to be level 1, indicating a low degree (step ST4408).
- As a result of the above evaluation, the teamwork evaluation data E4 shown in FIG. 8 is created.
- The team status information D441 is generated as data indicating the team statuses, each identifying a teamwork defect.
- The evaluation result information D442 is data showing the evaluation result for each team status; in this example, the result for "authority gradient: high" is "applicable, level 1", and the result for "authority gradient: low" is "not applicable".
- The evaluation result for the "healthy" team status is "applicable" only when all of the poor-teamwork evaluation results are "not applicable".
- The related user information D443 is a mark identifying the user IDs that applied when the teamwork defect was determined, together with each user's degree of contribution to the defect factor (one mark for the user who is the main factor and another for a user who is a secondary factor).
- The index value information D444 is created by collecting the names and values of all index values used for the determination in the evaluation logic, together with the user IDs of the one or more related users.
- Using different evaluation logics L4, the teamwork evaluation unit 43 can also evaluate other kinds of poor teamwork, for example: a low authority gradient, poor user performance due to an inappropriate workload, poor user performance due to low alertness, excessive trust in or dependence on a supervisor or operator, and withdrawal from the team due to a lack of responsibility.
- Next, the support information presentation processing unit 50 determines the support content based on the teamwork evaluation data E4 output from the teamwork evaluation unit 43, generates presentation information, and presents the support information on the output device 54 of the appropriate user (steps ST106 to ST107).
- the support content determination unit 51 executes the support content determination step ST106 for determining what kind of support is to be provided to which user from the teamwork evaluation data E4 transmitted from the teamwork evaluation unit 43.
- the support content determination unit 51 has a support content determination table T51 for selecting the support content, the user who provides the support, and the presentation method.
- The support content determination table T51 stores data for each of the items: support name information D511, determination condition information D512, information presentation destination user information D513, presentation method information D514, and support content information D515 indicating the type of support content.
- the support name information D511 stores a support name that distinguishes the support contents such as "authority gradient: high level 1" and "authority gradient: high level 2".
- the decision condition information D512 stores conditional expressions (decision conditions) necessary for determination, such as "authority gradient: high level 1" and "authority gradient: high level 2”.
- Logical expressions (AND, OR, etc.) over the team statuses (team status information D441: FIG. 8) can be stored here, so that the support content can be described comprehensively even when multiple team statuses apply.
- Information presentation destination user information D513 describes one or more user IDs of support target persons (information presentation destinations) who directly present support information.
- For example, for the support name "authority gradient: high level 1", "User-B" and "User-C", both operators, are stored. In this example a user in the operation team is the information presentation destination user, but the user IDs of workers outside the plant central control room, other operators, or a worker who is the supervisor's superior may also be stored as information presentation destination users.
- the presentation method information D514 stores one of the types of information presentation methods, such as "operation screen message display” and "voice presentation”, as the presentation method.
- the support content information D515 stores, for example, support content such as advice and recommendations for problems with poor teamwork.
- the support content determination unit 51 uses the support content determination table T51 to determine one support item corresponding to the team status and evaluation result of the extracted teamwork evaluation data E4.
- Since the team status extracted in this example is "authority gradient: high" and the evaluation result is "applicable: level 1", the item whose support name information D511 in the support content determination table T51 is "authority gradient: high level 1" is extracted.
- Further, since the related user information D443 contains "User-A" and "User-B", and "User-B" matches the information presentation destination user information D513 in the support content determination table T51, the information presentation destination user is determined to be User-B.
- As the presentation method, the method described in the presentation method information D514 is adopted.
- Signal data S51 instructing generation of presentation information, including the support name, information presentation destination user, presentation method, and support content information determined in this way, is created and transmitted to the presentation information generation unit 53.
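- The table-driven determination in step ST106 can be sketched as follows; the table rows, message texts, and function names are hypothetical stand-ins for T51 (D511 to D515), not the actual stored data.

```python
# Sketch of the support-content determination of step ST106 (hypothetical rows).
# A table row matches when its determination condition equals the evaluated
# team status and result; the presentation destination is the intersection of
# the row's destination users (D513) with the related users (D443).

SUPPORT_TABLE = [  # D511 name, D512 condition, D513 destinations, D514 method, D515 content
    {"name": "authority gradient: high level 1",
     "condition": ("authority gradient: high", "applicable: level 1"),
     "destinations": {"User-B", "User-C"},
     "method": "operation screen message display",
     "content": "Report your current status to the supervisor."},   # hypothetical text
    {"name": "authority gradient: high level 2",
     "condition": ("authority gradient: high", "applicable: level 2"),
     "destinations": {"User-B", "User-C"},
     "method": "voice presentation",
     "content": "Pause and confirm instructions with the whole team."},  # hypothetical text
]

def determine_support(team_status, evaluation_result, related_users):
    for row in SUPPORT_TABLE:
        if row["condition"] == (team_status, evaluation_result):
            targets = row["destinations"] & set(related_users)
            return {"name": row["name"], "users": sorted(targets),
                    "method": row["method"], "content": row["content"]}
    return None  # no support item applies

# Worked example: status "authority gradient: high", result level 1,
# related users User-A and User-B -> support presented to User-B only
s51 = determine_support("authority gradient: high", "applicable: level 1",
                        ["User-A", "User-B"])
```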
- the support result database 52 stores the time data in which the support content determination process is completed, and the data D51 including the finally determined support name, information presentation destination user, presentation method, and support content.
- the data D44 including the team status, the evaluation result, the related user, and the index value of the teamwork evaluation data E4 linked to the support content data is also accumulated in the support result database 52.
- the presentation information generation unit 53 performs a process of generating presentation data D53 to be transmitted to the output device 54 based on the received signal data S51 and outputting it to the output device 54 (step ST107). Specifically, the presentation data D53 according to the presentation method information D514 is created and output to the output device 54 used by the information presentation destination user.
- FIG. 10 shows an example in which the support information whose support name is "authority gradient: high level 1" is output on the operation screen G5. Since the information presentation destination user is "User-B" and the presentation method is "message display on the operation screen", the message Gm5 describing the support content is drawn on the operation screen G5 of the User-B operator.
- The evaluation logic update processing unit 60 plays the role of changing the evaluation logic L4 that the teamwork evaluation processing unit 40 uses to evaluate the teamwork state. First, a method of manually changing the evaluation logic L4 is described.
- The evaluation logic update unit 61 visualizes the actual support-result data currently registered, based on the data D44 and data D51 accumulated in the support result database 52 and the evaluation logic L4. It then lets the user select a change target, accepts edit inputs such as thresholds and logical expressions for each evaluation item, and stores the updated evaluation logic L4 in the evaluation logic database 44. The user operates with an input device 62 such as a mouse and keyboard, and the information necessary for the change is output to the output device of the evaluation logic update unit 61.
- FIG. 11 shows the processing flow of the evaluation logic update unit 61.
- the evaluation logic L4 to be changed is selected (step ST611).
- At this time, the names of all evaluation logics L4 registered in the evaluation logic database 44 (for example, "authority gradient: high", "authority gradient: low", and so on) are presented for selection.
- In addition, the data D44 and the data D51 are linked and output in table form as reference information for selecting the change target. If one evaluation logic L4 has multiple evaluation results, all of them are output in order starting from the most recent time data.
- an edit screen for editing the selected evaluation logic L4 is output (step ST612).
- The edit screen G6 visualizes, in flowchart form, the "authority gradient: high" evaluation logic L4 described with FIG. 7, whose name appears in the title Gt6.
- An editable area Ge6, indicated by a rectangle, is arranged in each evaluation step display field Gf6 drawn on the edit screen G6; in this example, the threshold of the currently set judgment criterion is displayed in the editable area Ge6.
- Not only can the threshold value be changed; changes such as the target user of an evaluation step, the index value, the branching condition (Yes/No), the branching destination, the order of the evaluation steps, and the addition of new evaluation steps may also be made.
- Candidates may be displayed on the screen so that the user changes a setting by selecting from them, or new additions and deletions may be allowed.
- Further, the evaluation logic L4 may be edited graphically: by preparing evaluation step parts, connection line parts, and the like as evaluation logic parts, arranging them on the edit screen G6, setting branch conditions, and performing processing to check the consistency of the logic, the evaluation steps can be edited.
- Further, the evaluation logic update unit 61 may calculate a more suitable value for a threshold of the evaluation logic L4 based on the index values accumulated in the past (index value information D444) and present it as a candidate near the editable area Ge6. For example, for the index value (duration of the problem emotion) in the topmost evaluation step Gf6 of FIG. 12, if a value exceeding the current threshold (30 seconds) has been recorded more than a certain number of times, that value is presented as a candidate.
- Even when the evaluation result in the teamwork evaluation unit 43 is "healthy" ("Yes" in step ST105), the teamwork evaluation data E4 including the index values calculated by each evaluation logic may be transmitted to the support information presentation processing unit 50, and the support content determination unit 51 may accumulate the data D44 in the support result database 52.
- In that case, a more suitable threshold may be calculated, also taking into account the index values recorded when the result was "not applicable", and presented as a candidate value.
- For example, in step ST612 the user first selects whether to increase or decrease the number of detections; then, for thresholds set a certain margin or more above or below a representative value (for example, the average) of the accumulated index values, a more suitable value is calculated and presented as a candidate.
- Although the above process presents candidates and has the user input the threshold values, the user may instead be asked whether to apply the candidate values in a batch, with the values entered automatically.
- Although the evaluation logic update unit 61 described above outputs and presents the edited contents, the processes of steps ST611 to ST613 may also be performed without output, based on input setting data.
- Next, a method in which the evaluation logic update unit 61 changes the evaluation logic L4 automatically based on the accumulated index values is described.
- FIG. 13 is a histogram showing the accumulation results of the duration of feelings of irritability or stress (problem feelings) among the index values.
- Here, cases in which the emotion duration lasted 10 seconds or more are accumulated as data, 20 occurrences in total.
- At this time, the user specifies in advance, using the input device 62, the occurrence-frequency probability for the number of detections. If the user sets this probability to 20%, then in the data of FIG. 13 the dotted line Th6 on the graph corresponds to 20%, that is, the threshold at which four occurrences are detected; the threshold for the duration of the problem emotion is therefore automatically updated to 45 seconds.
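- The update rule can be sketched as follows, assuming hypothetical accumulated durations; with 20 samples and a 20% detection probability, the new threshold becomes the 4th-largest duration (45 seconds in this data).

```python
# Sketch of the frequency-based threshold update (hypothetical durations).
# With a detection-frequency probability of 20% over 20 accumulated
# durations, the threshold is chosen so that the top 20% (4 samples) are
# detected (duration >= threshold).

def updated_threshold(durations_s, probability):
    n_detect = int(round(len(durations_s) * probability))  # e.g. 20% of 20 -> 4
    ranked = sorted(durations_s, reverse=True)
    # the n-th largest duration detects exactly n samples
    return ranked[n_detect - 1]

# 20 hypothetical problem-emotion durations (each >= 10 s), 4 of them >= 45 s
durations = [10, 12, 13, 15, 16, 18, 20, 22, 24, 25,
             27, 28, 30, 32, 35, 40, 45, 50, 55, 60]
th = updated_threshold(durations, probability=0.20)
```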
- the method of inputting the probability of occurrence frequency by the user has been described, but the present invention is not limited to this.
- For example, a method of automatically changing the evaluation logic L4 using machine learning, such as learning the data pattern with the Mahalanobis distance and setting the threshold so that data deviating significantly from the pattern is treated as an abnormal value, may also be used.
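- As a sketch of this statistical approach: for a single index value, the Mahalanobis distance reduces to the standardized distance |x − mean| / standard deviation, so a threshold can be placed where that distance exceeds a chosen limit. The data, the limit of 2.0, and the function names below are assumptions for illustration, not the device's actual method.

```python
# Sketch: Mahalanobis-distance-based abnormal-value threshold (assumed
# approach). For one index the distance is |x - mean| / std; the threshold
# is placed where the distance exceeds a chosen limit.
import statistics

def mahalanobis_1d(x, data):
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    return abs(x - mu) / sigma

def auto_threshold(data, distance_limit=2.0):
    # smallest value whose distance from the learned pattern reaches the limit
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    return mu + distance_limit * sigma

# Hypothetical accumulated problem-emotion durations (seconds)
durations = [10, 12, 13, 15, 16, 18, 20, 22, 24, 25,
             27, 28, 30, 32, 35, 40, 45, 50, 55, 60]
threshold = auto_threshold(durations, distance_limit=2.0)
```

- With multivariate index data, the full Mahalanobis distance (using the inverse covariance matrix) would replace the one-dimensional form shown here.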
- As described above, in the first embodiment teamwork is evaluated using an evaluation logic L4 configured by combining evaluation steps that quantitatively determine "Yes" or "No". That is, the evaluation uses the evaluation logic L4, a combination of multiple logical expressions that compare a numerical value with a threshold and output whether the condition holds. The evaluation logic L4 can therefore be easily updated to fit the actual situation, whether manually or automatically, without accumulating correct-answer data or good cases.
- Although an example was shown in which the process simply ends when the evaluation result in the teamwork evaluation unit 43 is "healthy", even in this case a message indicating that the team state is sound may be created by the presentation information generation unit 53 and output to the output device 54.
- A method in which the support content determination unit 51 determines a single support content was shown, but the present invention is not limited to this. Instead of a comprehensive support content determination table T51 that narrows the result down to one, multiple determination conditions may be matched so that multiple support contents are determined; using prioritized data, presentation information may then be created and presented to the output device 54 in order of priority.
- An example of using the waveforms of biometric data to grasp the user's mental state (type of emotion) was shown, but the present invention is not limited to this. For example, when a combination of words appearing in a conversation matches an emotion, the user may be determined to be in that emotional state, and its duration may be estimated from the duration of the biometric data pattern, such as heartbeat and brain waves, observed at the time of the determination.
- As described above, the plant operation support device 10 according to the first embodiment includes: output devices 54, provided for each of the plural members (users) constituting a team, that present information for operating the plant as a team; an index value calculation unit 41 (including the index prediction value calculation unit 42) that, for plural evaluation items each evaluating one of the state of each member (user), the state of communication between members (users), and the workload state of each member (user), calculates index values (index data D41, index prediction value P42) quantifying the degree of each evaluation item; a teamwork evaluation unit 43 that evaluates the team's teamwork based on the index values; a support content determination unit 51 that, based on the teamwork evaluation result (teamwork evaluation data E4), selects support target persons including the team's members (users) and determines support content according to each selected support target person; and a presentation information generation unit 53 that generates presentation information (presentation data D53) for presenting the determined support content on the output device 54 corresponding to each selected support target person. Since the teamwork evaluation unit 43 is configured to evaluate the teamwork using the evaluation logic L4, composed of connected logical expressions comparing the thresholds set for each of the plural evaluation items with the calculated index values, teamwork can be evaluated and appropriate plant operation supported without preparing fixed correct-answer data.
- If the index value calculation unit 41 calculates, as an index value, the measurement results of biometric data related to the emotions of each of the plural members (users) (for example, the time for which an emotion problematic for teamwork continues), the user's mental state can be evaluated quantitatively, such as whether a state unfavorable for teamwork (the user holding a problem emotion) exists, and this can be incorporated into the logical expressions of the evaluation logic L4 described above.
- If the index value calculation unit 41 calculates, as an index value, any of the voice volume, utterance interval, and tone in conversations between members (users), the state of communication between users can be evaluated quantitatively and incorporated into the logical expressions of the evaluation logic L4 described above.
- If an evaluation logic update unit 61 that corrects either the thresholds or the connection relationships is provided, the evaluation logic L4 can easily be corrected even without teacher data or correct-answer data, enabling appropriate teamwork evaluation suited to the actual situation.
- If the evaluation logic update unit 61 automatically updates, using the occurrence-frequency probability of events or a statistical analysis method such as the Mahalanobis distance, the thresholds of the evaluation items for which an abnormal value is set as the threshold, the evaluation logic L4 is automatically updated using data from actual plant operation, enabling appropriate teamwork evaluation suited to the actual situation.
- Likewise, the plant operation support method according to the first embodiment includes a support content determination step (step ST106) of selecting support target persons including the team's members (users) and determining support content according to each selected support target person, and a support content presentation step of presenting the determined support content to each selected support target person. Since the teamwork is evaluated using the evaluation logic L4, composed of combined logical expressions comparing the thresholds set for each of the plural evaluation items with the calculated index values, teamwork can be evaluated and appropriate plant operation supported without preparing fixed correct-answer data. Further, with an evaluation logic update step that corrects the thresholds or the connection relationships, the evaluation logic L4 can easily be corrected even without teacher data or correct-answer data, enabling appropriate teamwork evaluation suited to the actual situation.
- 10 Plant operation support device
- 20 Plant status determination processing unit
- 30 User information collection processing unit
- 40 Teamwork evaluation processing unit
- 41 Index value calculation unit
- 42 Index prediction value calculation unit (index value calculation unit)
- 43 Teamwork evaluation unit
- 50 Support information presentation processing unit
- 51 Support content determination unit
- 53 Presentation information generation unit
- 54 Output device
- 60 Evaluation logic update processing unit
- 61 Evaluation logic update unit
- D41 Index data (index value)
- D53 Presentation data
- E4 Teamwork evaluation data
- L4 Evaluation logic
- P42 Index prediction value (index value).
Landscapes
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Engineering & Computer Science (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- Entrepreneurship & Innovation (AREA)
- Development Economics (AREA)
- Educational Administration (AREA)
- Operations Research (AREA)
- Marketing (AREA)
- Game Theory and Decision Science (AREA)
- Quality & Reliability (AREA)
- Tourism & Hospitality (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Testing And Monitoring For Control Systems (AREA)
Abstract
Description
FIGS. 1 to 13 illustrate the plant operation support device and the plant operation support method according to the first embodiment. FIG. 1 is a block diagram illustrating the configuration of the plant operation support device, FIG. 2 is a block diagram showing an example of the hardware configuration of the plant operation support device, and FIG. 3 is a flowchart illustrating the operation of the plant operation support device, that is, the plant operation support method. FIG. 4 is a diagram showing the data structure stored in the procedure database constituting the plant operation support device.
Claims (7)
- An output device provided for each of a plurality of members constituting a team, which presents information for performing plant operation as the team,
an index value calculation unit that, for a plurality of evaluation items each evaluating one of the state of each of the plurality of members, the state of communication between the members, and the workload state of each of the plurality of members, calculates an index value quantifying the degree of each evaluation item,
a teamwork evaluation unit that evaluates the teamwork of the team based on the index values,
a support content determination unit that, based on the teamwork evaluation result, selects support target persons including the members of the team and determines support content according to the selected support target persons, and
a presentation information generation unit that generates, for each output device corresponding to a selected support target person, presentation information for presenting the determined support content,
wherein the teamwork evaluation unit evaluates the teamwork using an evaluation logic composed of connected logical expressions comparing a threshold set for each of the plurality of evaluation items with the calculated index value. - The plant operation support device according to claim 1, wherein the index value calculation unit calculates measurement results of biometric data related to the emotions of each of the plurality of members as the index value.
- The plant operation support device according to claim 1 or 2, wherein the index value calculation unit calculates, as the index value, any of the voice volume, utterance intervals, and tone in conversations between the members.
- The plant operation support device according to any one of claims 1 to 3, further comprising an evaluation logic update unit that corrects either the thresholds or the connection relationships.
- The plant operation support device according to claim 4, wherein the evaluation logic update unit automatically updates, using a statistical analysis method, the threshold of an evaluation item for which an abnormal value is set as the threshold.
- An index value calculation step of calculating, for a plurality of evaluation items each evaluating one of the state of each of a plurality of members operating a plant as a team, the state of communication between the members, and the workload state of each of the plurality of members, an index value quantified for each evaluation item,
a teamwork evaluation step of evaluating the teamwork of the team based on the index values,
a support content determination step of selecting, based on the teamwork evaluation result, support target persons including the members of the team and determining support content according to the selected support target persons, and
a support content presentation step of presenting the determined support content to each selected support target person,
wherein the teamwork evaluation step evaluates the teamwork using an evaluation logic composed of connected logical expressions comparing a threshold set for each of the plurality of evaluation items with the calculated index value. - The plant operation support method according to claim 6, further comprising an evaluation logic update step of correcting either the thresholds or the connection relationships.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/789,553 US11934986B2 (en) | 2020-03-23 | 2020-03-23 | Plant operation support apparatus and plant operation support method |
JP2022509758A JP7305030B2 (ja) | 2020-03-23 | 2020-03-23 | プラント運転支援装置、プラント運転支援方法 |
PCT/JP2020/012602 WO2021191941A1 (ja) | 2020-03-23 | 2020-03-23 | プラント運転支援装置、プラント運転支援方法 |
CA3171492A CA3171492A1 (en) | 2020-03-23 | 2020-03-23 | Plant operation support apparatus and plant operation support method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/012602 WO2021191941A1 (ja) | 2020-03-23 | 2020-03-23 | プラント運転支援装置、プラント運転支援方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021191941A1 true WO2021191941A1 (ja) | 2021-09-30 |
Family
ID=77891126
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/012602 WO2021191941A1 (ja) | 2020-03-23 | 2020-03-23 | プラント運転支援装置、プラント運転支援方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US11934986B2 (ja) |
JP (1) | JP7305030B2 (ja) |
CA (1) | CA3171492A1 (ja) |
WO (1) | WO2021191941A1 (ja) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001269770A (ja) * | 2000-03-24 | 2001-10-02 | Kawasaki Steel Corp | 溶融金属取扱い設備の異常自動検出方法 |
JP2004054954A (ja) * | 2002-07-17 | 2004-02-19 | Tokio Marine & Fire Insurance Co Ltd | リスク診断システム、リスクマップデータ生成方法及びプログラム |
JP2007272564A (ja) * | 2006-03-31 | 2007-10-18 | Railway Technical Res Inst | チームによる業務の活性度の評価システムおよびそれを用いた業務雰囲気の活性化システム |
JP2011175479A (ja) * | 2010-02-24 | 2011-09-08 | Nec Corp | 知的生産性評価装置、知的生産性評価方法およびプログラム |
JP2012217518A (ja) * | 2011-04-05 | 2012-11-12 | Hitachi Ltd | 人間行動分析システム及び方法 |
JP2016007363A (ja) * | 2014-06-25 | 2016-01-18 | 日本電信電話株式会社 | 集団感情推定装置、集団感情推定方法及び集団感情推定プログラム |
JP2018207650A (ja) * | 2017-06-02 | 2018-12-27 | 三菱日立パワーシステムズ株式会社 | 回転電機の特徴量評価システムおよび回転電機の特徴量評価方法 |
JP2019036205A (ja) * | 2017-08-18 | 2019-03-07 | 三菱重工業株式会社 | 評価システム、評価方法及びプログラム |
WO2019049356A1 (ja) * | 2017-09-11 | 2019-03-14 | 株式会社日立製作所 | 情報処理システム |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01123304A (ja) | 1987-11-07 | 1989-05-16 | Toshiba Corp | プラント監視制御装置 |
CA2374578C (en) * | 2000-03-17 | 2016-01-12 | Siemens Aktiengesellschaft | Plant maintenance technology architecture |
JP2003271048A (ja) | 2002-03-14 | 2003-09-25 | Mitsubishi Heavy Ind Ltd | 訓練評価装置及び訓練評価方法 |
US20040153355A1 (en) * | 2003-01-31 | 2004-08-05 | Deering Anne M. | Method and system for assessing leadership skills |
JP4693758B2 (ja) | 2006-12-05 | 2011-06-01 | 株式会社東芝 | Plant monitoring and control device |
CN101119365B (zh) * | 2007-09-13 | 2012-09-05 | 复旦大学 | Collaborative interaction optimization method in large-scale collaborative environments |
WO2009076203A1 (en) * | 2007-12-05 | 2009-06-18 | Florida Gulf Coast University | System and methods for facilitating collaboration of a group |
US20100010879A1 (en) * | 2008-07-08 | 2010-01-14 | Ford Motor Company | Productivity operations system and methodology for improving manufacturing productivity |
US20110246340A1 (en) * | 2010-04-02 | 2011-10-06 | Tracelink, Inc. | Method and system for collaborative execution of business processes |
US9983670B2 (en) * | 2012-09-14 | 2018-05-29 | Interaxon Inc. | Systems and methods for collecting, analyzing, and sharing bio-signal and non-bio-signal data |
WO2014059191A2 (en) * | 2012-10-10 | 2014-04-17 | Daniel Wartel Daniel | Productivity assessment and rewards systems and processes therefor |
US9325591B1 (en) * | 2012-12-28 | 2016-04-26 | Google Inc. | Automatic analysis and quality detection of media |
US20140278241A1 (en) * | 2013-03-15 | 2014-09-18 | General Electric Company | Performance monitoring and analysis for power plants |
KR20160089152A (ko) * | 2015-01-19 | 2016-07-27 | 주식회사 엔씨소프트 | 화행 분석을 통한 스티커 추천 방법 및 시스템 |
WO2018005656A1 (en) * | 2016-06-29 | 2018-01-04 | ITY Labs Corp. | System and method for determining user metrics |
US20180101776A1 (en) * | 2016-10-12 | 2018-04-12 | Microsoft Technology Licensing, Llc | Extracting An Emotional State From Device Data |
US20200401977A1 (en) * | 2019-06-18 | 2020-12-24 | Didi Research America, Llc | System and method for evaluation |
US20210150485A1 (en) * | 2019-11-20 | 2021-05-20 | International Business Machines Corporation | Dynamic determination of job requirements and candidate assessment |
US20210304107A1 (en) * | 2020-03-26 | 2021-09-30 | SalesRT LLC | Employee performance monitoring and analysis |
2020
- 2020-03-23 CA CA3171492A patent/CA3171492A1/en active Pending
- 2020-03-23 JP JP2022509758A patent/JP7305030B2/ja active Active
- 2020-03-23 WO PCT/JP2020/012602 patent/WO2021191941A1/ja active Application Filing
- 2020-03-23 US US17/789,553 patent/US11934986B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20230037720A1 (en) | 2023-02-09 |
JP7305030B2 (ja) | 2023-07-07 |
CA3171492A1 (en) | 2021-09-30 |
US11934986B2 (en) | 2024-03-19 |
JPWO2021191941A1 (ja) | 2021-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8032372B1 (en) | Dictation selection | |
US9700218B2 (en) | Systems and methods for reducing nuisance alarms in medical devices | |
Havlikova et al. | Human reliability in man-machine systems | |
Harte et al. | Process models of decision making | |
US11923094B2 (en) | Monitoring predictive models | |
RU2619644C2 (ru) | Clinical decision support system for predictive discharge planning | |
US10052056B2 (en) | System for configuring collective emotional architecture of individual and methods thereof | |
JP5302911B2 (ja) | Fatigue level evaluation system, in-company fatigue level evaluation system using the same, and fatigue level evaluation method | |
CN112908481B (zh) | Automated personal health assessment and management method and system | |
US20220128974A1 (en) | Plant monitoring and control apparatus and plant monitoring and control method | |
JP4631464B2 (ja) | Physical condition determination device and program therefor | |
KR102659616B1 (ко) | Method and apparatus for predicting Alzheimer's disease based on voice characteristics | |
WO2021191941A1 (ja) | Plant operation support device and plant operation support method | |
Lund et al. | Sentence-based experience logging in new hearing aid users | |
CN114496224A (zh) | Memory and cognitive impairment screening and evaluation method | |
KR102496412B1 (ко) | Operating method of an auditory perception training system | |
JP7270187B2 (ja) | Employment support device, employment support method, and employment support program | |
Azzini et al. | Automated spoken dialog system for home care and data acquisition from chronic patients | |
CN118136270B (zh) | Chronic disease health monitoring and early-warning system and method based on data analysis | |
US11630943B2 (en) | Systems and methods to briefly deviate from and resume back to amending a section of a note | |
Anjou et al. | A computer system for occupational health epidemiology | |
US20210068751A1 (en) | Autism support system, wearable device, and methods of use | |
RU2813438C1 (ru) | Система и способ выявления и использования эталонного эмоционально-интеллектуального профиля (эи-профиля) по группам анализа | |
CN118136270A (zh) | 一种基于数据分析慢病健康监测与预警系统与方法 | |
KR20230078569A (ko) | 기계학습을 기반으로 하는 covid-19 환자의 정신건강 평가 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20926824 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022509758 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 3171492 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20926824 Country of ref document: EP Kind code of ref document: A1 |