WO2021176663A1 - Dispositif de surveillance - Google Patents

Dispositif de surveillance (Monitoring device)

Info

Publication number
WO2021176663A1
WO2021176663A1 PCT/JP2020/009493 JP2020009493W
Authority
WO
WIPO (PCT)
Prior art keywords
information
event
worker
production site
production
Prior art date
Application number
PCT/JP2020/009493
Other languages
English (en)
Japanese (ja)
Inventor
洋平 出口
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to CN202080097903.XA priority Critical patent/CN115244564A/zh
Priority to PCT/JP2020/009493 priority patent/WO2021176663A1/fr
Priority to JP2020553664A priority patent/JP6837615B1/ja
Publication of WO2021176663A1 publication Critical patent/WO2021176663A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/04Manufacturing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y10/00Economic sectors
    • G16Y10/25Manufacturing
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Definitions

  • This disclosure relates to a monitoring device that monitors the situation at a production site.
  • At production sites, the obtained information is monitored, and the monitoring results are used for various purposes such as improving production efficiency, supporting workers, and maintaining the production system.
  • Patent Document 1 describes an invention in which, when a problem occurs, information such as the troubleshooting work performed by the worker and the work method is stored in association with information on the place where the problem occurred. When a new problem occurs, the invention allows an operator to check information such as the work contents and work methods for troubleshooting that were performed in the past at that place.
  • The present disclosure has been made in view of the above, and an object of the present disclosure is to provide a monitoring device capable of notifying an operator in advance of an event occurring at a production site.
  • To achieve this object, the present disclosure provides a monitoring device for monitoring the situation at a production site. The monitoring device includes a worker information acquisition unit that acquires worker information including position information of a worker terminal held by a worker and text data obtained by converting voice data containing the worker's conversation into text, and a device information acquisition unit that acquires device information including history data of operations performed on production equipment installed at the production site and sensor information indicating sensing results.
  • The monitoring device also includes a trained model storage unit that stores a trained model generated by learning the events that occur at the production site, the state of the worker when an event occurs, and the state of the production device when an event occurs, based on event information indicating an event that occurred at the production site, the worker information and device information acquired at the event occurrence timing, which is the timing at which the event occurred, and the worker information and device information acquired during a certain period before the event occurrence timing. In addition, the monitoring device includes an inference unit that, when worker information and device information are newly acquired, predicts an event that will occur at the production site based on the trained model and the newly acquired worker information and device information.
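The architecture described above can be sketched as follows. The class names, field names, and the rule-based stand-in for the trained model are illustrative assumptions for exposition, not the implementation disclosed in the patent.

```python
class MonitoringDevice:
    """Minimal sketch of the monitoring device: a stored trained model
    plus an inference step over newly acquired information."""

    def __init__(self, trained_model):
        # Stands in for the trained model storage unit.
        self.trained_model = trained_model

    def predict_event(self, worker_info, device_info):
        # Stands in for the inference unit: combine newly acquired worker
        # information and device information, then query the stored model.
        features = {**worker_info, **device_info}
        return self.trained_model(features)


def toy_model(features):
    # Toy stand-in for a trained model: flags a likely stoppage
    # when a sensor reading exceeds a threshold.
    return "stoppage" if features.get("sensor_value", 0) > 100 else "normal"


device = MonitoringDevice(toy_model)
result = device.predict_event({"position": "line-1"}, {"sensor_value": 120})
```

In this sketch the "model" is just a callable, so a model learned by any algorithm (Bayesian network, Monte Carlo, or otherwise) could be dropped in behind the same interface.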
  • The monitoring device according to the present disclosure has the effect of being able to notify the operator in advance of an event occurring at the production site.
  • FIG. 2: Diagram showing a configuration example of the learning device
  • FIG. 3: Diagram showing a configuration example of the data acquisition unit provided in the learning device
  • FIG. 4: Flowchart showing the learning operation by the learning device
  • FIG. 5: Diagram showing a configuration example of the inference device
  • FIG. 6: Flowchart showing the inference operation by the inference device
  • FIG. 7: Diagram showing an example of the method by which the worker terminal notifies the prediction result by the monitoring device
  • FIG. 1 is a diagram showing a configuration example of a monitoring system including the monitoring device according to the embodiment.
  • The monitoring system 100 includes a management server 1, a field device 2, a worker terminal 3, and a production device 4.
  • The management server 1 includes a storage device 12 that receives and stores the information acquired from the worker terminal 3 and the production device 4 by the field device 2, and a learning device 11 that learns, using the information stored in the storage device 12, the events that occur at the production site, the state of the worker when an event occurs, and the state of the production device 4 when an event occurs. Examples of events that occur at the production site include a stoppage of the production device 4, a failure of the production device 4, a repair of the production device 4, and an injury to a worker. The details of the learning device 11 will be described later.
  • The storage device 12 includes a worker information storage unit 121, a device information storage unit 122, and a trained model storage unit 123.
  • The worker information storage unit 121 stores the worker information, which the field device 2 composes from various information regarding the state of the worker acquired from the worker terminal 3.
  • The device information storage unit 122 stores the device information, which the field device 2 composes from various information regarding the state of the production device 4 acquired from the production device 4.
  • The trained model storage unit 123 stores the trained model generated by the learning device 11.
  • The management server 1 may be connected to the field device 2 via the Internet or the like; that is, the monitoring system 100 may be realized using cloud computing.
  • The field device 2 is, for example, an industrial personal computer installed at the production site, and has a function of communicating with the worker terminal 3 and the production device 4 via a wired or wireless line.
  • The field device 2 may instead be a programmable logic controller, a programmable display, or the like. In the configuration example shown in FIG. 1, one field device 2 is used, but two or more field devices 2 may be included.
  • The field device 2 includes a worker information acquisition unit 21 that acquires the various information output from the worker terminal 3 and collects it into the worker information, which is transmitted to the management server 1.
  • The worker information includes time information indicating the time at which the various information was acquired and identification information of the worker terminal 3 from which the information was acquired.
  • The device information includes time information indicating the time at which the various information was acquired and identification information of the production device 4 from which the information was acquired.
  • The worker information acquisition unit 21 repeatedly acquires the worker information at regular intervals.
  • The device information acquisition unit 23 repeatedly acquires the device information at regular intervals.
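The record shape implied by the bullets above, where every acquisition carries time information and the identification of its source, can be sketched like this; the function and field names are illustrative assumptions.

```python
import time


def acquire_worker_info(terminal_id, position, text):
    # Each record carries time information (when it was acquired) and the
    # identification information of the source worker terminal, as
    # described above. Field names are illustrative.
    return {
        "terminal_id": terminal_id,
        "time": time.time(),
        "position": position,
        "text": text,
    }


record = acquire_worker_info("WT-01", (12.5, 3.0), "starting inspection")
```

A polling loop that calls such a function at regular intervals would realize the repeated acquisition that the worker information acquisition unit 21 and device information acquisition unit 23 perform.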
  • The field device 2 also includes an inference device 25.
  • When learning by the learning device 11 of the management server 1 has been completed, that is, when the trained model storage unit 123 of the management server 1 stores a trained model, and the worker information acquisition unit 21 and the device information acquisition unit 23 newly acquire worker information and device information, the inference device 25 performs inference processing based on the newly acquired worker information and device information and the trained model stored in the trained model storage unit 123. In this inference processing, the inference device 25 infers an event that will occur at the production site. The details of the inference device 25 will be described later.
  • The components of the field device 2 constitute the monitoring device 5 together with the learning device 11 and the storage device 12 included in the management server 1.
  • FIG. 1 shows an example in which the field device 2 acquires the above information (worker information and device information) from the worker terminal 3 and the production device 4 and transmits it to the management server 1.
  • Alternatively, the management server 1 may be configured to acquire the above information directly from the worker terminal 3 and the production device 4.
  • The worker information transmission unit 22 transmits the worker information acquired by the worker information acquisition unit 21 to the management server 1 in batches.
  • For example, the worker information transmission unit 22 transmits the worker information to the management server 1 at regular intervals.
  • Similarly, the device information transmission unit 24 transmits the device information acquired by the device information acquisition unit 23 to the management server 1 in batches.
  • The timing at which the worker information transmission unit 22 transmits the worker information and the timing at which the device information transmission unit 24 transmits the device information may be the same or different.
  • The worker terminal 3 is a terminal device held by a worker at the production site, such as a tablet terminal or a smartphone. One or more worker terminals 3 are included in the monitoring system 100.
  • The worker terminal 3 includes a display unit 31, an operation unit 32, a position information acquisition unit 33 that acquires the position information of the worker terminal 3, a voice acquisition unit that collects sound and generates voice data including the conversation content of the worker, a video acquisition unit 35 that shoots and generates video data, and a voice conversion unit 36 that converts the voice data into text data.
  • The position information, the video data, and the text data constitute the above-mentioned worker information. That is, the worker information includes the position information of the worker terminal 3, the text data obtained by converting the worker's voice into text, the video data taken by the worker terminal 3, the time information, and the identification information of the worker terminal 3.
  • The worker terminal 3 need not be provided with the voice conversion unit 36; in this case, the field device 2 or the management server 1 converts the contents of the worker's conversation into text data as needed.
  • The production device 4 is any of various devices installed on the production line, for example an industrial robot. One or more production devices 4 are included in the monitoring system 100.
  • The production device 4 includes an information collecting unit 41 that collects information on the operating state of the production device 4, and a video acquisition unit 42 that shoots and generates video data.
  • The information collected by the information collecting unit 41 includes an operation log 411, which is history data of operations performed by an operator on the production device 4, and sensor information 412, which shows the sensing results of various sensors (not shown).
  • The video data, the operation log 411, and the sensor information 412 constitute the above-mentioned device information. That is, the device information includes the video data, the operation log 411, the sensor information 412, the time information, and the identification information of the production device 4.
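The composition of the worker information and device information described above can be summarized in two record types; the type and field names are illustrative assumptions, not terminology from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class WorkerInfo:
    terminal_id: str               # identification information of worker terminal 3
    time: float                    # time information
    position: Tuple[float, float]  # position information
    text: str                      # worker's conversation converted to text
    video: bytes = b""             # video data taken by the terminal


@dataclass
class DeviceInfo:
    device_id: str                 # identification information of production device 4
    time: float                    # time information
    operation_log: List[str] = field(default_factory=list)       # operation log 411
    sensor_info: Dict[str, float] = field(default_factory=dict)  # sensor information 412
    video: bytes = b""             # video data from video acquisition unit 42


w = WorkerInfo("WT-01", 0.0, (1.0, 2.0), "all clear")
d = DeviceInfo("PD-04", 0.0, ["start"], {"temperature": 42.0})
```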
  • FIG. 2 is a diagram showing a configuration example of the learning device 11.
  • FIG. 2 also shows a storage device 12 connected to the learning device 11.
  • The learning device 11 includes a data acquisition unit 111 and a model generation unit 112.
  • The learning device 11 is realized by the arithmetic unit 10 that constitutes the management server 1.
  • The data acquisition unit 111 extracts information related to events that have occurred at the production site from the worker information stored in the worker information storage unit 121 and the device information stored in the device information storage unit 122, and generates learning data.
  • FIG. 3 is a diagram showing a configuration example of the data acquisition unit 111 included in the learning device 11.
  • The data acquisition unit 111 includes an event detection unit 111A and a learning data generation unit 111B.
  • The event detection unit 111A detects events that have occurred at the production site by analyzing the worker information stored in the worker information storage unit 121 and the device information stored in the device information storage unit 122.
  • The learning data generation unit 111B generates learning data based on the worker information and device information acquired at the event occurrence timing and the worker information and device information acquired during a certain period before the event occurrence timing. That is, the learning data generation unit 111B generates the learning data by associating the worker information and device information acquired at the event occurrence timing and during a certain period before it with the event information indicating the event that occurred.
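The windowing-and-labeling step just described can be sketched as follows. The 60-second window and all names are illustrative assumptions; the patent only speaks of "a certain period" before the event occurrence timing.

```python
def build_learning_sample(records, event_time, event_info, window=60.0):
    # Gather the records acquired at the event occurrence timing and during
    # `window` seconds before it, then label them with the event information.
    inputs = [r for r in records
              if event_time - window <= r["time"] <= event_time]
    return {"inputs": inputs, "event": event_info}


records = [{"time": 90.0, "text": "ok"},
           {"time": 150.0, "text": "funny noise"},
           {"time": 160.0, "text": "stop!"}]
sample = build_learning_sample(records, event_time=160.0, event_info="stoppage")
```

Here the record at time 90.0 falls outside the 60-second window and is excluded, while the two later records are associated with the "stoppage" event.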
  • The model generation unit 112 generates a trained model by learning, based on the learning data generated by the data acquisition unit 111, the correspondence between the events that occur at the production site, the state of the worker when an event occurs, and the state of the production device 4 when an event occurs.
  • As the machine learning algorithm used in the model generation unit 112, a Bayesian network, a Monte Carlo method, or the like can be considered.
  • FIG. 4 is a flowchart showing a learning operation by the learning device 11.
  • The operation shown in FIG. 4 is started when, for example, the management server 1 receives an operation instructing the start of the learning operation from the user.
  • When the learning device 11 starts operating, it first starts analyzing the data stored in the storage device 12, specifically the worker information stored in the worker information storage unit 121 and the device information stored in the device information storage unit 122, and checks for events that occurred at the production site. This process is performed by the event detection unit 111A of the data acquisition unit 111. Specifically, the event detection unit 111A analyzes, among the worker information stored in the worker information storage unit 121 and the device information stored in the device information storage unit 122, the worker information and device information that were acquired first, that is, at the earliest time, and confirms whether an event occurred (step S11).
  • The event detection unit 111A determines that an event has occurred, for example, when the sensor information 412 included in the device information indicates an abnormal value. Further, the event detection unit 111A determines that an event has occurred when the operation log 411 included in the device information includes information indicating that an operation for urgently stopping the production device 4 was executed. Further, the event detection unit 111A determines that an event has occurred when the text data included in the worker information includes words reminiscent of the occurrence of an event, for example "ah", "eh", "bad", "that?", or "funny".
  • These words are registered in the management server 1 in advance as keywords reminiscent of the occurrence of an event, and the event detection unit 111A determines that an event has occurred when a registered keyword is detected. Further, the event detection unit 111A may determine whether an event has occurred from the worker's actions obtained by analyzing the video data.
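The three detection signals described above (registered keywords in the text data, abnormal sensor values, and an emergency-stop entry in the operation log) can be sketched as one predicate; the matching rule and record shapes are illustrative assumptions.

```python
# Keywords registered in advance as reminiscent of event occurrence
# (the examples given in the text; substring matching is a simplification).
EVENT_KEYWORDS = ("ah", "eh", "bad", "that?", "funny")


def detect_event(text_data, sensor_info, operation_log):
    # Any one of the three signals is enough to decide an event occurred.
    if any(kw in text_data.lower() for kw in EVENT_KEYWORDS):
        return True   # keyword found in the worker's conversation
    if any(reading.get("abnormal", False) for reading in sensor_info):
        return True   # sensor information 412 shows an abnormal value
    if "emergency_stop" in operation_log:
        return True   # urgent-stop operation recorded in operation log 411
    return False


hit = detect_event("that? the conveyor sounds funny", [], [])
miss = detect_event("routine work in progress", [{"abnormal": False}], ["start"])
```

A production implementation would need more careful keyword matching (word boundaries, language-specific tokenization) than this substring check.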
  • When the event detection unit 111A detects the occurrence of an event (step S11: Yes), the learning data generation unit 111B generates learning data (step S12). Specifically, the learning data generation unit 111B extracts, from the worker information storage unit 121 and the device information storage unit 122 of the storage device 12, the worker information and device information acquired at the occurrence timing of the detected event, that is, the occurrence time, and during a certain period before that timing, and generates the learning data based on the extracted worker information and device information and the event information indicating the detected event. The learning data is composed of the position information and text data included in the worker information, the operation log 411 and sensor information 412 included in the device information, and the event information.
  • The event information is, for example, the name of the event.
  • The event information may be generated by the learning data generation unit 111B from the contents of the worker information and the device information, or the user of the management server 1 may input the event information using an input device such as a keyboard or a mouse.
  • When the user inputs the event information, for example, the learning device 11 displays the worker information and device information at the time of event detection on a display device (not shown in FIG. 1 and the like) when the event detection unit 111A detects an event, and the user confirms this display and inputs the event information. Further, when the learning data generation unit 111B generates the event information, the generated event information may be displayed on the display device so that the user can confirm the content and modify it as necessary.
  • Next, the model generation unit 112 performs learning processing using the learning data to generate a trained model (step S13).
  • The learning processing here uses existing algorithms such as a Bayesian network or a Monte Carlo method.
  • The model generation unit 112 performs learning by associating the position information, the text data, the operation log 411, and the sensor information 412 included in the learning data with the event information.
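The association step above can be illustrated with a deliberately simple stand-in model. The patent names Bayesian networks and Monte Carlo methods; the co-occurrence counting below is only an illustrative simplification of "associating features with event information", and every name in it is an assumption.

```python
from collections import defaultdict


class CooccurrenceModel:
    """Toy stand-in for model generation: counts how often each
    (feature, value) pair co-occurred with each event."""

    def __init__(self):
        self._counts = defaultdict(lambda: defaultdict(int))

    def fit(self, samples):
        # Each sample associates position, text, log, and sensor features
        # with the event information, as described above.
        for features, event in samples:
            for item in features.items():
                self._counts[item][event] += 1

    def predict(self, features):
        # Score each event by how often its training features recur.
        scores = defaultdict(int)
        for item in features.items():
            for event, n in self._counts[item].items():
                scores[event] += n
        return max(scores, key=scores.get) if scores else None


model = CooccurrenceModel()
model.fit([({"position": "line-1", "keyword": "funny"}, "stoppage"),
           ({"position": "line-2", "keyword": "none"}, "normal")])
prediction = model.predict({"position": "line-1", "keyword": "funny"})
```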
  • After step S13, the process returns to step S11, and the event detection unit 111A analyzes the worker information and device information acquired at the earliest time among the unconfirmed worker information and device information to check whether an event has occurred.
  • When the event detection unit 111A does not detect the occurrence of an event at the production site (step S11: No), it confirms whether the data analysis is completed, that is, whether any worker information and device information remain unconfirmed (step S14). When the data analysis is not completed (step S14: No), the process returns to step S11, and the event detection unit 111A analyzes the worker information and device information acquired at the earliest time among the unconfirmed worker information and device information to check whether an event has occurred.
  • When the data analysis is completed (step S14: Yes), the model generation unit 112 outputs the trained model to the trained model storage unit 123, which stores it (step S15), and the learning operation by the learning device 11 is completed.
  • Even in a state where the trained model storage unit 123 already stores a trained model, that is, after the learning device 11 has performed a learning operation according to the flowchart shown in FIG. 4 in the past, the learning device 11 may perform re-learning according to the same flowchart, in which case the trained model stored in the trained model storage unit 123 is updated.
  • FIG. 5 is a diagram showing a configuration example of the inference device 25.
  • The inference device 25 includes a data acquisition unit 251 and an inference unit 252.
  • The inference device 25 is realized by the arithmetic unit 20 that constitutes the field device 2.
  • The data acquisition unit 251 acquires the data to be used in the inference processing from the worker information output by the worker information acquisition unit 21 and the device information output by the device information acquisition unit 23. Specifically, the data acquisition unit 251 acquires the position information and the text data from the worker information, and the operation log 411 and the sensor information 412 from the device information. In the following description, for convenience, the position information, text data, operation log 411, and sensor information 412 acquired by the data acquisition unit 251 may be collectively referred to as inference data.
  • The inference unit 252 estimates an event that will occur at the production site based on the inference data acquired by the data acquisition unit 251 and the trained model stored in the trained model storage unit 123, and outputs the estimate as an inference result.
  • The inference result output by the inference unit 252 is transmitted to, for example, the worker terminal 3.
  • The worker terminal 3 displays the inference result 311 on the display unit 31.
  • The inference result output by the inference unit 252 may also be transmitted to other devices such as the management server 1, the production device 4, or a terminal device (not shown) held by a manager at the production site.
  • FIG. 6 is a flowchart showing the inference operation by the inference device 25. The operation shown in FIG. 6 is executed, for example, when the field device 2 acquires the worker information and the device information.
  • In the inference device 25, the data acquisition unit 251 first performs data acquisition processing, acquiring the data used for inference from the worker information output by the worker information acquisition unit 21 and the device information output by the device information acquisition unit 23 (step S21).
  • Next, the inference unit 252 performs inference processing using the inference data acquired by the data acquisition unit 251 and the trained model stored in the trained model storage unit 123, and predicts an event that will occur at the production site (step S22).
  • The inference unit 252 also makes use of the inference data previously acquired by the data acquisition unit 251, and predicts where at the production site each event will occur in the future and with what probability.
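Reporting "which event, at which location, with what probability" amounts to normalizing the model's raw scores into a probability distribution; the sketch below assumes illustrative score values and key names.

```python
def to_probabilities(scores):
    # Normalize raw per-(event, location) scores so the prediction can be
    # reported as "which event, where, with what probability".
    total = sum(scores.values())
    return {key: count / total for key, count in scores.items()}


# Illustrative raw scores keyed by (event, location).
probs = to_probabilities({("stoppage", "line-1"): 3,
                          ("failure", "line-2"): 1})
```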
  • When the inference processing is completed, the inference unit 252 outputs the inference result (step S23).
  • The inference result output by the inference unit 252 is transmitted to the worker terminal 3, a terminal device held by a manager at the production site, and the like, and is notified to the worker, the manager, and others via these devices.
  • FIG. 7 is a diagram showing an example of a method in which the worker terminal 3 notifies the prediction result by the monitoring device 5.
  • The worker terminal 3 superimposes the prediction result on the video acquired by the video acquisition unit 35 and notifies the worker.
  • FIG. 7 shows an example of a notification method used when the inference unit 252 predicts the occurrence of an event related to the production device 4, such as a stoppage of the production device 4; hatchings 321 and 322 are superimposed to indicate the locations (devices) where events are predicted to occur. When different events are predicted to occur, different hatching is used for each event.
  • Places with a high probability of event occurrence may be emphasized with hatching that differs from the surroundings. Note that hatching is only an example; the type of event and the probability of occurrence may also be indicated using text, color, symbols, and the like. Details of an event can be confirmed by hovering the mouse over the place where the event is predicted to occur, or by displaying a pop-up when that place is touched. By using a notification method such as that shown in FIG. 7, the worker can see at a glance where an event will occur at the production site.
  • In the above description, the inference result by the inference device 25 is transmitted to another external device such as the worker terminal 3, and that device notifies the inference result; however, the field device 2 provided with the inference device 25 may itself perform the notification.
  • Alternatively, all or some of the devices existing at or near the production site, such as the field device 2 and the worker terminal 3, may be configured to notify the inference result.
  • As described above, the monitoring device 5 includes the learning device 11, which learns, based on the worker information acquired from the worker terminal 3 held by a worker at the production site and the device information acquired from the production device 4 installed at the production site, the events that occur at the production site, the state of the worker when an event occurs, and the state of the production device 4, and generates a trained model.
  • Further, when the monitoring device 5 acquires worker information and device information in a state where the trained model has been generated by the learning device 11, it predicts the events that will occur at the production site based on the acquired worker information and device information and the trained model.
  • The configuration shown in the above embodiment is an example of the contents of the present disclosure; it can be combined with other known techniques, and part of the configuration may be omitted or changed without departing from the gist of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Development Economics (AREA)
  • Marketing (AREA)
  • Manufacturing & Machinery (AREA)
  • General Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • Game Theory and Decision Science (AREA)
  • Quality & Reliability (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Accounting & Taxation (AREA)
  • Computing Systems (AREA)
  • General Factory Administration (AREA)

Abstract

The invention concerns a monitoring device (5) comprising: a worker information acquisition unit (21) that acquires worker information including position information on a worker terminal (3) and text data obtained by converting voice data into text; a device information acquisition unit (23) that acquires device information including history data on operations performed on a production device (4) as well as sensor information indicating a sensing result from a sensor; a trained model storage unit (123) that stores a trained model generated by learning an event that occurred at a production site, the state of a worker at the time of the event, and the state of the production device, on the basis of event information indicating the event that occurred at the production site, the worker information and device information acquired at the event occurrence time, and the worker information and device information acquired during a certain period before the event occurrence time; and an inference unit (252) that predicts the occurrence of an event at the production site from the trained model and newly acquired worker information and device information.
PCT/JP2020/009493 2020-03-05 2020-03-05 Dispositif de surveillance WO2021176663A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202080097903.XA CN115244564A (zh) 2020-03-05 2020-03-05 监视装置
PCT/JP2020/009493 WO2021176663A1 (fr) 2020-03-05 2020-03-05 Dispositif de surveillance
JP2020553664A JP6837615B1 (ja) 2020-03-05 2020-03-05 監視装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/009493 WO2021176663A1 (fr) 2020-03-05 2020-03-05 Dispositif de surveillance

Publications (1)

Publication Number Publication Date
WO2021176663A1 true WO2021176663A1 (fr) 2021-09-10

Family

ID=74673653

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/009493 WO2021176663A1 (fr) 2020-03-05 2020-03-05 Dispositif de surveillance

Country Status (3)

Country Link
JP (1) JP6837615B1 (fr)
CN (1) CN115244564A (fr)
WO (1) WO2021176663A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2022190257A1 (fr) * 2021-03-10 2022-09-15

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003140728A (ja) * 2001-11-01 2003-05-16 Fujitsu Ltd 作業調査分析システム、作業分析装置、および作業調査分析方法
JP2007172131A (ja) * 2005-12-20 2007-07-05 Nec Fielding Ltd 障害予測システム、障害予測方法、障害予測プログラム
JP2019003545A (ja) * 2017-06-19 2019-01-10 横河電機株式会社 操作支援装置、操作支援方法、操作支援プログラム及び記録媒体
WO2020037367A1 (fr) * 2018-08-21 2020-02-27 M2M Pumps Systèmes et procédés de surveillance à distance

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011192040A (ja) * 2010-03-15 2011-09-29 Kddi Corp 予測モデル学習装置、イベント予測装置、予測モデル学習方法およびプログラム


Also Published As

Publication number Publication date
JP6837615B1 (ja) 2021-03-03
CN115244564A (zh) 2022-10-25
JPWO2021176663A1 (fr) 2021-09-10

Similar Documents

Publication Publication Date Title
JP6796373B2 (ja) プラント運転システム及びプラント運転方法
KR102334965B1 (ko) 공장 자동화 설비의 효율적 관리와 생산성 향상을 위한 예지 보전 시스템
AU2014205737A1 (en) Method, device and computer program for monitoring an industrial control system
KR20220042425A (ko) 크레인 및 건설 현장 중 적어도 하나에 대한 관리 방법 및 시스템
JP6812312B2 (ja) プラント支援評価システム及びプラント支援評価方法
CN111882833B (zh) 基于离群参数的设备故障预警方法、装置、设备及介质
US10990090B2 (en) Apparatus and method for automatic detection and classification of industrial alarms
JP6837615B1 (ja) 監視装置
US11138852B2 (en) Remote diagnostics for flame detectors using fire replay technique
EP4038557A1 (fr) Procédé et système d'estimation et de représentation en continu de risque
KR20220043094A (ko) 스마트 공장에 적용가능한 스마트 밴드를 이용한 생산관리장치
US20180307212A1 (en) System and method for the maintenance of an industrial plant
JP2001084035A (ja) 運転監視システム
JP2009059204A (ja) コンピュータリモート制御システム
JP2005071200A (ja) 製造情報管理プログラム
JP2021060852A (ja) アラーム発報装置およびアラーム発報方法
JP2020046771A (ja) 音点検システムおよび音点検方法
JP2005149060A (ja) 監視システム、監視サーバ、監視方法およびプログラム
JP2003067049A (ja) 監視装置
JP2011145206A (ja) 通報側システム、プラント情報通報システム及びその方法
WO2020141541A1 (fr) Dispositif, système et procédé destinés à une action en boucle fermée sur la base d'une analyse de données de machine extraites
JP2022176700A (ja) 工業炉保全支援システム
JP2023123058A (ja) 情報処理装置の制御方法、情報処理装置、制御プログラム、記録媒体、情報処理システム、生産システム、物品の製造方法
CN103248516A (zh) 异常通知双向流程
WO2016099451A9 (fr) Procédé et appareil pour la prospection et la compréhension de gestion de cas

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020553664

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20922576

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20922576

Country of ref document: EP

Kind code of ref document: A1