CN115244564A - Monitoring device
- Publication number
- CN115244564A (application number CN202080097903.XA)
- Authority
- CN
- China
- Prior art keywords
- information
- event
- operator
- unit
- production site
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y10/00—Economic sectors
- G16Y10/25—Manufacturing
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
A monitoring device (5) comprises: an operator information acquisition unit (21) that acquires operator information including position information of an operator terminal (3) and text data obtained by text conversion of voice data; a device information acquisition unit (23) that acquires device information including history data of operations performed on a production device (4) and sensor information representing sensing results obtained by a sensor; a trained model storage unit (123) that stores a trained model generated by learning the correspondence between an event occurring at the production site, the state of an operator at the time of the event, and the state of the production device, based on event information indicating the event, on operator information and device information acquired at the timing the event occurred, and on operator information and device information acquired within a predetermined period before that timing; and an inference unit (252) that predicts an event occurring at the production site based on the trained model and newly acquired operator information and device information.
Description
Technical Field
The present invention relates to a monitoring device for monitoring a state of a production site.
Background
At a production site, information obtained on site is monitored, and the monitoring results are used for various purposes such as improving production efficiency, assisting operators, and maintaining the production system.
For example, patent document 1 describes an invention in which, when a failure occurs, information such as the content of the operation performed by an operator to cope with the failure and the operation method is stored in association with information on the place where the failure occurred. When a new failure occurs, the operator can then consult the information on operations performed in the past at that place to cope with the new failure.
Patent document 1: Japanese Patent Laid-Open Publication No. 2017-068397
Disclosure of Invention
According to the invention described in patent document 1, when an event such as a failure occurs, the time required for the necessary work can be shortened, making the work more efficient. On the other hand, if an event likely to occur in the future could be predicted before it occurs, the work required after the event could be prepared in advance, and the necessary work could then be performed efficiently even after the event occurs. For example, if a failure of the apparatus can be predicted, preparations such as securing the components necessary for repairing the apparatus and arranging maintenance workers can be made. Further, if a stop of the apparatus can be predicted, the cause of the stop can be eliminated in advance.
The present invention has been made in view of the above, and an object of the present invention is to provide a monitoring device capable of notifying an operator in advance of an event that will occur at a production site.
In order to solve the above problems and achieve the object, the present invention provides a monitoring device for monitoring the state of a production site, the monitoring device including: an operator information acquisition unit that acquires operator information including position information of an operator terminal held by an operator and text data obtained by text conversion of voice data including the content of the operator's conversations; and a device information acquisition unit that acquires device information including history data of operations performed on a production device installed at the production site and sensor information indicating sensing results obtained by a sensor installed on the production device. The monitoring device further includes a trained model storage unit that stores a trained model generated by learning an event occurring at the production site, the state of an operator at the time of the event, and the state of the production device at the time of the event, based on event information indicating the event, on operator information and device information acquired at the event occurrence timing, which is the timing at which the event occurred, and on operator information and device information acquired in a predetermined period before the event occurrence timing. The monitoring device further includes an inference unit that, when operator information and device information are newly acquired, predicts an event occurring at the production site based on the trained model and the newly acquired operator information and device information.
Advantageous Effects of Invention
The monitoring device according to the present invention achieves the effect of enabling an operator to be notified in advance of an event that will occur at a production site.
Drawings
Fig. 1 is a diagram showing a configuration example of a monitoring system including a monitoring device according to an embodiment.
Fig. 2 is a diagram showing a configuration example of the learning device.
Fig. 3 is a diagram showing a configuration example of a data acquisition unit included in the learning apparatus.
Fig. 4 is a flowchart showing a learning operation performed by the learning device.
Fig. 5 is a diagram showing a configuration example of the inference device.
Fig. 6 is a flowchart showing an inference operation performed by the inference apparatus.
Fig. 7 is a diagram showing an example of a method for notifying the prediction result obtained by the monitoring device by the operator terminal.
Detailed Description
Hereinafter, a monitoring device according to an embodiment of the present invention will be described in detail with reference to the drawings. The present invention is not limited to this embodiment.
Description of the Preferred Embodiment
Fig. 1 is a diagram showing a configuration example of a monitoring system including a monitoring device according to an embodiment. The monitoring system 100 includes a management server 1, a field device 2, an operator terminal 3, and a production device 4.
The management server 1 includes: a storage device 12 that stores the information acquired by the field device 2 from the operator terminal 3 and the production device 4; and a learning device 11 that learns an event occurring at the production site, the state of an operator at the time of the event, and the state of the production apparatus 4 at the time of the event, using the information stored in the storage device 12. Examples of events occurring at the production site include stoppage of the production apparatus 4, failure of the production apparatus 4, repair of the production apparatus 4, injury to an operator, and the like. The learning device 11 will be described in detail later.
The storage device 12 includes an operator information storage unit 121, a device information storage unit 122, and a trained model storage unit 123. The operator information storage unit 121 stores operator information, acquired from the operator terminal 3 by the field device 2, that includes various information on the state of an operator. The device information storage unit 122 stores device information, acquired by the field device 2 from the production device 4, that includes various information on the state of the production device 4. The trained model storage unit 123 stores the trained model generated by the learning device 11. The management server 1 may be connected to the field device 2 via the internet or the like; that is, the monitoring system 100 may be implemented by cloud computing.
The field device 2 is, for example, an industrial personal computer installed at the production site, and has a function of communicating with the operator terminal 3 and the production apparatus 4 via a wired or wireless line. The field device 2 may also be a programmable logic controller, a programmable display, or the like. In the configuration example shown in fig. 1, one field device 2 is used, but a configuration including two or more field devices 2 is also possible.
The field device 2 has: an operator information acquisition unit 21 that acquires various information output from the operator terminal 3 and aggregates it into operator information; an operator information transmitting unit 22 that transmits the operator information to the management server 1; a device information acquisition unit 23 that acquires various information output from the production apparatus 4 and aggregates it into device information; and a device information transmitting unit 24 that transmits the device information to the management server 1. Here, the operator information includes time information indicating the acquisition time of the various information and identification information of the operator terminal 3 that is the acquisition source. Likewise, the device information includes time information indicating the acquisition time of the various information and identification information of the production apparatus 4 that is the acquisition source. The operator information acquisition unit 21 repeatedly acquires operator information at a fixed cycle; similarly, the device information acquisition unit 23 repeatedly acquires device information at a fixed cycle.
In addition, the field device 2 has an inference device 25. When the operator information acquisition unit 21 and the device information acquisition unit 23 newly acquire operator information and device information in a state where learning by the learning device 11 of the management server 1 is complete, that is, in a state where the trained model storage unit 123 of the management server 1 stores a trained model, the inference device 25 performs inference processing based on the newly acquired operator information and device information and the stored trained model. In this inference processing, an event occurring at the production site is inferred. The inference device 25 will be described in detail later.
The components of the field device 2, together with the learning device 11 and the storage device 12 of the management server 1, constitute the monitoring device 5. Fig. 1 shows a configuration in which the field device 2 acquires the above-described operator information and device information from the operator terminal 3 and the production device 4 and transmits them to the management server 1, but the management server 1 may instead be provided with the functions of the components of the field device 2 shown in fig. 1. That is, the management server 1 may acquire the information directly from the operator terminal 3 and the production apparatus 4.
The operator information transmitting unit 22 sequentially transmits the operator information acquired by the operator information acquisition unit 21 to the management server 1, for example at a fixed cycle. Similarly, the device information transmitting unit 24 sequentially transmits the device information acquired by the device information acquisition unit 23 to the management server 1. The timing at which the operator information transmitting unit 22 transmits the operator information may be the same as or different from the timing at which the device information transmitting unit 24 transmits the device information.
The operator terminal 3 is a terminal device, such as a tablet terminal or a smartphone, held by an operator at the production site. The monitoring system 100 includes one or more operator terminals 3.
The operator terminal 3 includes: a display unit 31; an operation unit 32; a position information acquisition unit 33 that acquires position information of the operator terminal 3; a voice acquisition unit 34 that collects sound and generates voice data including the content of the operator's conversations; an image acquisition unit 35 that captures images and generates video data; and a voice conversion unit 36 that converts the conversation content included in the voice data generated by the voice acquisition unit 34 into text data. The position information, the video data, and the text data constitute the operator information. That is, the operator information includes the position information of the operator terminal 3, text data obtained by converting the operator's voice into text, video data captured by the operator terminal 3, time information, and identification information of the operator terminal 3. The operator terminal 3 need not include the voice conversion unit 36; in this case, the field device 2 or the management server 1 converts the operator's conversation content into text data as necessary.
The production apparatus 4 encompasses the various apparatuses provided in a production line and is, for example, an industrial robot. The monitoring system 100 includes one or more production apparatuses 4.
The production apparatus 4 has: an information collection unit 41 that collects information on the operating state of the production apparatus 4; and an image acquisition unit 42 that captures images and generates video data. The information collected by the information collection unit 41 includes an operation log 411, which is history data of operations performed by operators on the production apparatus 4, and sensor information 412, which indicates sensing results obtained by various sensors (not shown). The video data, the operation log 411, and the sensor information 412 constitute the above-described device information. That is, the device information includes video data, the operation log 411, the sensor information 412, time information, and identification information of the production apparatus 4.
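To make the two record layouts above concrete, the following sketch models the operator information and device information as simple data classes. The field names and types are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class OperatorInfo:
    terminal_id: str          # identification information of the operator terminal 3
    timestamp: float          # time information (acquisition time)
    position: tuple           # position information of the terminal
    text: str                 # conversation content converted from voice data
    video_frame: bytes = b""  # video data captured by the terminal

@dataclass
class DeviceInfo:
    device_id: str            # identification information of the production apparatus 4
    timestamp: float          # time information (acquisition time)
    operation_log: list       # history of operations performed on the apparatus
    sensor_values: dict       # sensing results from the apparatus's sensors
    video_frame: bytes = b""  # video data captured at the apparatus

# Example records of the kind the field device 2 would aggregate.
op = OperatorInfo("terminal-01", 1700000000.0, (12.5, 3.0), "starting setup")
dev = DeviceInfo("robot-07", 1700000000.0, ["start"], {"temp": 42.0})
print(op.terminal_id, dev.device_id)
```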
Next, details of the learning device 11 will be described with reference to fig. 2 to 4.
Fig. 2 is a diagram showing a configuration example of the learning device 11. For convenience of explanation, fig. 2 also shows a storage device 12 connected to the learning device 11. As shown in fig. 2, the learning device 11 includes a data acquisition unit 111 and a model generation unit 112. The learning device 11 is realized by the arithmetic device 10 constituting the management server 1.
The data acquisition unit 111 extracts information related to events occurring at the production site from the operator information stored in the operator information storage unit 121 and the device information stored in the device information storage unit 122, and generates learning data. Fig. 3 is a diagram showing a configuration example of the data acquisition unit 111 included in the learning device 11. The data acquisition unit 111 includes an event detection unit 111A and a learning data generation unit 111B. The event detection unit 111A analyzes the stored operator information and device information and detects events that occurred at the production site. The learning data generation unit 111B generates learning data by associating the operator information and device information acquired at the timing at which an event occurred, the operator information and device information acquired in a predetermined period before that timing, and event information indicating the event that occurred.
The model generation unit 112 learns the correspondence between an event occurring at the production site, the state of the operator at the time of the event, and the state of the production apparatus 4 at the time of the event, based on the learning data generated by the data acquisition unit 111, and generates a trained model. As the machine learning algorithm used in the model generation unit 112, a Bayesian network, a Monte Carlo method, or the like is conceivable.
Fig. 4 is a flowchart showing the learning operation performed by the learning device 11. For example, when the management server 1 receives a user operation instructing the start of the learning operation, the operation shown in fig. 4 starts.
When the operation is started, the learning device 11 first starts analyzing the data stored in the storage device 12, specifically the operator information stored in the operator information storage unit 121 and the device information stored in the device information storage unit 122, to check for events that occurred at the production site. This processing is performed by the event detection unit 111A of the data acquisition unit 111. Specifically, the event detection unit 111A analyzes the operator information and device information acquired at the earliest time among the stored operator information and device information, and checks whether an event occurred (step S11). For example, when the sensor information 412 included in the device information indicates an abnormality, the event detection unit 111A determines that an event has occurred. Likewise, when the operation log 411 included in the device information includes information indicating that an operation bringing the production device 4 to an emergency stop was performed, the event detection unit 111A determines that an event has occurred. Further, the event detection unit 111A determines that an event has occurred when the text data included in the operator information contains a word that can be associated with the occurrence of an event, such as "oh!", "huh?", "this is bad", "what should I do?", or "that's strange". Such words are registered in the management server 1 as keywords associated with the occurrence of events, and the event detection unit 111A determines that an event has occurred when a registered keyword is detected.
The event detection unit 111A may also determine whether an event has occurred based on operator actions obtained by analyzing the video data.
When the event detection unit 111A detects the occurrence of an event (step S11: Yes), the learning data generation unit 111B generates learning data (step S12). Specifically, the learning data generation unit 111B extracts, from the operator information storage unit 121 and the device information storage unit 122 of the storage device 12, the operator information and device information acquired within a fixed period ending at the occurrence time of the detected event, and generates the learning data from the extracted operator information and device information together with event information indicating the detected event. The learning data consists of the position information and text data included in the operator information, the operation log 411 and sensor information 412 included in the device information, and the event information.
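The window extraction in step S12 might look like the following sketch, where `WINDOW_SECONDS` stands in for the patent's unspecified "fixed period" and the record format is an assumed timestamped dictionary.

```python
WINDOW_SECONDS = 300  # assumed length of the fixed period before the event

def extract_window(records, event_time, window=WINDOW_SECONDS):
    """Return the records acquired within the fixed period ending at
    the event occurrence time, i.e. [event_time - window, event_time]."""
    return [r for r in records
            if event_time - window <= r["timestamp"] <= event_time]

records = [{"timestamp": t} for t in (0, 100, 350, 590, 600, 700)]
window = extract_window(records, event_time=600)
print([r["timestamp"] for r in window])  # records at 350, 590 and 600 remain
```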
The event information is, for example, the name of the event. The event information may be generated by the learning data generation unit 111B from the content of the operator information and the device information, or may be input by the user of the management server 1 using an input device such as a keyboard or a mouse. When event information is input by a user, for example, the learning device 11, upon detection of an event by the event detection unit 111A, displays the operator information and device information at the event detection time on a display device (not shown in fig. 1 or elsewhere) and lets the user input the event information while checking the display. When the event information is generated by the learning data generation unit 111B, the generated event information may likewise be displayed on a display device so that the user can confirm its content and correct it as needed.
When the learning data generation unit 111B finishes creating the learning data, the model generation unit 112 performs learning processing using the learning data and generates a trained model (step S13). As described above, the learning processing uses an existing algorithm such as a Bayesian network or a Monte Carlo method. The model generation unit 112 performs learning by associating the position information, text data, operation log 411, sensor information 412, and event information included in the learning data.
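The patent leaves the learning algorithm open, naming Bayesian networks and Monte Carlo methods only as candidates. As a minimal stand-in for step S13, the sketch below learns a conditional frequency table mapping discretized feature tuples (for example, position zone, keyword flag, emergency-stop flag, sensor state) to observed events; it is not an implementation of either named algorithm.

```python
from collections import Counter, defaultdict

class FrequencyModel:
    """Stand-in trained model: counts how often each event followed
    each discretized (operator state, device state) feature tuple."""

    def __init__(self):
        self.counts = defaultdict(Counter)  # features -> Counter of event names

    def fit(self, samples):
        """samples: iterable of (features, event_name) pairs built
        from the learning data."""
        for features, event in samples:
            self.counts[features][event] += 1

    def predict(self, features):
        """Return (most likely event, estimated probability), or
        (None, 0.0) for feature tuples never seen during learning."""
        counter = self.counts.get(features)
        if not counter:
            return None, 0.0
        event, n = counter.most_common(1)[0]
        return event, n / sum(counter.values())

model = FrequencyModel()
model.fit([(("zone_a", True, False, "hot"), "device_stop"),
           (("zone_a", True, False, "hot"), "device_stop"),
           (("zone_a", True, False, "hot"), "device_failure")])
print(model.predict(("zone_a", True, False, "hot")))  # device_stop, p = 2/3
```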
When the learning processing by the model generation unit 112 in step S13 is completed, the process returns to step S11, and the event detection unit 111A analyzes the operator information and device information acquired at the earliest time among the not-yet-checked operator information and device information, and checks whether an event occurred.
If no event occurring at the production site is detected (step S11: No), the event detection unit 111A checks whether the data analysis is complete, that is, whether any unchecked operator information or device information remains (step S14). If the data analysis is not complete (step S14: No), the process returns to step S11, and the event detection unit 111A analyzes the operator information and device information acquired at the earliest time among the unchecked operator information and device information, and checks whether an event occurred.
On the other hand, when the data analysis is complete (step S14: Yes), the model generation unit 112 outputs the trained model to the trained model storage unit 123, the trained model storage unit 123 stores it (step S15), and the learning operation by the learning device 11 ends.
When the trained model storage unit 123 already stores a trained model, that is, when the learning device 11 has performed the learning operation according to the flowchart of fig. 4 and generated a trained model in the past, the learning device 11, upon receiving an operation instructing the start of the learning operation, performs relearning according to the flowchart of fig. 4 and updates the trained model stored in the trained model storage unit 123.
Next, details of the inference device 25 will be described with reference to fig. 5 and 6.
Fig. 5 is a diagram showing a configuration example of the inference device 25. For convenience of explanation, fig. 5 also shows other devices and components connected to the inference device 25. As shown in fig. 5, the inference device 25 includes a data acquisition unit 251 and an inference unit 252. The inference device 25 is realized by the arithmetic device 20 constituting the field device 2.
The data acquisition unit 251 acquires the data used for inference processing from the operator information output by the operator information acquisition unit 21 and the device information output by the device information acquisition unit 23. Specifically, the data acquisition unit 251 acquires the position information and text data from the operator information, and the operation log 411 and sensor information 412 from the device information. In the following description, for convenience, the position information, text data, operation log 411, and sensor information 412 acquired by the data acquisition unit 251 are collectively referred to as "inference data".
The inference unit 252 infers an event occurring at the production site based on the inference data acquired by the data acquisition unit 251 and the trained model stored in the trained model storage unit 123, and outputs the inferred event as the inference result. The inference result output by the inference unit 252 is transmitted to, for example, the operator terminal 3, which displays the inference result 311 on the display unit 31. The inference result may also be transmitted to other devices, such as the management server 1, the production apparatus 4, or a terminal device (not shown) held by a manager at the production site.
Fig. 6 is a flowchart showing the inference operation performed by the inference device 25. The operation shown in fig. 6 is executed, for example, when the field device 2 acquires operator information and device information.
When the inference device 25 starts operating, the data acquisition unit 251 first performs data acquisition processing, acquiring inference data from the operator information output by the operator information acquisition unit 21 and the device information output by the device information acquisition unit 23 (step S21). Next, the inference unit 252 performs inference processing using the inference data acquired by the data acquisition unit 251 and the trained model stored in the trained model storage unit 123, and predicts an event occurring at the production site (step S22). In doing so, the inference unit 252 also uses inference data acquired in the past to infer which events will occur in the future, at which locations in the production site, and with what probability. When the inference processing ends, the inference unit 252 outputs the inference result (step S23).
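Steps S21 to S23 can be sketched end to end as follows. The trained-model interface here (a lookup from an inference-data tuple to an event name and probability) is a placeholder assumption, as are the record formats.

```python
def build_inference_data(operator_info, device_info):
    """Step S21: pick out the fields used for inference - position and
    text from the operator information, operation log and sensor
    information from the device information."""
    return (operator_info["position"],
            operator_info["text"],
            tuple(device_info["operation_log"]),
            tuple(sorted(device_info["sensor_values"].items())))

def infer(trained_model, inference_data):
    """Step S22: query the trained model for the predicted event and
    its probability; unknown inputs yield no predicted event."""
    return trained_model.get(inference_data, ("no_event", 0.0))

# A toy trained model with one learned association.
trained_model = {
    ((3, 4), "strange noise", ("start",), (("temp", 95),)): ("device_stop", 0.8),
}
data = build_inference_data(
    {"position": (3, 4), "text": "strange noise"},
    {"operation_log": ["start"], "sensor_values": {"temp": 95}})
print(infer(trained_model, data))  # step S23: output the inference result
```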
The inference result output by the inference unit 252 is transmitted to the operator terminal 3, a terminal device held by a manager of the production site, and the like, and is notified to the operator, the manager, and the like via these devices.
As an example, a method by which the operator terminal 3 notifies the operator of the inference result will be described. Fig. 7 is a diagram showing an example of a method of notifying, via the operator terminal 3, the prediction result obtained by the monitoring device 5. In the example shown in fig. 7, the operator terminal 3 superimposes the prediction result on the video acquired by the image acquisition unit 35 and presents it to the operator. Fig. 7 shows the case where the inference unit 252 predicts the occurrence of an event related to the production apparatus 4, such as stoppage of the production apparatus 4; the places (apparatuses) where the event is predicted to occur are marked with hatching 321 and 322 indicating the prediction. When the occurrence of different events is predicted, a different hatching pattern is used for each event. A location where the occurrence probability of an event is high may be highlighted with hatching different from the surrounding area. The hatching is only an example; text, color, symbols, or the like may be used as long as the type and occurrence probability of an event can be distinguished. The details of an event can be confirmed by, for example, hovering the mouse over the location where the event is predicted to occur, or by a pop-up display shown when that location is touched. With the notification method of fig. 7, the operator can clearly understand where in the production site the event will occur.
In the present embodiment, the inference result obtained by the inference device 25 is transmitted to an external device such as the operator terminal 3, and the inference result is notified by that device; however, the inference result may instead be notified by the field device 2, which includes the inference device 25. Further, the inference result may be notified by all or some of the plurality of devices present at or near the production site, such as the field device 2 and the operator terminal 3.
As described above, the monitoring device 5 according to the present embodiment includes the learning device 11. On the basis of the operator information acquired from the operator terminal 3 held by an operator working at the production site and the apparatus information acquired from the production apparatus 4 installed at the production site, the learning device 11 learns the relationship between an event occurring at the production site, the state of the operator at the time of the event, and the state of the production apparatus 4, and generates a trained model. Once the trained model has been generated by the learning device 11, whenever the monitoring device 5 acquires new operator information and apparatus information, it predicts an event occurring at the production site based on the acquired information and the trained model. This makes it possible to notify the operator of an event at the production site before it occurs, contributing to risk prediction and preventive maintenance. Further, by feeding the prediction result back to the operator, troubles involving the production apparatus 4 and the operator can be prevented before they arise, improving the safety and reliability of the production site.
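The learning side pairs each event with the operator and apparatus information acquired at the event timing and over a predetermined period before it, as the claims below describe. A minimal sketch of assembling such training records follows; the function name, record layout, and ten-minute window are assumptions for illustration.

```python
from datetime import datetime, timedelta

def build_training_records(events, observations, window=timedelta(minutes=10)):
    """Pair each event with the observations collected at the event timing
    and during a window before it.

    events:       list of (timestamp, event_name)
    observations: list of (timestamp, operator_info, apparatus_info)
    """
    records = []
    for ev_time, ev_name in events:
        context = [(op, ap) for t, op, ap in observations
                   if ev_time - window <= t <= ev_time]  # pre-event window + event timing
        records.append({"event": ev_name, "context": context})
    return records

obs = [
    (datetime(2020, 3, 5, 10, 0), {"pos": "A"}, {"temp": 70.0}),
    (datetime(2020, 3, 5, 10, 5), {"pos": "A"}, {"temp": 90.0}),
]
records = build_training_records([(datetime(2020, 3, 5, 10, 6), "device_stop")], obs)
# records[0]["event"] == "device_stop"; both observations fall inside the window
```

Records of this shape are what a model generation unit would consume to learn the correspondence between events and the preceding operator and apparatus states.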
The configuration described in the above embodiment is one example of the contents of the present invention; it may be combined with other known techniques, and a part of the configuration may be omitted or modified without departing from the gist of the present invention.
Description of the reference numerals
The system comprises a management server 1, a field device 2, an operator terminal 3, a production apparatus 4, a monitoring device 5, arithmetic devices 10 and 20, a learning device 11, a storage device 12, an operator information acquisition unit 21, an operator information transmission unit 22, an apparatus information acquisition unit 23, an apparatus information transmission unit 24, an inference device 25, a display unit 31, an operation unit 32, a position information acquisition unit 33, a voice acquisition unit 34, an image acquisition unit 35, an information collection unit 41, a video acquisition unit 42, a monitoring system 100, data acquisition units 111 and 251, an event detection unit 111A, a learning data generation unit 111B, a model generation unit 112, an operator information storage unit 121, an apparatus information storage unit 122, a trained model storage unit 123, an inference unit 252, an inference result 311, an operation log 411, and sensor information.
Claims (6)
1. A monitoring device for monitoring the condition of a production site,
the monitoring device is characterized by comprising:
an operator information acquisition unit that acquires operator information including position information of an operator terminal held by an operator and text data obtained by text conversion of voice data containing the operator's speech;
an apparatus information acquisition unit that acquires apparatus information including history data of operations performed on a production apparatus installed at the production site and sensor information indicating a sensing result obtained by a sensor installed at the production apparatus;
a trained model storage unit that stores a trained model generated by learning an event occurring at the production site, the state of the operator at the time of the event, and the state of the production apparatus at the time of the event, on the basis of event information indicating an event that occurred at the production site, the operator information and the apparatus information acquired at the event occurrence timing, which is the timing at which the event occurred, and the operator information and the apparatus information acquired in a predetermined period before the event occurrence timing; and
an inference unit that, when the operator information and the apparatus information are newly acquired, predicts an event occurring at the production site based on the trained model and the newly acquired operator information and apparatus information.
2. The monitoring device of claim 1,
the monitoring device includes a model generation unit that learns a correspondence relationship among an event occurring at the production site, the state of the operator at the time of the event, and the state of the production apparatus at the time of the event, based on the event information, the position information, the text data, the history data, and the sensor information acquired at the event occurrence timing, and on the position information, the text data, the history data, and the sensor information acquired during the predetermined period, and generates the trained model.
3. The monitoring device of claim 1 or 2,
the monitoring device includes a display unit that displays the prediction result obtained by the inference unit.
4. The monitoring device of claim 3,
the inference unit predicts an event occurring at the production site and a location where the event will occur, and
the display unit displays, on the basis of the prediction result obtained by the inference unit, an indication of the predicted event at the position, within an image in which the production site is captured, where the event is predicted to occur.
5. The monitoring device of claim 1 or 2,
the inference unit transmits a prediction result of an event occurring at the production site to the operator terminal, and causes the operator terminal to notify the operator of the prediction result.
6. The monitoring device of claim 5,
the inference unit predicts an event occurring at the production site and a location where the event will occur, and causes the operator terminal to display an indication of the predicted event at the position, within an image in which the production site is captured, where the event is predicted to occur.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/009493 WO2021176663A1 (en) | 2020-03-05 | 2020-03-05 | Monitoring device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115244564A true CN115244564A (en) | 2022-10-25 |
Family
ID=74673653
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080097903.XA Pending CN115244564A (en) | 2020-03-05 | 2020-03-05 | Monitoring device |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP6837615B1 (en) |
CN (1) | CN115244564A (en) |
WO (1) | WO2021176663A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022190257A1 (en) * | 2021-03-10 | 2022-09-15 | 日本電信電話株式会社 | Learning device, estimation device, methods therefor, and program |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003140728A (en) * | 2001-11-01 | 2003-05-16 | Fujitsu Ltd | Work survey analyzing system, work analyzing device and work survey analyzing method |
JP2007172131A (en) * | 2005-12-20 | 2007-07-05 | Nec Fielding Ltd | Failure prediction system, failure prediction method and failure prediction program |
JP2011192040A (en) * | 2010-03-15 | 2011-09-29 | Kddi Corp | Predicted model learning system, event predicting system, method for learning predicted model, and program |
JP7252703B2 (en) * | 2017-06-19 | 2023-04-05 | 横河電機株式会社 | Operation support device, operation support method, operation support program, and recording medium |
WO2020037367A1 (en) * | 2018-08-21 | 2020-02-27 | M2M Pumps | Remote monitoring systems and methods |
2020
- 2020-03-05 CN CN202080097903.XA patent/CN115244564A/en active Pending
- 2020-03-05 WO PCT/JP2020/009493 patent/WO2021176663A1/en active Application Filing
- 2020-03-05 JP JP2020553664A patent/JP6837615B1/en active Active
Also Published As
Publication number | Publication date |
---|---|
JPWO2021176663A1 (en) | 2021-09-10 |
JP6837615B1 (en) | 2021-03-03 |
WO2021176663A1 (en) | 2021-09-10 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||