CN112016689A - Information processing apparatus, prediction discrimination system, and prediction discrimination method - Google Patents


Info

Publication number
CN112016689A
Authority
CN
China
Prior art keywords: unit, evaluation, causal model
Prior art date
Legal status: Granted
Application number
CN202010283149.0A
Other languages
Chinese (zh)
Other versions
CN112016689B (en)
Inventor
西纳修一
前田真彰
樱井祐市
矢崎彻
Current Assignee
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Publication of CN112016689A publication Critical patent/CN112016689A/en
Application granted granted Critical
Publication of CN112016689B publication Critical patent/CN112016689B/en
Legal status: Active


Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06N: Computing arrangements based on specific computational models
    • G06N5/00: Computing arrangements using knowledge-based models
    • G06N5/04: Inference or reasoning models
    • G06N5/046: Forward inferencing; production systems
    • G: Physics
    • G06: Computing; calculating or counting
    • G06Q: Information and communication technology [ICT] specially adapted for administrative, commercial, financial, managerial or supervisory purposes; systems or methods specially adapted for administrative, commercial, financial, managerial or supervisory purposes, not otherwise provided for
    • G06Q10/00: Administration; management
    • G06Q10/06: Resources, workflows, human or project management; enterprise or organisation planning; enterprise or organisation modelling
    • G06Q10/063: Operations research, analysis or management
    • G06Q10/0635: Risk analysis of enterprise or organisation activities
    • G: Physics
    • G06: Computing; calculating or counting
    • G06Q: Information and communication technology [ICT] specially adapted for administrative, commercial, financial, managerial or supervisory purposes; systems or methods specially adapted for administrative, commercial, financial, managerial or supervisory purposes, not otherwise provided for
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/04: Manufacturing
    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02P: Climate change mitigation technologies in the production or processing of goods
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing


Abstract

The invention relates to an information processing apparatus, a prediction discrimination system, and a prediction discrimination method that can estimate a highly applicable causal model. The disclosed apparatus includes: a causal model estimation unit that takes as input measurement data containing explanatory variables and a target variable obtained from a determination target, and estimates one or more causal models representing the relationships between the explanatory variables and the target variable; an evaluation unit that evaluates the one or more causal models using an index indicating the performance of prediction or determination with respect to the target variable, and outputs the causal models whose evaluation results satisfy a predetermined condition; and an editing unit that outputs the causal models output by the evaluation unit, together with the evaluation results, to a display unit.

Description

Information processing apparatus, prediction discrimination system, and prediction discrimination method
Technical Field
The present invention relates to techniques for constructing a causal model and for putting it to practical use.
Background
To improve production efficiency at a manufacturing site, it is necessary to predict failures of manufacturing equipment and perform maintenance in advance, or to identify the cause of defective products at an early stage and take countermeasures.
Such prediction and discrimination models can be constructed using statistical methods such as regression analysis and discriminant analysis, or machine-learning methods such as neural networks. These methods receive setting parameters, sensing data, and the like of manufacturing equipment and production lines as input variables, and output determination or prediction results as target variables. To make the basis of a determination clear, one approach is to construct a causal model representing the relationship between the input variables and the target variable, and to use that causal model for prediction and discrimination.
For the automatic estimation of a causal model, methods such as the SGS (Spirtes, Glymour and Scheines) algorithm and path analysis are generally known, and a causal model can be estimated automatically so that its fit to the data is maximized. However, when the amount of data is small, an erroneous causal model may be estimated. Therefore, before such a model is applied at a manufacturing site, it needs to be corrected according to the judgment of a person familiar with the manufacturing site or its components (hereinafter, a domain knowledge holder). Patent document 1 and the like disclose a mechanism for automatically estimating such a model and editing the result.
Documents of the prior art
Patent document
Patent document 1: JP 2008-217711 publication
However, in the estimation of a causal model, when the emphasis is placed not on estimating the graph structure itself but on applying the model to prediction and discrimination, it is not optimal to select the causal model using the fit of the entire graph to the data as the index. This is because the data fit of the whole model does not necessarily correspond to the performance of prediction and discrimination for a specific target variable (hereinafter, the applicability index). To improve applicability, for example, the subgraph over the set of variables related to the target variable must be weighted more heavily than the subgraph over the set of variables unrelated to the target variable.
Disclosure of Invention
Therefore, an object of the present invention is to provide a technique capable of estimating a highly applicable causal model.
An information processing device according to an aspect of the present invention includes: a causal model estimation unit that estimates one or more causal models representing relationships between explanatory variables obtained from a determination target and target variables, using measurement data including the explanatory variables and the target variables as inputs; an evaluation unit that evaluates the one or more causal models using an index indicating performance of prediction or determination for the target variable, and outputs a causal model in which a result of the evaluation satisfies a predetermined condition; and an editing unit configured to output the causal model output by the evaluation unit and a result of the evaluation to a display unit.
Effects of the invention
According to one aspect of the present invention, a highly applicable causal model can be estimated.
Drawings
Fig. 1 is a schematic configuration diagram of the prediction discrimination apparatus in embodiment 1.
Fig. 2 is a processing flow of the prediction discrimination apparatus in embodiment 1.
Fig. 3 is a data format in embodiment 1.
Fig. 4 is a hypothesis set of causal models in embodiment 1.
Fig. 5 is a schematic configuration diagram of the model evaluation unit 102 in embodiment 1.
Fig. 6 is an example of estimated parameters in embodiment 1.
Fig. 7 is an example of the user display editing unit 103 in embodiment 1.
Fig. 8 is a schematic configuration diagram in embodiment 2.
Fig. 9 is a schematic configuration diagram in embodiment 2.
Fig. 10 is a data format in embodiment 3.
Description of reference numerals:
100... prediction discrimination device
101... causal model estimation unit
102... model evaluation unit
201... data dividing unit
202... learning unit
203... test unit
204... comprehensive evaluation unit
103... user display editing unit
104... model effective utilization unit
111... data storage unit
112... display unit
Detailed Description
Hereinafter, embodiments of the present invention are described with reference to the drawings. The following description and drawings illustrate the present invention and are omitted or simplified as appropriate for clarity. The present invention can also be implemented in various other forms. Unless otherwise limited, each component may be singular or plural.
In order to facilitate understanding of the present invention, the positions, sizes, shapes, ranges, and the like of the respective constituent elements shown in the drawings may not represent actual positions, sizes, shapes, ranges, and the like. Therefore, the present invention is not limited to the position, size, shape, range, etc. disclosed in the drawings.
In the following description, various information may be described in the form of "table", "list", or the like, but various information may be represented by data structures other than these. To indicate that it does not depend on the data structure, "XX table", "XX list", and the like are sometimes referred to as "XX information". In the description of the identification information, expressions such as "identification information", "identifier", "name", "ID" and "number" are used, but these can be replaced with each other.
When there are a plurality of constituent elements having the same or similar functions, the same reference numerals are attached with different suffixes to describe the elements. Note that when it is not necessary to distinguish between these plural constituent elements, the subscript may be omitted for explanation.
In the following description, processing may be described as being performed by executing a program, but since a program performs its specified processing by being executed by a processor (for example, a CPU or GPU) while appropriately using memory resources (for example, memory) and/or interface devices (for example, communication ports), the subject of the processing may be regarded as the processor. Similarly, the subject of processing performed by executing a program may be a controller, apparatus, system, computer, or node having a processor. The subject of such processing may be a computing unit, and may include a dedicated circuit (e.g., an FPGA or ASIC) that performs specific processing.
The program may be installed from a program source to a computer-like device. The program source may be, for example, a program distribution server or a computer-readable storage medium. When the program source is a program distribution server, the program distribution server may include a processor and a storage resource for storing the program to be distributed, and the program to be distributed may be distributed to another computer by the processor of the program distribution server. In the following description, 2 or more programs may be implemented as 1 program, or 1 program may be implemented as 2 or more programs.
[ Embodiment 1 ]
Here, an example is described in which wear of a cutting machine, one example of the equipment, apparatus, and tools used at a manufacturing site, is detected, but the present invention is not limited to this; various objects can serve as the detection or discrimination target. As shown in fig. 3, the prediction and determination device 100 receives measurement data containing 5 variables ((1) manufacturing ID, (2) material, (3) vibration, (4) machining type, and (5) presence or absence of machining abnormality), transmitted repeatedly from the cutting machine each time it machines a workpiece, and stores the measurement data in the data storage unit 111. Fig. 3 shows the prediction and determination device 100 receiving and accumulating measurement data from cutting machines of various numbers and types having manufacturing IDs "ID-09" and "ID-05". For example, the measurement data received from "ID-09" indicates that, when a workpiece made of "steel" was machined, the vibration was "small" (less than a predetermined threshold), the machining type was "type-A" (metal cutting), and there was no machining abnormality. The goal is for the prediction and determination device 100 to create, from items (1) to (4) of the measurement data, a model that determines the presence or absence of a machining abnormality (5). In this case, (1) to (4) are the explanatory variables and (5) is the target variable.
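As an illustration (not part of the patent text), the measurement data of fig. 3 can be pictured as records of the five variables; the field names and values below are assumptions made for the sketch:

```python
# Hypothetical records mirroring the five variables of Fig. 3; the field
# names and values are illustrative assumptions, not the patent's exact data.
measurements = [
    {"manufacturing_id": "ID-09", "material": "steel", "vibration": "small",
     "machining_type": "type-A", "abnormality": "none"},
    {"manufacturing_id": "ID-05", "material": "steel", "vibration": "large",
     "machining_type": "type-B", "abnormality": "present"},
]

# Items (1)-(4) act as explanatory variables; (5) abnormality is the target.
explanatory = [
    {k: r[k] for k in ("manufacturing_id", "material", "vibration", "machining_type")}
    for r in measurements
]
targets = [r["abnormality"] for r in measurements]
```

Separating the records this way mirrors the distinction the text draws between explanatory variables and the target variable.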
As shown in fig. 1, the prediction and determination device 100 includes a data storage unit 111, a causal model estimation unit 101, a model evaluation unit 102, and a user display editing unit 103.
The data storage unit 111 is configured by a general storage device such as an HDD (hard Disk Drive) or an SSD (Solid State Drive) as hardware, and stores the measurement data.
The causal model estimation unit 101 reads the measurement data from the data storage unit 111 and estimates one or more causal models. From the causal models generated by the causal model estimation unit 101, the model evaluation unit 102 selects, as candidates to be presented to the user, causal models whose applicability index (an index indicating the performance of prediction and determination with respect to the target variable) is at or above a predetermined reference value.
The user display editing unit 103 outputs the structure of the causal model estimated by the causal model estimation unit 101, together with the result of the performance evaluation performed by the model evaluation unit 102 using the applicability index, to the display unit 112, a display device generally used as hardware, such as an LCD (Liquid Crystal Display), and presents them to the user. The specific processing performed by the causal model estimation unit 101, the model evaluation unit 102, and the user display editing unit 103 is described later.
The prediction and determination device 100 is configured by a general information processing device such as a PC (Personal Computer) or a server as hardware. The causal model estimation unit 101, the model evaluation unit 102, and the user display editing unit 103 are realized by executing a program. For example, the functions of the causal model estimation Unit 101, the model evaluation Unit 102, and the user display editing Unit 103 are realized by a CPU (Central Processing Unit) of the prediction determination device 100 reading a program from a ROM (Read Only Memory) and executing the program. The program may be read from a storage medium such as a USB (Universal Serial Bus) memory, downloaded from another computer via a network, or the like, and supplied to the prediction and determination device 100. In the present embodiment, the display unit 112 is provided outside the prediction discrimination apparatus 100 as an example, but the prediction discrimination apparatus 100 may include the display unit 112.
Next, the processing in the present embodiment is explained according to the flowchart of fig. 2.
Step 201:
the causal model estimation unit 101 reads and acquires measurement data from the data storage unit 111.
Step 202:
the causal model estimation unit 101 estimates causal models using the measurement data read from the data storage unit 111 as input data. At this point, one or more models are generated as hypotheses. A causal model is a model that explicitly represents the relationships between variables; it can be defined as a graphical model by preparing a node for each variable and forming a graph structure in which related nodes are connected by links. Specifically, a Bayesian network, a Markov network, or the like can be used. As a method for estimating a causal model, for example, the SGS (Spirtes, Glymour and Scheines) algorithm can be used. In the case of SGS, the thresholds for the independence tests between particular nodes can be varied to derive different causal models. Alternatively, a plurality of hypotheses may be generated by obtaining a model that maximizes an index such as MDL (Minimum Description Length) and then randomly changing the presence or absence of links. As a result, a group of causal models such as that shown in fig. 4 can be estimated. Fig. 4 shows, as an example, 4 estimated causal models (a) to (d).
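The random link-toggling idea described above can be sketched as follows. This is an illustrative sketch, not the patent's algorithm: the function name is an assumption, the base model stands in for one maximizing an MDL-style score, and a real implementation would also reject cyclic candidates.

```python
import itertools
import random

def generate_hypotheses(base_links, nodes, n_candidates=4, seed=0):
    """Generate candidate causal models (sets of directed links) by randomly
    toggling one link of a base model, e.g. one maximizing an MDL-style score.
    Illustrative sketch only: cyclic candidates are not rejected here."""
    rng = random.Random(seed)
    possible = list(itertools.permutations(nodes, 2))  # all directed pairs
    hypotheses = [set(base_links)]
    while len(hypotheses) < n_candidates:
        candidate = set(base_links)
        link = rng.choice(possible)
        if link in candidate:
            candidate.discard(link)  # drop an existing link
        else:
            candidate.add(link)      # add a new link
        if candidate not in hypotheses:
            hypotheses.append(candidate)
    return hypotheses

# A-E as in Fig. 6: manufacturing ID, material, machining type, vibration, abnormality.
base = {("A", "B"), ("A", "C"), ("B", "D"), ("B", "E"), ("C", "E")}
models = generate_hypotheses(base, ["A", "B", "C", "D", "E"])
```

Each candidate differs from the base model by the presence or absence of exactly one link, giving a small hypothesis group like the one in fig. 4.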
Step 203:
the model evaluation unit 102 evaluates the one or more causal models generated in step 202 using the applicability index, and selects models as candidates to be presented to the user.
Fig. 5 shows the structure of the model evaluation unit 102. As shown in fig. 5, the model evaluation unit 102 includes a data dividing unit 201, a learning unit 202, a test unit 203, and a comprehensive evaluation unit 204.
If the input data given from the data storage unit 111 were used as-is as evaluation data, the data used for estimating the causal model and the data used for evaluation would overlap, and evaluation could be performed only once, so robust evaluation would not be possible. Therefore, the data dividing unit 201 divides the input data into learning data and test data; the learning unit 202 performs learning using the learning data, and the test unit 203 performs a test using the test data. Learning and testing are repeated while the way the data is divided is changed, and the comprehensive evaluation unit 204 integrates the individual evaluations into an overall evaluation. With this structure, robust evaluation can be performed.
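The repeated split-learn-test-aggregate loop can be sketched minimally as below. The function names and the fold-based splitting scheme are assumptions; the patent only specifies that the division is varied and the results integrated.

```python
def cross_evaluate(records, learn, test, n_folds=3):
    """Repeatedly split the data, learn on one part, test on the rest, and
    aggregate the scores, mirroring the data dividing unit 201, learning
    unit 202, test unit 203, and comprehensive evaluation unit 204."""
    folds = [records[i::n_folds] for i in range(n_folds)]  # data dividing unit 201
    scores = []
    for i in range(n_folds):
        test_data = folds[i]
        learn_data = [r for j, fold in enumerate(folds) if j != i for r in fold]
        params = learn(learn_data)              # learning unit 202
        scores.append(test(params, test_data))  # test unit 203
    return sum(scores) / len(scores)            # comprehensive evaluation unit 204
```

The `learn` and `test` callables stand in for parameter estimation and prediction on a given causal model; the loop is run once per candidate model.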
Here, the learning unit 202 keeps the structure of the model unchanged, inputs the data obtained under each division into the causal model, and re-estimates its parameters. For example, the learning unit 202 can calculate parameters as shown in fig. 6. Here, the causal model is expressed, for example, as a Bayesian network, and the parameters are obtained as probabilities. Fig. 6 shows, as an example, the parameters estimated for the causal model of fig. 4(a). In fig. 6, A, B, C, D, and E stand for manufacturing ID, material, machining type, vibration, and machining abnormality, respectively, and the distributions of the estimated parameters, i.e., the probabilities P(A), P(B|A), P(C|A), P(D|B), and P(E|B,C), are shown as tables. For parameter estimation, maximum likelihood estimation or EAP (Expected A Posteriori) estimation can generally be used. When the parameter P(A) is estimated by maximum likelihood estimation, the number of occurrences of each value of A (manufacturing ID) in the data is counted, and each count is divided by the total number of occurrences. For example, when ID-05 appears 15 times and ID-09 appears 15 times, the total is 30, and the probability P(A) of each is 15/30 = 0.5. The learning unit 202 performs this estimation for all the causal models generated in step 202.
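The maximum likelihood estimation described above, including the 15/30 example, reduces to counting. The sketch below also shows a conditional table in the style of P(E|B,C); the function and field names are assumptions.

```python
from collections import Counter

def ml_estimate(values):
    """Maximum-likelihood estimate of a discrete distribution:
    each count divided by the total number of observations."""
    counts = Counter(values)
    total = sum(counts.values())
    return {v: c / total for v, c in counts.items()}

def ml_conditional(rows, parents, child):
    """Conditional probability table, e.g. P(E | B, C), built by grouping
    the child's values by each combination of parent values."""
    groups = {}
    for row in rows:
        key = tuple(row[p] for p in parents)
        groups.setdefault(key, []).append(row[child])
    return {key: ml_estimate(vals) for key, vals in groups.items()}

# The document's example: ID-05 appears 15 times, ID-09 appears 15 times.
p_a = ml_estimate(["ID-05"] * 15 + ["ID-09"] * 15)
# p_a["ID-05"] == p_a["ID-09"] == 15 / 30 == 0.5
```

EAP estimation, mentioned as an alternative, would instead average over a posterior with a prior on the counts; it is not shown here.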
Next, the test unit 203 performs a test in which the target variable (here, machining abnormality) is predicted from the input variables of the test data (here, the 4 variables other than machining abnormality) using the parameters estimated by the learning unit 202. As a method for predicting a target variable from estimated parameters, the join tree (junction tree) algorithm and the like are known, which can infer the target variable efficiently even when the graph structure is large.
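For the specific graph of fig. 4(a) the prediction step is simple enough to sketch without join tree machinery, because the abnormality node E has only B (material) and C (machining type) as parents; with both observed, predicting E reduces to reading off P(E|B,C). The table values below are illustrative assumptions.

```python
# Hypothetical conditional table P(abnormality | material, machining type),
# in the spirit of Fig. 6; the probability values are illustrative assumptions.
cpt_e = {
    ("steel", "type-A"): {"none": 0.9, "present": 0.1},
    ("steel", "type-B"): {"none": 0.4, "present": 0.6},
}

def predict_abnormality(material, machining_type):
    # With E's parents fully observed, prediction is the most probable
    # entry of P(E | B, C). General queries with unobserved variables
    # would need join tree (junction tree) inference instead.
    distribution = cpt_e[(material, machining_type)]
    return max(distribution, key=distribution.get)
```

The test unit 203 would apply such a predictor to every test record and compare the result against the recorded abnormality value.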
The comprehensive evaluation unit 204 can evaluate each causal model using, as the applicability index, discrimination accuracy indexes such as accuracy and F-measure. The processing flow of data dividing unit 201 → learning unit 202 → test unit 203 → comprehensive evaluation unit 204 is carried out for every model generated in step 202. Finally, the comprehensive evaluation unit 204 selects the causal model with the best applicability index. In the present embodiment the single best model is selected, but one or more causal models whose applicability index satisfies a reference value given as a predetermined condition may be selected instead. This makes it possible to select causal models that meet a predetermined level as candidates. The reference value may be determined according to the type of measurement data and the required evaluation accuracy.
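Both selection policies described above can be sketched together; the model names and scores below are illustrative assumptions, and F-measure is omitted for brevity.

```python
def accuracy(predictions, truths):
    """Discrimination accuracy: the fraction of correct predictions."""
    return sum(p == t for p, t in zip(predictions, truths)) / len(truths)

def select_models(model_scores, reference=None):
    """Return the single best-scoring model, or, when a reference value is
    given, every model whose score meets it, as the comprehensive
    evaluation unit 204 may do under the predetermined condition."""
    if reference is not None:
        return [m for m, s in model_scores.items() if s >= reference]
    return max(model_scores, key=model_scores.get)

scores = {"model-a": 0.60, "model-b": 0.55, "model-c": 0.72}  # illustrative
best = select_models(scores)                        # single best model
candidates = select_models(scores, reference=0.60)  # all models meeting 0.60
```

Choosing between the two policies corresponds to the trade-off the text notes: a single optimum versus a shortlist of candidates meeting a predetermined level.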
Step 204:
the user display editing unit 103 presents the structure of the causal model selected by the comprehensive evaluation unit 204 and its performance evaluation based on the applicability index. For example, as shown in fig. 7, the user display editing unit 103 displays the causal model 701 and the evaluation result 702 on the display unit 112. Fig. 7 shows the case where the model name of the causal model of fig. 4(a) and the value of its applicability index (accuracy: 60%) are displayed on the display unit 112 by the user display editing unit 103.
Step 205:
here, when the user, who is the domain knowledge holder, checks the structure of the displayed causal model and judges that editing is unnecessary, the prediction and determination device 100 receives a press of the edit end button 711 via the display unit 112 or another input device connected to the prediction and determination device 100, such as a keyboard or mouse. The user display editing unit 103 determines whether the press of the edit end button 711 has been accepted; if it determines that the press has been accepted (yes in step 205), it ends the processing. Otherwise (no in step 205), the process proceeds to step 206.
Step 206:
when it is determined that the press of the edit end button 711 has not been accepted (no in step 205), the user display editing unit 103 judges that editing may be necessary and waits without making changes. When the prediction determination device 100 determines that it has accepted an operation indicating that editing is required (for example, a press of the link deletion button 712 or the link addition button 713), the user display editing unit 103 changes the structure by toggling the links of the causal model 701 being displayed. For example, the causal model 701 of fig. 4(a) displayed on the display unit 112 is edited into the form of fig. 4(b) by such operations.
The display and editing for the user need not take the form shown in fig. 7; editing in a text-based, interactive form is also conceivable. For example, the user display editing unit 103 generates a question such as "Is the machining type considered to affect the material?", and when it determines that "no" has been input, it changes the causal model accordingly as described above.
Thereafter, the process returns to step 202, and the prediction discrimination apparatus 100 repeats the subsequent processing. In step 202 after editing, the causal model estimation unit 101 adds hypotheses of new models based on the causal model edited by the user to the hypotheses generated in step 202 before editing. For example, the causal model estimation unit 101 may randomly change the presence or absence of links between other variables in accordance with the change made by the user, or it may simply add the user's editing result to the hypothesis group accumulated so far.
In step 203 after editing, the model evaluation unit 102 evaluates the set of model hypotheses created in step 202 after editing. Here, the model evaluation unit 102 may store the models already evaluated and evaluate only those not yet evaluated, reducing the labor and time required for evaluation.
In step 204 after editing, the user display editing unit 103 presents to the user the change in the applicability index caused by the user's editing, based on the evaluation result of step 203 after editing. For example, the user display editing unit 103 displays a list of the changes in accuracy of the models created and evaluated during the editing process so far, as in the editing history 705 of fig. 7. In fig. 7, the user display editing unit 103 displays, as the editing history 705, the change in the applicability index over the past 5 iterations, including the causal model currently displayed, in time series. Since the evaluation value of the currently displayed causal model 701 (the 5th iteration) is lower than the previous (4th) evaluation value, when the user selects the 4th evaluation value 7051 in the graph of the editing history 705, the user display editing unit 103 displays a pop-up screen 706 showing the information of that iteration. The causal model 7061 and the evaluation result 7062 of that iteration are displayed on the pop-up screen 706, together with a button 714 for returning to that iteration. When the button 714 is pressed, the user display editing unit 103 switches the main screen from the causal model and evaluation result currently displayed to those of the selected iteration. The past data is stored in the data storage unit 111 each time the processing of fig. 2 is performed, and the user display editing unit 103 may display the causal models and evaluation results stored up to that point as the history at the time fig. 7 is displayed.
In this way, the user display editing unit 103 allows the user not only to edit the currently displayed causal model 701 but also to go back to a past model and edit it again.
In step 205 after editing, the user examines, based on the information so far, whether the editing is satisfactory, and if so, ends the editing. As in step 205 described above, when the user display editing unit 103 determines that a press of the edit end button 711 has been received from the user (yes in step 205), it ends the processing. Otherwise (no in step 205), the process proceeds to step 206.
Thus, even when a desired causal model cannot be obtained, the domain knowledge holder can easily edit the causal model. Further, even when the domain knowledge holder cannot determine how to change the causal model, the causal model can be edited to progress to a highly applicable causal model by referring to the past editing history, and a model with higher applicability can be created.
[ Embodiment 2 ]
As embodiment 2, fig. 8 and 9 show an example in which the model created in embodiment 1 is put to further use at a manufacturing site. The prediction discrimination apparatus 800 of embodiment 2 differs from the prediction discrimination apparatus 100 of embodiment 1 in that it includes a model effective utilization unit 104. The prediction determination device 800 is connected to a result output unit 113 that outputs data obtained from the model effective utilization unit 104, and to the device group 114 (devices 114a, 114b, and 114c) that is the source of the measurement data.
Although fig. 8 shows a configuration in which the model effective utilization unit 104 is included in the prediction and determination device 800 (2nd device), it may instead be configured as a separate device, as shown in fig. 9. In fig. 9, the model effective utilization unit 104 is included in the model effective utilization device 902 (3rd device), the data storage unit 111 is included in a separate device 901 (1st device), and the result output unit 113 is included in the result output device 903 (output device). One or more of these can be included in the prediction discrimination device 800 as necessary. With the configuration of fig. 9, both the evaluation and the effective use of the model can be performed in a stable environment. For example, since the processing of the model evaluation unit 102 and the processing of the model effective utilization unit 104 are executed by different devices, the processing load on each device can be reduced, and these functions can be realized by lower-specification devices. Further, since the display unit 112 and the result output unit 113 are separate devices, the system can be used even when the user who edits the model based on its evaluation results and the user who makes effective use of the model are in different environments connected via a network.
The following describes an example in which, after the editing ends in step 205 of embodiment 1, the user display editing unit 103 outputs the causal model to the model effective utilization unit 104 for effective use.
The model effective utilization unit 104 acquires measurement data from the device group 114, and acquires from the model evaluation unit 102 a causal model whose applicability index satisfies a reference value as the predetermined condition selected by the comprehensive evaluation unit 204 (for example, the causal model with the best applicability index), together with the manufacturing IDs included in the measurement data that was input to that causal model. The model effective utilization unit 104 then extracts, from the measurement data received from the device group 114, data having the same manufacturing IDs, and outputs the extracted data to the result output unit 113 together with the causal model satisfying the reference value (for example, the optimal causal model).
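The manufacturing-ID matching step described above can be sketched as follows. This is a minimal illustration only: the class, field, and function names are assumptions for the example and do not appear in the patent.

```python
# Hypothetical sketch of the model effective utilization unit's ID matching.
# Names (Measurement, extract_matching, etc.) are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Measurement:
    manufacturing_id: str
    values: dict  # e.g. sensor name -> reading


def extract_matching(measurements, target_ids):
    """Keep only measurements whose manufacturing ID matches one of the IDs
    that were input to the selected causal model during evaluation."""
    target = set(target_ids)
    return [m for m in measurements if m.manufacturing_id in target]


data = [
    Measurement("P-001", {"temp": 51.2}),
    Measurement("P-002", {"temp": 49.8}),
    Measurement("P-003", {"temp": 55.0}),
]
matched = extract_matching(data, ["P-001", "P-003"])
print([m.manufacturing_id for m in matched])  # ['P-001', 'P-003']
```

The extracted records would then be passed to the result output unit together with the selected causal model.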
The result output unit 113 stores the result, including the causal model and the manufacturing IDs received from the model effective utilization unit 104, in a storage device, and presents the information on an output device such as a display device or an audio output device. This makes it possible to adopt the most highly evaluated causal model as the causal model to be effectively utilized, and to grasp the content of the measurement data that was input to that causal model for evaluation. In example 1, a measure such as an alarm may be taken when a machining abnormality is determined; in this example, the product for which the alarm was issued can additionally be identified from the manufacturing ID output by the result output unit 113.
Further, even while the causal model is being effectively utilized, the data accumulation unit 111 continues to accumulate the measurement data acquired from the device group 114 and inputs the newly accumulated measurement data to the causal model estimation unit 101, so that model relearning is performed. The causal model estimation unit 101 thus evaluates causal models using the latest measurement data accumulated at any time, and the model effective utilization unit 104 can acquire and effectively utilize a new causal model as the most highly evaluated causal model. In this case, whether the causal model estimation unit 101 re-estimates only the learning parameters, re-estimates the structure as well, or has a domain knowledge holder edit the model may be determined according to the use environment or the like.
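The relearning flow above can be sketched as a simple adopt-if-better loop. The patent does not specify the model form or the applicability index, so `fit` and `evaluate` below are toy placeholders introduced only for illustration.

```python
# Illustrative sketch of model relearning on newly accumulated data.
# fit() and evaluate() are hypothetical placeholders, not from the patent.
def relearn(current_model, current_score, new_data, fit, evaluate):
    """Re-estimate a model on newly accumulated data and adopt it only if
    its evaluation score improves on the currently utilized model."""
    candidate = fit(new_data)
    candidate_score = evaluate(candidate, new_data)
    if candidate_score > current_score:
        return candidate, candidate_score  # switch to the relearned model
    return current_model, current_score


# Toy placeholders: the "model" is the data mean, the "index" is
# negative mean squared error (higher is better).
def fit(data):
    return sum(data) / len(data)


def evaluate(model, data):
    return -sum((d - model) ** 2 for d in data) / len(data)


old_data = [1.0, 2.0, 9.0]
new_data = [1.0, 1.1, 0.9]
m0 = fit(old_data)
s0 = evaluate(m0, old_data)
m1, s1 = relearn(m0, s0, new_data, fit, evaluate)
print(m1 == fit(new_data))  # True: the relearned model scored higher
```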
[ example 3 ]
Examples of discriminating the current state were shown in examples 1 and 2, but future prediction can be performed in the same manner. For example, consider a case where the measurement data is accumulated in the data accumulation unit 111 in time series as shown in fig. 10. Fig. 10 shows measurement data stored from the past to the present at times tn, tn-1, tn-2, and so on. The learning unit 202 receives the measurement data accumulated in this way as input, and generates a causal model that predicts the target variable at a later time (for example, time tn) from the explanatory variables at an earlier time (for example, time tn-1). Thus, if the target variable is set to the presence or absence of a failure and the model is trained with explanatory variables consisting of sensor values and set values from times earlier than the failure occurrence, a model that predicts a future state from a past state can be constructed.
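Pairing an earlier explanatory variable with a later target variable amounts to building lagged training pairs from the accumulated time series. A minimal sketch, assuming a single explanatory series and a fixed lag of one step (the patent does not fix either):

```python
# Hypothetical sketch of constructing lagged training pairs for future
# prediction: explanatory value at time t-lag paired with target at time t.
def make_lagged_pairs(series_x, series_y, lag=1):
    """Return (x[t-lag], y[t]) pairs usable as training data for a model
    that predicts the target at time t from the past explanatory value."""
    return [(series_x[t - lag], series_y[t]) for t in range(lag, len(series_y))]


sensor_values = [10, 20, 30, 40]   # explanatory variable over time
failure_flags = [0, 1, 2, 3]       # target variable over time (toy values)
pairs = make_lagged_pairs(sensor_values, failure_flags, lag=1)
print(pairs)  # [(10, 1), (20, 2), (30, 3)]
```

The pairs would then be fed to the learning unit in place of same-time samples.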
[ example 4 ]
In addition, when the prediction discrimination apparatus is used to judge whether a device or the like is normal or abnormal, abnormal data is not necessarily required. For example, suppose that various parameters of the device and the vibration value of a specific portion can be measured, and the vibration value is known to be associated with abnormality of the device. In this case, even if only data with normal vibration values is available, a model that predicts the normal vibration value can be built. In effective utilization, the vibration value is predicted from data other than the vibration value using this model, the difference from the actual vibration value is computed, and abnormal vibration is detected when the difference exceeds a specific threshold.
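The residual-based detection above can be sketched as follows. The patent does not specify the model form, so the causal model is approximated here by a simple least-squares linear predictor trained on normal-only data; all names and numbers are illustrative assumptions.

```python
# Minimal sketch of residual-based anomaly detection using only normal data.
# A one-variable linear predictor stands in for the causal model (assumption).
def fit_linear(xs, ys):
    """Least-squares slope and intercept for one explanatory variable."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx


def is_abnormal(model, x, actual_vibration, threshold):
    """Flag abnormal vibration when the prediction residual exceeds the threshold."""
    slope, intercept = model
    predicted = slope * x + intercept
    return abs(predicted - actual_vibration) > threshold


# Train on normal data only: vibration roughly tracks motor load.
loads = [1.0, 2.0, 3.0, 4.0]
vibrations = [0.5, 1.0, 1.5, 2.0]
model = fit_linear(loads, vibrations)

print(is_abnormal(model, 2.5, 1.25, threshold=0.3))  # False: matches normal trend
print(is_abnormal(model, 2.5, 3.00, threshold=0.3))  # True: abnormal vibration
```

Because the model is fit only to normal behavior, a large residual signals a departure from that behavior without any abnormal training samples.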
The present invention is not limited to the above-described embodiments and includes various modifications. For example, the above embodiments have been described in detail to explain the present invention in an easily understandable manner, and the present invention is not necessarily limited to configurations having all of the described structures. A part of the configuration of one embodiment may be replaced with the configuration of another embodiment, and the configuration of another embodiment may be added to the configuration of one embodiment. Furthermore, for a part of the configuration of each embodiment, other configurations may be added, deleted, or substituted. Each of the above structures, functions, and processing units may be realized partially or entirely in hardware, for example by designing them as an integrated circuit. The above structures, functions, and the like may also be realized in software by a processor interpreting and executing a program that implements each function. Information such as programs, tables, and files for realizing the functions can be stored in a memory, in a recording device such as a hard disk or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD.

Claims (8)

1. An information processing apparatus, comprising:
a causal model estimation unit that estimates one or more causal models representing relationships between explanatory variables obtained from a determination target and target variables, using measurement data including the explanatory variables and the target variables as inputs;
an evaluation unit that evaluates the one or more causal models using an index indicating performance of prediction or determination for the target variable, and outputs a causal model in which a result of the evaluation satisfies a predetermined condition; and
an editing unit configured to output the causal model output by the evaluation unit and the result of the evaluation to a display unit.
2. The information processing apparatus according to claim 1,
the editing unit accepts editing of the causal model output by the evaluation unit,
the evaluation unit outputs the edited causal model received by the editing unit and the result of the evaluation on the edited causal model to a display unit.
3. The information processing apparatus according to claim 2,
the evaluation unit outputs the past causal model edited by the editing unit and information indicating a transition of the evaluation result of the causal model to a display unit.
4. The information processing apparatus according to claim 3,
the evaluation unit outputs a past causal model selected from information indicating a transition of the evaluation result and the evaluation result of the causal model to a display unit.
5. The information processing apparatus according to claim 1,
the evaluation unit outputs, as a causal model satisfying the predetermined condition, a causal model whose index, as a result of the evaluation, satisfies a predetermined reference value.
6. A prediction discrimination system is provided with:
a causal model estimation unit that estimates one or more causal models representing relationships between explanatory variables and target variables, using as input measurement data output from a discrimination target, the measurement data including the explanatory variables, the target variables, and a manufacturing ID;
an evaluation unit that evaluates the one or more causal models using an index indicating performance of prediction or determination for the target variable, and outputs a causal model in which a result of the evaluation satisfies a predetermined condition;
an editing unit configured to output the causal model output by the evaluation unit and a result of the evaluation to a display unit; and
a model effective utilization unit that acquires, from the evaluation unit, a causal model whose evaluation result satisfies a predetermined condition and a manufacturing ID included in measurement data that was input to that causal model, extracts, from the measurement data acquired from the discrimination target, data having the same manufacturing ID, and outputs the extracted data to a display unit together with the causal model satisfying the predetermined condition.
7. The predictive discrimination system according to claim 6,
the prediction discrimination system is composed of the following devices:
a 1st device that accumulates measurement data output from the discrimination target;
a 2nd device having the causal model estimation unit, the evaluation unit, and the editing unit;
a 3rd device that acquires measurement data output from the discrimination target and has the model effective utilization unit;
a display device having the display unit serving as the output destination of the editing unit; and
an output device serving as the output destination of the model effective utilization unit.
8. A prediction discrimination method, characterized in that
a causal model estimation unit receives as input measurement data including explanatory variables and target variables obtained from a discrimination target,
the causal model estimation unit estimates one or more causal models representing relationships between the explanatory variables and the target variables,
an evaluation unit evaluates the one or more causal models using an index indicating performance of prediction or discrimination for the target variables,
the evaluation unit outputs a causal model for which the result of the evaluation satisfies a predetermined condition, and
an editing unit outputs the causal model output by the evaluation unit and the result of the evaluation to a display unit.
CN202010283149.0A 2019-05-28 2020-04-10 Information processing device, prediction discrimination system, and prediction discrimination method Active CN112016689B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-099053 2019-05-28
JP2019099053A JP7247021B2 (en) 2019-05-28 2019-05-28 Information processing device, prediction discrimination system, and prediction discrimination method

Publications (2)

Publication Number Publication Date
CN112016689A true CN112016689A (en) 2020-12-01
CN112016689B CN112016689B (en) 2023-08-18

Family

ID=73506507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010283149.0A Active CN112016689B (en) 2019-05-28 2020-04-10 Information processing device, prediction discrimination system, and prediction discrimination method

Country Status (2)

Country Link
JP (1) JP7247021B2 (en)
CN (1) CN112016689B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113112311A (en) * 2021-05-12 2021-07-13 北京百度网讯科技有限公司 Method for training causal inference model, information prompting method and device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7409421B2 (en) 2022-03-24 2024-01-09 いすゞ自動車株式会社 Model creation device and model creation method
WO2024004384A1 (en) * 2022-06-27 2024-01-04 ソニーグループ株式会社 Information processing device, information processing method, and computer program
WO2024053020A1 (en) * 2022-09-07 2024-03-14 株式会社日立製作所 System and method for estimating factor of difference between simulation result and actual result

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008003866A (en) * 2006-06-22 2008-01-10 Omron Corp Causal structure acquiring device, causal structure acquiring method, causal structure acquiring program and computer readable medium recording it
JP2008217711A (en) * 2007-03-07 2008-09-18 Omron Corp Apparatus for deciding causal structure, control method therefor, and control program therefor
JP2009265713A (en) * 2008-04-22 2009-11-12 Toyota Central R&D Labs Inc Model construction device and program
WO2015122362A1 (en) * 2014-02-14 2015-08-20 オムロン株式会社 Causal network generating system and causal relation data structure
CN106796618A (en) * 2014-10-21 2017-05-31 株式会社日立制作所 Time series forecasting device and time sequence forecasting method
JP2017194730A (en) * 2016-04-18 2017-10-26 株式会社日立製作所 Decision Support System and Decision Support Method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113112311A (en) * 2021-05-12 2021-07-13 北京百度网讯科技有限公司 Method for training causal inference model, information prompting method and device
CN113112311B (en) * 2021-05-12 2023-07-25 北京百度网讯科技有限公司 Method for training causal inference model and information prompting method and device

Also Published As

Publication number Publication date
JP2020194320A (en) 2020-12-03
CN112016689B (en) 2023-08-18
JP7247021B2 (en) 2023-03-28

Similar Documents

Publication Publication Date Title
CN112016689B (en) Information processing device, prediction discrimination system, and prediction discrimination method
JP6354755B2 (en) System analysis apparatus, system analysis method, and system analysis program
JP6652699B2 (en) Anomaly evaluation program, anomaly evaluation method, and information processing device
JP6875179B2 (en) System analyzer and system analysis method
US20170261403A1 (en) Abnormality detection procedure development apparatus and abnormality detection procedure development method
CN111459700A (en) Method and apparatus for diagnosing device failure, diagnostic device, and storage medium
JP6521096B2 (en) Display method, display device, and program
JP6183450B2 (en) System analysis apparatus and system analysis method
EP3859472B1 (en) Monitoring system and monitoring method
JP2009064407A (en) Process analysis apparatus, process analysis method, and process analysis program
CN112231181A (en) Data abnormal update detection method and device, computer equipment and storage medium
JP6737277B2 (en) Manufacturing process analysis device, manufacturing process analysis method, and manufacturing process analysis program
JP7493930B2 (en) Information processing method, information processing device, production system, program, and recording medium
CN117114454B (en) DC sleeve state evaluation method and system based on Apriori algorithm
EP2634733A1 (en) Operations task management system and method
JPWO2014132611A1 (en) System analysis apparatus and system analysis method
US20190265088A1 (en) System analysis method, system analysis apparatus, and program
CN113722134A (en) Cluster fault processing method, device and equipment and readable storage medium
JP2007164346A (en) Decision tree changing method, abnormality determination method, and program
JP2022191680A (en) Data selection support device, and data selection support method
JP2014215908A (en) Power consumption estimation program, power consumption estimation method, and power consumption estimation device
JP2023036469A5 (en)
WO2016163008A1 (en) Fault diagnostic device and fault diagnostic method
JP2019047188A (en) Analysis management system and analysis management method
JP7392415B2 (en) Information processing program, information processing device, computer readable recording medium, and information processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant