WO2021079445A1 - Display method, display program, and information processing device - Google Patents

Display method, display program, and information processing device

Info

Publication number
WO2021079445A1
Authority
WO
WIPO (PCT)
Prior art keywords
model
data
deal
concept drift
user needs
Prior art date
Application number
PCT/JP2019/041584
Other languages
English (en)
Japanese (ja)
Inventor
勉 石田 (Tsutomu Ishida)
Original Assignee
富士通株式会社 (Fujitsu Limited)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通株式会社 (Fujitsu Limited)
Priority to JP2021553215A priority Critical patent/JP7363911B2/ja
Priority to PCT/JP2019/041584 priority patent/WO2021079445A1/fr
Publication of WO2021079445A1 publication Critical patent/WO2021079445A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Definitions

  • the present invention relates to a display method, a display program, and an information processing device.
  • concept drift means that the quality of data input to the AI system changes over time.
  • the AI system makes various inferences using a learner on the input data.
  • learning of the learner is performed at the time of development. Therefore, when concept drift occurs, the accuracy of the AI system may decrease relatively.
  • the above method has a problem that the calculation cost may increase.
  • as a countermeasure, a method of ensembling a plurality of learners and taking a majority vote is known.
  • the amount of calculation may increase.
  • the purpose is to detect concept drift with a small amount of calculation.
  • the computer acquires the first operational data input to the model.
  • the computer calculates an index indicating the magnitude of the change in the output result of the model due to the time change of the tendency of the operation data based on the acquired first operation data.
  • the computer identifies changes in the index over time from the calculated index.
  • the computer determines whether a concept drift that the user needs to deal with has occurred based on a comparison of the identified change in the index with a preset threshold. When it is determined that such a concept drift has occurred, the computer displays information indicating that a concept drift that the user needs to deal with has occurred on the display screen.
  • FIG. 1 is a diagram for explaining concept drift.
  • FIG. 2 is a diagram showing a configuration example of the detection device of the first embodiment.
  • FIG. 3 is a diagram for explaining a processing flow of the detection device.
  • FIG. 4 is a diagram for explaining the parameters of the neural network.
  • FIG. 5 is a diagram for explaining the replica model.
  • FIG. 6 is a diagram for explaining a method of acquiring the distribution of the output with respect to the training data.
  • FIG. 7 is a diagram for explaining a method of acquiring the output distribution with respect to the operation data.
  • FIG. 8 is a diagram showing an example of changes in KLD.
  • FIG. 9 is a diagram showing an example of a change in distribution.
  • FIG. 10 is a diagram showing an example of a monthly display screen when concept drift does not occur.
  • FIG. 11 is a diagram showing an example of a monthly display screen when incremental drift occurs.
  • FIG. 12 is a diagram showing an example of a detailed display screen.
  • FIG. 13 is a diagram showing an example of a monthly display screen when a sudden drift occurs.
  • FIG. 14 is a diagram showing an example of a monthly display screen when recurring drift occurs.
  • FIG. 15 is a diagram showing an example of a daily display screen.
  • FIG. 16 is a flowchart showing a processing flow during learning of the detection device.
  • FIG. 17 is a flowchart showing a processing flow during operation of the detection device.
  • FIG. 18 is a diagram for explaining the parameters of the decision tree.
  • FIG. 19 is a diagram illustrating a hardware configuration example.
  • the detection device of the embodiment can be said to be an information processing device that executes the detection method and the display method.
  • in the description of concept drift below, a trained model that has been trained using training data with known labels is called an original model.
  • the model in this embodiment is assumed to be a classifier that estimates and classifies the class to which the data belongs based on the features.
  • the classifier calculates a score for each class for each data and classifies based on the score.
  • the score may be the probability that the data belongs to each class.
  • the classifier may classify two classes such as whether or not it is normal, or may classify three or more classes.
  • FIG. 1 is a diagram for explaining concept drift.
  • the left and right figures of FIG. 1 plot, on a plane whose horizontal and vertical axes are different feature quantities, the data points of each class at the time of learning and at the time of operation, together with the curve representing the original model.
  • the three marker shapes in FIG. 1 correspond to data belonging to class A, class B, and class C, respectively.
  • the original model can classify the data of each class with high accuracy at the time of learning.
  • at the time of operation, however, the distributions of class B and class C change significantly, and the classification accuracy of the original model decreases.
  • concept drift is a phenomenon in which the quality of data changes during model operation compared to during learning. In other words, the change in the output result of the model due to the time change of the tendency of the operational data becomes large. Then, as shown in FIG. 1, when concept drift occurs, the accuracy of the model may be relatively lowered.
  • for example, consider a model that classifies each stock into one of "buy", "sell", and "neutral" in stock trading.
  • the model trained with the data of 2018 shows high accuracy in the beginning of 2019, but the tendency of the data changes in the middle of 2019, and the accuracy may decrease.
  • as another example, a model can use conditions such as the temperature and brightness of the manufacturing environment and predetermined measured values obtained in each process as feature quantities to determine whether or not a product is defective. In this case, it is conceivable that changes in the environment or the like appear as concept drift and the determination accuracy of the model decreases.
  • the detection device of this embodiment can be applied to any model affected by the above concept drift.
  • the detection device of this embodiment detects concept drift using a plurality of replica models generated from the original model. Since the generation of the replica model does not require learning, the detection device can generate the replica model with a small amount of calculation.
  • the true class to which the data during operation belongs is unknown, but for the sake of explanation, the data points during operation in FIG. 1 are also drawn with the marker shapes corresponding to each class.
  • the detection device of this embodiment is intended to detect concept drift that the user needs to deal with even if the true class of data during operation is unknown.
  • when a concept drift is detected, measures may be taken such as automatic or manual retraining of the original model and exclusion of the inappropriate data that caused the concept drift.
  • FIG. 2 is a diagram showing a configuration example of the detection device of the first embodiment.
  • the detection device 10 includes a communication unit 11, an input unit 12, an output unit 13, a storage unit 14, and a control unit 15.
  • the communication unit 11 is an interface for communicating data with other devices.
  • the communication unit 11 is a NIC (Network Interface Card), and may be used to communicate data via the Internet.
  • the input unit 12 is an interface for receiving data input.
  • the input unit 12 may be an input device such as a keyboard or a mouse.
  • the output unit 13 is an interface for outputting data.
  • the output unit 13 may be an output device such as a display or a speaker. Further, the input unit 12 and the output unit 13 may input / output data to / from an external storage device such as a USB memory.
  • the storage unit 14 is an example of a storage device, such as a hard disk or a memory, that stores data, the programs executed by the control unit 15, and the like.
  • the storage unit 14 stores the original model information 141, the replica model information 142, and the distribution information 143.
  • the original model information 141 is information such as parameters for constructing the original model.
  • the original model information 141 is the weight and bias of each layer.
  • Replica model information 142 is information such as parameters for constructing a replica model.
  • the algorithms of the replica model and the original model are common. Therefore, for example, if the algorithm of the original model is a neural network, the algorithm of the replica model is also a neural network.
  • the replica model information 142 is the weight and bias of each layer of the neural network.
  • Distribution information 143 is information regarding the distribution of the output of the replica model at the time of learning.
  • the distribution information 143 is information that serves as a reference for evaluating the data during operation and detecting the concept drift that the user needs to deal with.
  • the distribution information 143 may be sufficient information for calculating the evaluation value of the operation data.
  • the distribution information 143 may be all or part of the output of the replica model, or may be a distribution parameter indicating the distribution of the output of the replica model. The method of acquiring the output distribution of the replica model will be described later.
  • the control unit 15 is realized by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a GPU (Graphics Processing Unit) executing a program stored in an internal storage device, using RAM as a work area. Further, the control unit 15 may be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the control unit 15 includes a learning unit 151, a generation unit 152, a calculation unit 153, a detection unit 154, and a display control unit 155.
  • FIG. 3 is a diagram for explaining a processing flow of the detection device.
  • the processing of the detection device 10 is divided into a learning phase and an operation phase.
  • the detection device 10 generates the original model information 141, the replica model information 142, and the distribution information 143.
  • the detection device 10 detects the concept drift that the user needs to deal with by using the original model information 141, the replica model information 142, and the distribution information 143.
  • first, the processing of the learning phase will be described. As shown in step S11 of FIG. 3, the learning unit 151 generates an original model based on the learning data.
  • the learning unit 151 can perform learning by a known method. For example, if the original model is a neural network, the learning unit 151 performs learning by the back-propagation method.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the trained original model.
  • a plurality of replica models generated by the generation unit 152 may be referred to as a replica model group.
  • the original model is an example of the first model.
  • the replica model is an example of the second model.
  • the generation unit 152 generates a plurality of neural networks as a replica model by changing the weights and biases in a predetermined layer of the trained original model neural network.
  • FIG. 4 is a diagram for explaining the parameters of the neural network.
  • for example, weights w_a1, w_a2, w_b1, w_b2, w_c1, w_c2, w_d1, w_d2 and biases b1 and b2 are set between the output layer and the hidden layer of the neural network of the original model.
  • the generation unit 152 generates a replica model by multiplying the weight and the bias by a random number.
  • ⁇ _i be a random number that follows a normal distribution with a mean of 1.0 and a standard deviation of 0.3.
  • for example, as the first replica model, the generation unit 152 generates a neural network in which the weights w'_a1, w'_b1, w'_c1, w'_d1, w'_a2, w'_b2, w'_c2, w'_d2 and the biases b'1 and b'2 are set between the output layer and the hidden layer.
  • the generation unit 152 generates each replica model so that its distance from the original model falls within a somewhat narrow range. For example, the generation unit 152 generates a replica model by multiplying a predetermined parameter by a random number that follows a normal distribution with a mean of 1.0 and a standard deviation of 0.3.
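As a concrete illustration, the random-factor perturbation described above can be sketched as follows. The function name and the toy parameter values are hypothetical, not from the patent; only the perturbation rule (multiply each parameter by a factor drawn from a normal distribution with mean 1.0 and standard deviation 0.3) follows the text.

```python
import random

def make_replicas(weights, biases, n_replicas, mean=1.0, stddev=0.3, seed=0):
    """Generate replica parameter sets by multiplying every weight and bias
    of the original model by an independent random factor ~ N(mean, stddev)."""
    rng = random.Random(seed)
    replicas = []
    for _ in range(n_replicas):
        r_weights = [w * rng.gauss(mean, stddev) for w in weights]
        r_biases = [b * rng.gauss(mean, stddev) for b in biases]
        replicas.append((r_weights, r_biases))
    return replicas

# Hypothetical output-layer parameters of a toy original model
orig_weights = [0.5, -1.2, 0.8, 0.3]   # playing the role of w_a1 ... w_d2
orig_biases = [0.1, -0.4]              # playing the role of b1, b2
replica_group = make_replicas(orig_weights, orig_biases, n_replicas=1000)
```

Because no retraining is involved, generating the replica group costs only one multiplication per parameter per replica, which is the "small amount of calculation" the embodiment aims for.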
  • the calculation unit 153 calculates the output for the data actually input to the model. As shown in step S12 of FIG. 3, the calculation unit 153 stores, as the distribution information 143 in the storage unit 14, information regarding the distribution of the output obtained by inputting the training data used for learning the original model into the replica models. Further, the calculation unit 153 may use, as the distribution information 143, information regarding a distribution that includes not only the output of the replica model group but also the output of the original model.
  • the learning unit 151 works with the calculation unit 153 to perform learning by optimizing the output of the model.
  • the generation unit 152 can generate the replica model without performing the calculation by the calculation unit 153.
  • FIG. 5 is a diagram for explaining the replica model.
  • the replica model group can be said to be a group of curves dispersed with a predetermined variation around the curve of the original model, or the region formed by that group of curves.
  • for most data points, the replica model group shows the same classification results as the original model.
  • each replica model may show different classification results for data points located near the curve of the original model.
  • the detection device 10 of the present embodiment detects the concept drift that the user needs to deal with by paying attention to the change of each output of the replica model group with the passage of time.
  • FIG. 6 is a diagram for explaining a method of acquiring the output distribution with respect to the training data.
  • the original model classifies the data into one of class A, class B, and class C.
  • the detection device 10 uses, for example, 70 learning data for each class for learning. Further, it is assumed that the detection device 10 generates, for example, 1,000 replica models.
  • the calculation unit 153 inputs the training data s1, s2, ..., s70, which are known to belong to class C, into each of the replica models r1, r2, ..., r1000.
  • the score is the probability that the data belongs to each class.
  • for example, (A: 0.0, B: 0.0, C: 1.0) indicates that the probability that the data belongs to class A is 0.0, the probability that it belongs to class B is 0.0, and the probability that it belongs to class C is 1.0.
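The score-collection procedure of FIG. 6 can be sketched as follows, using a toy linear softmax classifier in place of the patent's neural network. All parameter values and names here are illustrative assumptions; the pattern of running one sample through every replica and collecting the per-class scores is what the text describes.

```python
import math
import random

def softmax(logits):
    """Convert raw class logits into probabilities that sum to 1."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def predict(x, weight_rows, biases):
    """Toy linear classifier: one weight row and one bias per class."""
    logits = [sum(w * xi for w, xi in zip(row, x)) + b
              for row, b in zip(weight_rows, biases)]
    return softmax(logits)  # (score_A, score_B, score_C)

# Hypothetical original model for 3 classes over 2 features
orig_w = [[2.0, -1.0], [-1.0, 2.0], [0.5, 0.5]]
orig_b = [0.0, 0.0, 0.1]

# Build 1,000 replicas by perturbing every parameter by a factor ~ N(1.0, 0.3)
rng = random.Random(0)
replicas = []
for _ in range(1000):
    w = [[v * rng.gauss(1.0, 0.3) for v in row] for row in orig_w]
    b = [v * rng.gauss(1.0, 0.3) for v in orig_b]
    replicas.append((w, b))

# Distribution of class-C scores across all replicas for one training sample
sample = [1.0, 1.0]  # a sample assumed to belong to class C
class_c_scores = [predict(sample, w, b)[2] for w, b in replicas]
```

A histogram of `class_c_scores` corresponds to one of the per-class distributions stored as the distribution information 143.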
  • the calculation unit 153 inputs the operation data to the original model and calculates the output.
  • the output is the score for each class to be classified.
  • the output unit 13 can output the score for each class or the information for identifying the class having the highest score as the classification result.
  • the learning data and the operation data may be time-series data.
  • the training data can be said to be the data at the time of training of the original model among the time series data.
  • the operational data can be said to be the data of a predetermined time after the learning time among the time series data.
  • the detection unit 154 detects the difference between the distribution of the output obtained by inputting the training data into the replica models and the distribution of the output obtained by inputting the operational data into the replica models.
  • the operational data is an example of predetermined data different from the learning data.
  • the calculation unit 153 inputs operational data to each replica model and calculates the output.
  • the detection unit 154 acquires the distribution of the output calculated by the calculation unit 153. Further, the detection unit 154 acquires the distribution of the output obtained by inputting the training data into the replica model from the distribution information 143.
  • FIG. 7 is a diagram for explaining a method of acquiring the output distribution with respect to the operation data.
  • for example, the calculation unit 153 inputs the operation data u1, which the original model classifies into class C, into each of the replica models r1, r2, ..., r1000, and calculates the score of each class.
  • the calculation unit 153 can calculate the score by real-time processing as shown in FIG. 7.
  • the calculation unit 153 may collectively calculate the score by batch processing a plurality of operational data.
  • the detection unit 154 detects the difference based on the KLD (Kullback-Leibler divergence) between the score distribution obtained by inputting the training data into the replica models and the score distribution obtained by inputting the operational data into the replica models.
  • KLD is an example of an index showing the magnitude of the difference between distributions.
  • the detection unit 154 calculates the KLD between the probability distribution P(i) of the scores for the training data and the probability distribution Q(i) of the scores for the operational data, as in equation (1): D_KL(P || Q) = Σ_i P(i) log(P(i) / Q(i)).
  • i is an identifier indicating the classification.
  • the detection unit 154 calculates the sum of D_KL for class A, class B, and class C, as in equation (2). Then, when the D_KL calculated for each class, or their sum, is equal to or greater than the threshold, the detection unit 154 detects a concept drift that the user needs to deal with.
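Since equations (1) and (2) themselves are not reproduced in this text, the sketch below uses the standard Kullback-Leibler divergence formula and made-up histogram values to illustrate the per-class calculation and the threshold test; the threshold value is hypothetical.

```python
import math

def kld(p, q, eps=1e-12):
    """D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)); eps guards log(0)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical normalized score histograms (3 bins per class):
# at training time (P) and during operation (Q).
train_hist = {"A": [0.70, 0.20, 0.10],
              "B": [0.60, 0.30, 0.10],
              "C": [0.80, 0.15, 0.05]}
op_hist = {"A": [0.65, 0.25, 0.10],
           "B": [0.20, 0.30, 0.50],
           "C": [0.10, 0.20, 0.70]}

# Equation (2): sum the per-class divergences, then compare to the threshold
total_kld = sum(kld(train_hist[c], op_hist[c]) for c in ("A", "B", "C"))
threshold = 1.0  # hypothetical; set from the model accuracy the user tolerates
drift_detected = total_kld >= threshold
```

In this made-up example, classes B and C drift strongly while class A barely moves, so the summed KLD crosses the threshold and a drift requiring user action is flagged.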
  • FIG. 8 is a diagram showing an example of changes in KLD.
  • the "sample ID” is an ID for identifying individual data included in the operational data.
  • “correct answer” is the true correct answer class of operational data.
  • "prediction” is a classification result based on the original model.
  • the classification result and the correct answer match, and all KLDs have a small value of 0.5 or less.
  • the classification result does not match the correct answer and the KLD takes a large value of 30 or more.
  • the KLD tends to increase when the classification result and the correct answer do not match, but the KLD may also increase even when the classification result matches the correct answer.
  • FIG. 9 is a diagram showing an example of a change in distribution.
  • t represents a fixed time such as a day, a week, or a month.
  • each histogram in FIG. 9 is a plot of the scores output by each replica model with respect to the data classified into the class C by the original model.
  • the display control unit 155 can display the distribution comparison result by the detection unit 154 and the concept drift detection result on the screen via the output unit 13.
  • FIG. 10 is a diagram showing an example of a monthly display screen when concept drift does not occur. As shown in FIG. 10, the display control unit 155 displays the monthly change in KLD as a line graph on the monthly display screen. Further, the display control unit 155 displays the screen transition menu when the data point is clicked. The screen transition menu includes items "display details" and "display on a daily basis". The transition destination screen will be described later.
  • Display example of display device screen: a display example of the display screen of the display device will now be described with reference to FIGS. 11 to 15. Information indicating that a concept drift that the user needs to deal with has occurred is displayed on the screen of the display device.
  • there are several types of concept drift, such as incremental drift, sudden drift, and recurring drift.
  • Incremental drift is a phenomenon in which the accuracy of the model gradually decreases due to concept drift.
  • Sudden drift is a phenomenon in which the accuracy of a model drops sharply due to concept drift.
  • Recurring drift is a phenomenon in which the accuracy of a model decreases only at a specific timing on a periodic time axis due to concept drift.
  • the accuracy of the model decreases relatively as the tendency of the data changes, but the model accuracy that is tolerable varies from user to user. For example, consider the case where the model classifies each stock into one of "buy", "sell", and "neutral" in stock trading. In this case, since the accuracy of the model must be kept high, re-learning with data of the new tendency must be performed immediately when the accuracy deteriorates even a little, and the currently operating model must then be updated to the retrained model. On the other hand, consider the case of determining whether the indoor environment is hot or not. In this case, a slight decrease in accuracy may be tolerable.
  • the timing for re-learning the model can be determined according to the accuracy of the model that the user can tolerate.
  • FIG. 11 is a diagram showing an example of a monthly display screen when incremental drift occurs.
  • the generation unit 152 acquires the operation data input to the trained model. Then, the generation unit 152 calculates an index indicating the magnitude of the change in the output result of the model due to the time change of the tendency of the operation data based on the acquired operation data.
  • the generation unit 152 determines whether or not a concept drift that the user needs to deal with has occurred based on the comparison between the calculated index and the preset threshold value. When it is determined that the concept drift that the user needs to deal with has occurred, the generation unit 152 causes the display screen to display information indicating that the concept drift that the user needs to deal with has occurred.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying data into one of a plurality of classes.
  • the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the index indicating the magnitude of the difference between the distribution of scores obtained by inputting the training data into the replica models and the distribution of outputs obtained by inputting the operational data into the replica models is equal to or greater than the threshold value.
  • the detection unit 154 acquires the distribution of scores obtained by inputting the training data used at the time of learning of the original model into the replica models, and the distribution of scores obtained by inputting, into the replica models, the data of each of a plurality of times after the learning time among the time-series data. Then, the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the index indicating the magnitude of the difference between the distributions is equal to or greater than the threshold value. In addition, the display control unit 155 displays information on the index of each of the plurality of times on the screen, and displays the information regarding the index of the time at which a concept drift that the user needs to deal with was detected in a manner different from the information regarding the index of the times at which no such concept drift was detected.
  • the detection unit 154 sets a threshold value for an index indicating the magnitude of the difference in each distribution.
  • the threshold can be set according to the accuracy of the model that the user can tolerate.
  • the detection unit 154 receives a value related to the accuracy of the model that can be tolerated by the user. Then, the detection unit 154 sets a value corresponding to the received value as a threshold value.
  • the threshold value can be set by using the information on the accuracy of the model required in the industry.
  • for example, the detection unit 154 identifies the model accuracy required in the user's industry from the user's industry information. Then, the detection unit 154 determines the threshold value from the identified accuracy. For example, for a model that detects defective products from product images, even one defective product must not be overlooked, so a threshold corresponding to high accuracy needs to be set.
  • the display control unit 155 displays each index of a plurality of times as a line graph.
  • the display control unit 155 displays the point corresponding to the time at which a concept drift that the user needs to deal with was detected so that at least one of its color, shape, and size differs from the points corresponding to the times at which no such concept drift was detected.
  • the display control unit 155 displays information indicating the concept drift that has occurred on the display screen, displaying a concept drift that the user needs to deal with and a concept drift that the user does not need to deal with in different modes.
  • the generation unit 152 displays each index of a plurality of times on the display screen as a line graph.
  • the generation unit 152 displays the point corresponding to the time at which a concept drift that the user needs to deal with was detected so that at least one of its color, shape, and size differs from the points corresponding to the times at which no such concept drift was detected.
  • the display control unit 155 displays the data points before October 2018, which have not detected the concept drift that the user needs to deal with, with white circles.
  • the display control unit 155 displays the data point of November 2018, at which a concept drift that the user needs to deal with was detected, with a black circle surrounded by a larger circle.
  • the display control unit 155 displays a message that incremental drift has occurred in the vicinity of the data point in November 2018 when the concept drift that the user needs to deal with is detected.
  • the display control unit 155 may display a message by a tooltip.
  • the display control unit 155 displays the detailed display screen when "display details" is selected from the transition menu.
  • FIG. 12 is a diagram showing an example of a detailed display screen.
  • on the detailed display screen, the display control unit 155 displays a histogram of the output distribution obtained by inputting the training data into the replica models and a histogram of the output distribution obtained by inputting the data at the specified time into the replica models.
  • each histogram may be similar to those described with reference to FIG. 9. Further, the display control unit 155 may display the value of each variable of the input data corresponding to the selected data point.
  • when the learning unit 151 detects that a concept drift that the user needs to deal with has occurred, it displays information indicating this on the display screen of the display device. At this time, the learning unit 151 asks the user whether or not to execute re-learning. When the user accepts, the learning unit 151 extracts from the time-series data the latest operational data as of the time at which the concept drift occurred. Specifically, the learning unit 151 extracts the operational data from a preset period before the time at which the concept drift that the user needs to deal with occurred. The learning unit 151 then executes re-learning using that operational data and generates a new model.
  • the learning unit 151 executes re-learning using a neural network.
  • the learning unit 151 can perform re-learning using the input data displayed on the detailed display screen as learning data.
  • the label of the learning data in the re-learning may be manually assigned, or may be automatically assigned based on the output of the original model and the replica model.
  • the learning unit 151 inputs the extracted operational data from the input layer and propagates it forward through the neural network. The classification result obtained from the output layer is then compared with the correct answer (positive example / negative example) to obtain the error. The error back-propagation method propagates the error between the classification result and the correct answer through the neural network in the direction opposite to that at the time of classification, and updates the parameters of each layer of the neural network to approach the optimum solution.
  • although a neural network has been described as the learning model, the present invention is not limited to this, and other machine learning models such as a logistic regression model or a support vector machine can also be adopted.
  • the learning unit 151 changes the current original model to a newly generated model.
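A minimal sketch of this retrain-and-swap flow follows, using a toy one-feature logistic regression in place of the patent's neural network (the text allows logistic regression as an alternative model). The window length, the data, and all function names are assumptions for illustration.

```python
import math

def train_logistic(data, epochs=200, lr=0.5):
    """Fit y = sigmoid(w*x + b) by stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w -= lr * (p - y) * x   # gradient of the log loss w.r.t. w
            b -= lr * (p - y)       # gradient of the log loss w.r.t. b
    return w, b

def retrain_on_recent(time_series, drift_time, window):
    """Extract the labelled operational data from the `window` points
    before the detected drift time and retrain a new model on them."""
    recent = time_series[max(0, drift_time - window):drift_time]
    return train_logistic(recent)

# Hypothetical labelled (x, y) operational stream; drift detected at index 8
stream = [(0.0, 0), (0.2, 0), (0.4, 0), (2.0, 1), (2.2, 1),
          (2.4, 1), (0.1, 0), (2.1, 1), (0.3, 0)]
current_model = (0.0, 0.0)                                   # stale original
current_model = retrain_on_recent(stream, drift_time=8, window=6)  # swap in
```

The final assignment corresponds to the step in which the current original model is replaced by the newly generated model.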
  • FIG. 13 is a diagram showing an example of a monthly display screen when a sudden drift occurs.
  • the generation unit 152 acquires the operation data input to the trained model. Then, based on the acquired operation data, the generation unit 152 calculates an index indicating the magnitude of the change in the output result of the model due to the time change of the tendency of the operation data, and identifies the change in the index in time series from the calculated index. Further, the generation unit 152 determines whether or not a concept drift that the user needs to deal with has occurred based on a comparison of the identified change in the index with a preset threshold value. When it is determined that such a concept drift has occurred, the generation unit 152 causes the display screen to display information indicating that a concept drift that the user needs to deal with has occurred.
  • the display control unit 155 displays information indicating the concept drift that has occurred on the display screen, displaying a concept drift that the user needs to deal with and a concept drift that the user does not need to deal with in different modes.
  • the generation unit 152 displays each index of a plurality of times on the display screen as a line graph.
  • the generation unit 152 displays the point corresponding to the time at which a concept drift that the user needs to deal with was detected so that at least one of its color, shape, and size differs from the points corresponding to the times at which no such concept drift was detected.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying data into one of a plurality of classes. The detection unit 154 then acquires the distribution of scores obtained by inputting into the replica models the training data used at the time of learning of the original model, and the distribution of scores obtained by inputting into the replica models the data of each of a plurality of times after the learning time, among the time-series data. Further, the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the time-series change in the index indicating the magnitude of the difference between these distributions satisfies a predetermined condition.
  • the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the amount of change in the index over the time series is equal to or greater than the threshold value. As shown in FIG. 13, since the KLD increased sharply in November 2018, the detection unit 154 detects that a sudden drift occurred in November 2018. In this case, it is assumed that the KLD itself remains at or below 10, the threshold for incremental drift, but that the amount of change between October 2018 and November 2018 exceeds the threshold for sudden drift.
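The change-amount check for sudden drift can be sketched as follows (a minimal illustration; the function name and the sample KLD values are ours, not from the patent):

```python
def sudden_drift_points(klds, change_threshold):
    """Indices of the time steps whose step-to-step increase in the KLD
    is at or above the sudden-drift threshold."""
    return [i + 1 for i in range(len(klds) - 1)
            if klds[i + 1] - klds[i] >= change_threshold]

# Monthly KLDs that stay below an incremental-drift threshold of 10 but
# jump sharply between the 3rd and 4th months.
points = sudden_drift_points([1.0, 1.5, 1.2, 9.8], change_threshold=5.0)  # → [3]
```

Only the jump between the third and fourth values is flagged, mirroring the October-to-November jump of FIG. 13.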
  • the display control unit 155 displays a message that a sudden drift has occurred near the data point for November 2018, at which the concept drift that the user needs to deal with was detected. On the other hand, the display control unit 155 displays the data points up to October 2018, at which no such concept drift was detected, as white circles.
  • when the learning unit 151 detects that a concept drift that the user needs to deal with has occurred, it displays information indicating this on the display screen of the display device. At this time, the learning unit 151 asks the user whether or not to execute re-learning. When the user accepts, the learning unit 151 extracts from the time-series data the most recent operational data as of the time when the concept drift occurred. Specifically, the learning unit 151 extracts the operational data within a preset period ending at the time when the concept drift that the user needs to deal with occurred, and re-learns using that operational data.
  • the KLD on the monthly display screens described so far may be, for example, the total value of the daily KLDs.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying the data into one of a plurality of classes.
  • the detection unit 154 acquires the distribution of the score obtained by inputting into the replica models the training data used at the time of learning of the original model, and the distribution of the score obtained by inputting into the replica models the data of each of a plurality of times after the learning time, among the time-series data.
  • the display control unit 155 displays on the screen the total value of the indexes indicating the magnitude of the difference of each distribution acquired by the detection unit 154 at regular intervals.
  • the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the index or the total value is equal to or greater than the corresponding threshold value. That is, the detection unit 154 detects the occurrence of a concept drift that the user needs to deal with based on the daily KLD and/or the monthly KLD. In this case, for example, even if the total KLD of a certain month does not exceed the first threshold value, the detection unit 154 can detect that a concept drift that the user needs to deal with has occurred if the KLD of any single day in that month is equal to or greater than the second threshold value.
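The two-level check just described can be sketched as follows (the function name and the threshold values are illustrative assumptions, not from the patent):

```python
def drift_needs_handling(daily_klds, monthly_threshold, daily_threshold):
    """True when the monthly total KLD reaches the first threshold, or
    any single day's KLD reaches the second threshold."""
    if sum(daily_klds) >= monthly_threshold:
        return True
    return any(k >= daily_threshold for k in daily_klds)

# A month whose total stays under the monthly threshold is still flagged
# because of one bad day.
flagged = drift_needs_handling([0.1] * 29 + [0.6],
                               monthly_threshold=10.0,
                               daily_threshold=0.5)  # → True
```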
  • When the display control unit 155 detects a concept drift that the user needs to deal with, it displays information on the concept drift on the screen. For example, even when the concept drift that the user needs to deal with is detected from the daily KLD, the display control unit 155 may display a message to the effect that the concept drift has occurred on the monthly display screen.
  • the display control unit 155 switches between a screen for displaying information on a plurality of indicators for each time and a screen for displaying information on the total value for each fixed period. For example, the display control unit 155 displays the daily display screen when "display by day" is selected from the monthly display transition menu.
  • FIG. 14 is a diagram showing an example of a monthly display screen when recurring drift occurs.
  • FIG. 15 is a diagram showing an example of a daily display screen.
  • the generation unit 152 acquires the operational data input to the trained model. Based on the acquired operational data, the generation unit 152 calculates a first index indicating the magnitude of the change in the output result of the model caused by a temporal change in the tendency of the operational data. Further, the generation unit 152 calculates a second index by dividing the calculated first index by the periodic unit time of the operational data, and determines whether a concept drift that the user needs to deal with has occurred based on a comparison between the calculated second index and a preset threshold value. When it determines that such a concept drift has occurred, the generation unit 152 displays information indicating that a concept drift that the user needs to deal with has occurred on the display screen.
  • the display control unit 155 displays information indicating the detected concept drift on the display screen, showing concept drifts that the user needs to deal with and concept drifts that the user does not need to deal with in different modes.
  • the generation unit 152 displays the indices for a plurality of times on the display screen as a line graph.
  • the generation unit 152 displays the point corresponding to a time at which a concept drift that the user needs to deal with was detected so that at least one of its color, shape, and size differs from the points corresponding to times at which no such drift was detected.
  • the display control unit 155 displays a message that a recurring drift has occurred near the data point for November 2018, at which the concept drift that the user needs to deal with was detected. On the other hand, the display control unit 155 displays the data points up to October 2018, at which no such concept drift was detected, as white circles.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying data into one of a plurality of classes. The detection unit 154 then acquires the distribution of scores obtained by inputting into the replica models the training data used at the time of learning of the original model, and the distribution of scores obtained by inputting into the replica models the data of each of a plurality of times after the learning time, among the time-series data. Further, the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the index indicating the magnitude of the difference between these distributions satisfies a predetermined condition at a constant cycle. For example, the detection unit 154 detects that such a concept drift has occurred when the index exceeds the threshold value at regular intervals.
  • the display control unit 155 displays the KLD together with the date and the day of the week on the daily display screen.
  • the KLD for 11/6 (Wednesday), 11/13 (Wednesday), 11/20 (Wednesday), and 11/27 (Wednesday) is equal to or greater than the threshold value of 0.4. From this, the detection unit 154 detects that a concept drift that the user needs to deal with occurs periodically, on Wednesday of every week.
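The periodic-exceedance check can be sketched as follows (a minimal illustration; the function name, period handling, and sample values are ours, not from the patent):

```python
def recurring_offsets(daily_klds, threshold, period=7):
    """Offsets within the period (e.g. weekday positions) whose KLD is
    at or above the threshold on every observed cycle."""
    hits = []
    for offset in range(period):
        cycle = daily_klds[offset::period]
        if cycle and all(k >= threshold for k in cycle):
            hits.append(offset)
    return hits

# Four weeks of daily KLDs in which only one weekday repeatedly exceeds 0.4.
klds = [0.1] * 28
for wednesday in (2, 9, 16, 23):
    klds[wednesday] = 0.5
offsets = recurring_offsets(klds, threshold=0.4)  # → [2]
```

Only the offset corresponding to the recurring weekday is reported, matching the "every Wednesday" pattern of FIG. 15.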
  • having detected the concept drift that the user needs to deal with, the display control unit 155 displays a message near the data points of 11/6 (Wednesday), 11/13 (Wednesday), 11/20 (Wednesday), and 11/27 (Wednesday).
  • the learning unit 151 learns a new model using the time-series data that satisfies the predetermined condition. For example, in the example of FIG. 15, the learning unit 151 may learn a new model for Wednesdays using only the data from each Wednesday. That is, the learning unit 151 learns from the Wednesday operational data of November 2018, which the user needs to deal with.
  • when the learning unit 151 detects that a concept drift that the user needs to deal with has occurred, it displays information indicating this on the display screen of the display device. At this time, the learning unit 151 asks the user whether or not to execute re-learning. When the user accepts, the learning unit 151 extracts from the time-series data the most recent operational data as of the time when the concept drift occurred. Specifically, the learning unit 151 extracts the operational data belonging to the same time unit within a preset period before the time when the concept drift that the user needs to deal with occurred, and re-learns using that operational data.
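The same-time-unit extraction can be sketched as follows (the function name, the window length, and the weekday-style unit are illustrative assumptions, not from the patent):

```python
def recent_same_unit(series, drift_time, window, unit=7):
    """series: list of (time, record) pairs. Keep the records that fall
    within the preset window before the drift time and belong to the same
    periodic unit (e.g. the same weekday) as the drift time."""
    return [rec for t, rec in series
            if drift_time - window < t <= drift_time
            and t % unit == drift_time % unit]

# 30 days of operational records; drift detected on day 30, so re-learn
# from the matching weekday of the previous four weeks.
series = [(t, f"day{t}") for t in range(1, 31)]
selected = recent_same_unit(series, drift_time=30, window=28)
```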
  • FIG. 16 is a flowchart showing a processing flow during learning of the detection device.
  • the detection device 10 generates an original model by learning the training data (step S101).
  • the detection device 10 generates a replica model group in which predetermined parameters of the original model are changed (step S102).
  • the detection device 10 saves information regarding the distribution of the output when the training data is input to the replica model group (step S103).
  • FIG. 17 is a flowchart showing a processing flow during operation of the detection device.
  • the detection device 10 calculates the output when the operation data is input to the original model (step S201).
  • the detection device 10 can obtain an inference result based on the output calculated in step S201.
  • the detection device 10 acquires the distribution of the output when the operation data is input to the replica model group (step S202). Then, the detection device 10 calculates an index for evaluating the difference between the distribution acquired at the time of operation and the distribution acquired at the time of learning (step S203).
  • the distribution acquired during operation is the distribution acquired by the detection device 10 in step S202.
  • the distribution acquired at the time of learning is the one acquired by the detection device 10 in step S103 of FIG. 16. Also, for example, the index is the KLD.
  • the detection device 10 determines whether or not the index satisfies the condition (step S204). When the index satisfies the condition (step S204, Yes), the detection device 10 detects the concept drift and outputs an alert (step S205). The alert may be, for example, a message such as those shown on the display screens described above. If the index does not satisfy the condition (step S204, No), the detection device 10 outputs an inference result based on the output of the original model (step S206).
  • the conditions here are the conditions for detecting each type of concept drift described above.
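The learning-time and operation-time flows of FIGS. 16 and 17 can be sketched end-to-end with a toy threshold model (everything here — the model, the replica count, the vote-based distribution, and the KLD threshold — is illustrative, not the patent's implementation):

```python
import math
import random

# Toy "original model": classify a sample against a learned threshold;
# each replica uses a randomly perturbed copy of that threshold.
def make_model(thr):
    return lambda x: 1 if x >= thr else 0

rng = random.Random(0)
original = make_model(25.0)
replicas = [make_model(25.0 * rng.gauss(1.0, 0.3)) for _ in range(50)]

def replica_distribution(models, x):
    """Fraction of replicas voting each class for input x."""
    votes = [m(x) for m in models]
    p1 = sum(votes) / len(votes)
    return [1.0 - p1, p1]

def kld(p, q, eps=1e-9):
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def operate(x, saved_dist, kld_threshold=1.0):
    output = original(x)                         # step S201
    op_dist = replica_distribution(replicas, x)  # step S202
    index = kld(saved_dist, op_dist)             # step S203
    if index >= kld_threshold:                   # step S204: Yes
        return ("alert", index)                  # step S205
    return ("inference", output)                 # step S206
```

At learning time (step S103 of FIG. 16), `saved_dist` would be the replica distribution recorded on the training data; `operate` then compares it with the distribution observed on each operational input.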
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the trained original model.
  • the detection unit 154 detects the difference between the output distribution obtained by inputting into the replica models the training data used for training the original model and the output distribution obtained by inputting into the replica models predetermined data different from the training data. In this way, if the original model has already been generated, the detection device 10 can generate the replica models without additional learning and detect concept drift. As a result, the detection device 10 can detect concept drift with a small amount of calculation.
  • the detection unit 154 acquires the distribution of the output obtained by inputting into the replica models the learning data used for learning the original model, that is, the data at the learning time among the time-series data.
  • the detection unit 154 acquires the distribution of the output obtained by inputting into the replica models the data at a predetermined time after the learning time, among the time-series data.
  • the detection unit 154 detects the difference between these distributions. In this way, the detection device 10 can evaluate the difference between the output distributions of the time-series data at the time of learning and at the time of operation. As a result, the detection device 10 can detect concept drift in time-series data with a small amount of calculation.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying data into one of a plurality of classes.
  • the detection unit 154 detects the difference based on the KLD (Kullback-Leibler divergence) between the distribution of the score obtained by inputting the training data into the replica models and the distribution of the output obtained by inputting predetermined data different from the training data into the replica models. In this way, the detection device 10 can easily evaluate the difference between distributions by using the KLD.
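As a concrete sketch of how the KLD between two score distributions might be computed (the histogram binning, the epsilon smoothing, and the sample scores are our assumptions, not specified by the patent):

```python
import math

def kld(p, q, eps=1e-9):
    """Kullback-Leibler divergence KL(p || q) between two discrete
    distributions given as equal-length probability lists."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def score_histogram(scores, bins=10):
    """Normalized histogram of classification scores in [0, 1]."""
    counts = [0] * bins
    for s in scores:
        counts[min(int(s * bins), bins - 1)] += 1
    return [c / len(scores) for c in counts]

# Training-time scores cluster near 0 and 1; operation-time scores drift
# toward the middle, which drives the KLD up.
train_scores = [0.05, 0.1, 0.85, 0.9, 0.95]
op_scores = [0.45, 0.5, 0.55, 0.5, 0.48]
index = kld(score_histogram(train_scores), score_histogram(op_scores))
```

Identical distributions give a KLD of zero, and the divergence grows as the operation-time distribution departs from the training-time one.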
  • the generation unit 152 generates a replica model by multiplying a predetermined parameter by a random number according to a normal distribution having an average of 1.0 and a standard deviation of 0.3. In this way, the detection device 10 can prevent the gap between the original model and the replica model from becoming too large. As a result, the detection device 10 can reduce erroneous detection of concept drift.
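The parameter perturbation just described can be sketched as follows (a minimal illustration; the parameter values, replica count, and function name are ours):

```python
import random

def make_replica_params(params, rng):
    """One replica: each original parameter times an independent factor
    drawn from a normal distribution with mean 1.0 and s.d. 0.3."""
    return [p * rng.gauss(1.0, 0.3) for p in params]

rng = random.Random(0)
original_params = [0.8, -1.2, 0.05, 2.4]
replicas = [make_replica_params(original_params, rng) for _ in range(30)]
```

Because the multiplicative factors stay close to 1.0, each replica remains a small perturbation of the original model rather than a different model altogether.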
  • the generation unit 152 generates a plurality of neural networks as replica models by changing the weights and biases in a predetermined layer of the neural network that is the trained original model. As a result, the detection device 10 can generate replica models when the original model is a neural network.
  • the generation unit 152 generates a plurality of SVMs as replica models by changing the weights and biases of each feature amount in the identification function of the SVMs that are the trained original models. As a result, the detection device 10 can generate a replica model when the original model is an SVM.
  • the generation unit 152 generates a plurality of logistic regression models as replica models by changing the weights and biases of each feature amount in the determination function of the logistic regression model which is the trained original model. As a result, the detection device 10 can generate a replica model when the original model is a logistic regression model.
  • the generation unit 152 generates a plurality of decision trees as a replica model by changing the threshold value of the information gain of the decision tree which is the learned original model. As a result, the detection device 10 can generate a replica model when the original model is a decision tree.
  • when the detection unit 154 detects a difference, the learning unit 151 re-learns the original model using the predetermined data. As a result, the detection device 10 can adapt the original model to the concept drift.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying data into one of a plurality of classes.
  • the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the index indicating the magnitude of the difference between the score distribution obtained by inputting the training data into the replica models and the output distribution obtained by inputting the operational data into the replica models is equal to or greater than the threshold value. As a result, the detection device 10 can detect incremental drift with a small amount of calculation.
  • the detection unit 154 acquires the distribution of scores obtained by inputting into the replica models the training data used at the time of training of the original model, and the distribution of scores obtained by inputting into the replica models the data of each of a plurality of times after training, among the time-series data.
  • the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the index indicating the magnitude of the difference in each distribution is equal to or greater than the threshold value.
  • the display control unit 155 displays information on each index of a plurality of times on the screen.
  • the display control unit 155 further displays the information on the index for a time at which a concept drift that the user needs to deal with was detected in a manner different from the information on the indices for times at which no such drift was detected. As a result, the detection device 10 can detect incremental drift in the time-series data and present the detection result in an intuitive manner.
  • the display control unit 155 displays each index of a plurality of times as a line graph.
  • the display control unit 155 displays the point corresponding to a time at which a concept drift that the user needs to deal with was detected so that at least one of its color, shape, and size differs from the points corresponding to times at which no such drift was detected.
  • the detection device 10 can detect the incremental drift of the time series data and present the detection result in an intuitive manner.
  • the display control unit 155 displays on the screen a histogram of the output distribution obtained by inputting the training data into the replica models and a histogram of the output distribution obtained by inputting the data at the specified time into the replica models. As a result, the detection device 10 can present the occurrence of concept drift from various aspects and promote the user's understanding.
  • when it is detected that a concept drift that the user needs to deal with has occurred, the learning unit 151 re-learns the original model using the most recent data in the time-series data as of the time when the concept drift was detected. As a result, the detection device 10 can adapt the original model to the incremental drift.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying data into one of a plurality of classes.
  • the detection unit 154 acquires the distribution of scores obtained by inputting into the replica models the training data used at the time of training of the original model, and the distribution of scores obtained by inputting into the replica models the data of each of a plurality of times after training, among the time-series data.
  • the detection unit 154 detects that a concept drift that the user needs to deal with occurs when the time-series change of the index indicating the magnitude of the difference of each distribution satisfies a predetermined condition. As a result, the detection device 10 can detect the sudden drift of the time series data with a small amount of calculation.
  • the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the amount of change in the index in time series is equal to or greater than the threshold value. As a result, the detection device 10 can detect the sudden drift of the time series data.
  • when it is detected that a concept drift that the user needs to deal with has occurred, the learning unit 151 re-learns the original model using the most recent data in the time-series data as of the time when the concept drift was detected. As a result, the detection device 10 can adapt the original model to the sudden drift.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying data into one of a plurality of classes.
  • the detection unit 154 acquires the distribution of scores obtained by inputting into the replica models the training data used at the time of training of the original model, and the distribution of scores obtained by inputting into the replica models the data of each of a plurality of times after training, among the time-series data.
  • the display control unit 155 displays on the screen the total value of the indexes indicating the magnitude of the difference of each distribution acquired by the detection unit 154 at regular intervals. As a result, the detection device 10 can detect that a concept drift that the user needs to deal with in a predetermined period has occurred with a small amount of calculation.
  • the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the index or the total value is equal to or greater than the threshold value.
  • when the display control unit 155 detects a concept drift that the user needs to deal with, it displays information on the concept drift on the screen.
  • the detection device 10 can detect the occurrence of concept drift that the user needs to address in both a narrow time range and a wide time range.
  • the display control unit 155 switches between a screen for displaying information on a plurality of indicators for each time and a screen for displaying information on the total value for each fixed period.
  • the detection device 10 can present information about concept drift in both a narrow time range and a wide time range.
  • when it is detected that a concept drift that the user needs to deal with has occurred, the learning unit 151 re-learns the original model using the most recent data in the time-series data as of the time when the concept drift was detected. As a result, the detection device 10 can adapt the original model to the concept drift.
  • the generation unit 152 generates a plurality of replica models by changing predetermined parameters of the original model, which is a model for calculating a score for classifying data into one of a plurality of classes.
  • the detection unit 154 acquires the distribution of scores obtained by inputting into the replica models the training data used at the time of training of the original model, and the distribution of scores obtained by inputting into the replica models the data of each of a plurality of times after training, among the time-series data.
  • the detection unit 154 detects that a concept drift that the user needs to deal with occurs when the index indicating the magnitude of the difference in each distribution satisfies a predetermined condition at a constant cycle. As a result, the detection device 10 can detect the recurring drift of the time series data with a small amount of calculation.
  • the detection unit 154 detects that a concept drift that the user needs to deal with has occurred when the index exceeds the threshold value at regular intervals. As a result, the detection device 10 can detect the recurring drift of the time series data.
  • when the learning unit 151 detects that a concept drift that the user needs to deal with has occurred, it learns a new model using the time-series data that satisfies the predetermined condition. As a result, the detection device 10 can adapt the new model to the recurring drift.
  • the detection device 10 can detect concept drift even when the algorithm of each model is an algorithm other than the neural network.
  • the algorithm may be SVM (Support Vector Machine).
  • the generation unit 152 generates a plurality of SVMs as replica models by changing the weights and biases for each feature amount in the discriminant function of the SVM that is the trained original model. Equation (3) is the discriminant function of the SVM.
  • the generation unit 152 generates a replica model by multiplying the weights a_i and the bias b of the discriminant function by random numbers. As in the neural network example described above, the generation unit 152 can multiply each parameter by a random number that follows a normal distribution with a mean of 1.0 and a standard deviation of 0.3.
  • the generation unit 152 can generate a plurality of logistic regression models as replica models by changing the weights and biases for each feature amount in the decision function of the logistic regression model that is the trained original model. Equation (4) is the decision function of logistic regression.
  • the generation unit 152 generates a replica model by multiplying the weights w_i and the bias b of the decision function by random numbers. As in the neural network example described above, the generation unit 152 can multiply each parameter by a random number that follows a normal distribution with a mean of 1.0 and a standard deviation of 0.3.
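A sketch of the logistic regression case (the decision function is the standard sigmoid of w·x + b; the weights, input, and replica count are illustrative, and the same perturbation pattern would apply to the SVM discriminant function):

```python
import math
import random

def logistic_score(weights, bias, x):
    """Decision function of logistic regression: sigmoid(w . x + b)."""
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def replica_scores(weights, bias, x, n_replicas=30, seed=0):
    """Scores of replicas whose weights w_i and bias b are each
    multiplied by a random factor from N(1.0, 0.3)."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_replicas):
        w2 = [w * rng.gauss(1.0, 0.3) for w in weights]
        b2 = bias * rng.gauss(1.0, 0.3)
        scores.append(logistic_score(w2, b2, x))
    return scores

scores = replica_scores([0.7, -0.4], 0.1, x=[1.5, 2.0])
```

The spread of the replica scores around the original score is what the detection unit summarizes as a distribution.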
  • the generation unit 152 can generate a plurality of decision trees as a replica model by changing the threshold value of the information gain of the decision tree which is the trained original model.
  • the generation unit 152 generates a replica model by multiplying the threshold value of the information gain of each question by a random number. Similar to the neural network example described above, the generator 152 can multiply each parameter by a random number that follows a normal distribution with a mean of 1.0 and a standard deviation of 0.3.
  • FIG. 18 is a diagram for explaining the parameters of the decision tree.
  • the threshold of the information gain for the question "Temperature 25 °C or higher?" is 25.
  • the generation unit 152 can generate, as replica models, decision trees with questions such as "Temperature 25.1 °C or higher?" and "Temperature 24.5 °C or higher?" by using values such as 25.1 and 24.5 obtained by multiplying 25 by random numbers.
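The decision-tree case can be sketched like this (a one-question tree; the class labels, temperatures, and replica count are illustrative):

```python
import random

def tree_classify(temp, threshold=25.0):
    """One-question decision tree: 'temperature >= threshold?'"""
    return "hot" if temp >= threshold else "cool"

def replica_votes(temp, base_threshold=25.0, n_replicas=20, seed=0):
    """Votes of replica trees whose split threshold is the original
    value multiplied by a random factor drawn from N(1.0, 0.3)."""
    rng = random.Random(seed)
    return [tree_classify(temp, base_threshold * rng.gauss(1.0, 0.3))
            for _ in range(n_replicas)]

# A borderline temperature tends to split the replica votes between the
# classes, while a clearly hot one does not.
votes = replica_votes(24.8)
```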
  • each component of each device shown in the figures is a functional concept and does not necessarily have to be physically configured as shown. That is, the specific form of distribution and integration of each device is not limited to the one shown in the figures; all or a part thereof can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like. Further, each processing function performed by each device may be realized by a CPU and a program analyzed and executed by the CPU, or may be realized as hardware by wired logic.
  • FIG. 19 is a diagram illustrating a hardware configuration example.
  • the detection device 10 includes a communication interface 10a, an HDD (Hard Disk Drive) 10b, a memory 10c, and a processor 10d. Further, the parts shown in FIG. 19 are connected to each other by a bus or the like.
  • the communication interface 10a is a network interface card or the like, and communicates with other servers.
  • the HDD 10b stores a program and a DB that operate the functions shown in FIG.
  • the processor 10d is a hardware circuit that runs a process executing each function described with reference to FIG. 2 and the like, by reading a program that executes the same processing as each processing unit shown in FIG. 2 from the HDD 10b or the like and expanding it into the memory 10c. That is, this process executes the same functions as the processing units of the detection device 10. Specifically, the processor 10d reads a program having the same functions as the learning unit 151, the generation unit 152, the calculation unit 153, the detection unit 154, and the display control unit 155 from the HDD 10b or the like, and executes a process that performs the same processing as these units.
  • the detection device 10 operates as an information processing device that executes the detection method or the display method by reading and executing the program. Further, the detection device 10 can realize the same function as that of the above-described embodiment by reading the program from the recording medium by the medium reading device and executing the read program.
  • the program referred to in the above embodiment is not limited to being executed by the detection device 10.
  • the present invention can be similarly applied when another computer or server executes a program, or when they execute a program in cooperation with each other.
  • This program can be distributed via networks such as the Internet.
  • this program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk (FD), a CD-ROM, an MO (Magneto-Optical disk), or a DVD (Digital Versatile Disc), and can be executed by being read from the recording medium by a computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present invention relates to an information processing device that acquires first operational data to be input to a model. Based on the acquired first operational data, the information processing device calculates indices indicating the magnitude of the change in the model's output caused by a temporal change in the tendency of the operational data. From the calculated indices, the information processing device identifies the change in the index over a time series. The information processing device determines whether a concept drift requiring handling by a user has occurred based on a comparison between the identified change in the index and a preset threshold value. When it is determined that a concept drift requiring handling by the user has occurred, the information processing device displays, on a display screen, information indicating that such a concept drift has occurred.
PCT/JP2019/041584 2019-10-23 2019-10-23 Procédé d'affichage, programme d'affichage et dispositif de traitement d'informations WO2021079445A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021553215A JP7363911B2 (ja) 2019-10-23 2019-10-23 Display method, display program, and information processing device
PCT/JP2019/041584 WO2021079445A1 (fr) 2019-10-23 2019-10-23 Display method, display program, and information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/041584 WO2021079445A1 (fr) 2019-10-23 2019-10-23 Display method, display program, and information processing device

Publications (1)

Publication Number Publication Date
WO2021079445A1 true WO2021079445A1 (fr) 2021-04-29

Family

ID=75620640

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/041584 WO2021079445A1 (fr) 2019-10-23 2019-10-23 Display method, display program, and information processing device

Country Status (2)

Country Link
JP (1) JP7363911B2 (fr)
WO (1) WO2021079445A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210133632A1 (en) * 2019-11-04 2021-05-06 Domino Data Lab, Inc. Systems and methods for model monitoring

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008052577A (ja) * 2006-08-25 2008-03-06 Hitachi High-Technologies Corp Apparatus capable of presenting operational status
JP2014232494A (ja) * 2013-05-30 2014-12-11 Nippon Telegraph And Telephone Corp Document creation support device and operating method thereof
WO2019064892A1 (fr) * 2017-09-26 2019-04-04 Hitachi, Ltd. Manufacturing management assistance system and method
WO2019176480A1 (fr) * 2018-03-14 2019-09-19 Omron Corporation Learning assistance device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6407841B2 (ja) Laser machining head provided with a circulation path for circulating coolant

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LINDSTROM, P. ET AL.: "Drift detection using uncertainty distribution divergence", 2011 IEEE 11TH INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, 11 December 2011 (2011-12-11), pages 604 - 608, XP032100123, Retrieved from the Internet <URL:https://ieeexplore.ieee.org/abstract/document/6137435> [retrieved on 20191213], DOI: 10.1109/ICDMW.2011.70 *

Also Published As

Publication number Publication date
JP7363911B2 (ja) 2023-10-18
JPWO2021079445A1 (fr) 2021-04-29

Similar Documents

Publication Publication Date Title
JP2018195308A (ja) Method and system for data-based optimization of performance evaluation indicators in process and manufacturing industries
CN109726090A (zh) Identification of performance-impacting defects in computing systems
WO2018092747A1 (fr) Trained model generation method, trained model generation device, signal data discrimination method, signal data discrimination device, and signal data discrimination program
US11593648B2 (en) Methods and systems for detection and isolation of bias in predictive models
WO2021079444A1 (fr) Display method, display program, and information processing device
WO2019092931A1 (fr) Discriminative model generation device, method, and program
CN110969200A (zh) Image object detection model training method and device based on consistency negative samples
CN116453438B (zh) Display screen parameter detection method, apparatus, device, and storage medium
US20220222545A1 (en) Generation method, non-transitory computer-readable storage medium, and information processing device
WO2021079445A1 (fr) Display method, display program, and information processing device
WO2021079447A1 (fr) Display method, display program, and information processing device
WO2021079446A1 (fr) Display method, display program, and information processing device
JP2019067299A (ja) Label estimation device and label estimation program
Karimi-Haghighi et al. Predicting early dropout: Calibration and algorithmic fairness considerations
JP6988995B2 (ja) Image generation device, image generation method, and image generation program
CN117669384A (zh) IoT-based intelligent monitoring method and system for temperature sensor production
WO2021079443A1 (fr) Detection method, detection program, and detection device
JP7212292B2 (ja) Learning device, learning method, and learning program
US20230385690A1 (en) Computer-readable recording medium storing determination program, determination apparatus, and method of determining
US20220230028A1 (en) Determination method, non-transitory computer-readable storage medium, and information processing device
US20220222580A1 (en) Deterioration detection method, non-transitory computer-readable storage medium, and information processing device
JPWO2018235841A1 (ja) Graph structure analysis device, graph structure analysis method, and program
Unceta et al. Using copies to remove sensitive data: A case study on fair superhero alignment prediction
JP7400827B2 (ja) Detection method, detection program, and information processing device
KR20220141220A (ko) Machine-learning-based interactive visual monitoring tool for high-dimensional data sets across multiple KPIs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19950122

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021553215

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19950122

Country of ref document: EP

Kind code of ref document: A1