WO2005055135A1 - Bayesian network approximation device - Google Patents
- Publication number
- WO2005055135A1 (PCT/JP2004/017842)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- validity
- experience
- record
- input
- bayesian network
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N7/00—Computing arrangements based on specific mathematical models
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
Definitions
- the present invention relates to a Bayesian network approximation processing device that performs Bayesian network approximation processing.
- a processing method called a Bayesian network may be used to perform processing for various events that occur in a machine, an apparatus, a system, or the like.
- A Bayesian network is a probabilistic model with a graph structure in which random variables are represented by nodes, and variables with dependencies such as causal relationships or correlations are connected by links. The model is represented by a directed acyclic graph: each link has a direction, and no path through the links forms a cycle (Bayesian networks are described in Non-Patent Document 1 below).
- Patent Documents 1 and 2 below disclose inventions using a Bayesian network for an automatic diagnosis system.
- Patent Document 1 JP 2001-75808 A
- Patent Document 2 JP 2002-318691 A
- Non-Patent Document 1 Yoichi Motomura, “Probability Network and Its Application to Knowledge Information Processing”, [online], January 24, 2001, Internet URL:
- The present inventor has developed a scheme in which a record composed of the current input values (an input record) is compared with records (experience records) in which, for various past events, an input value field to the system and an appropriate output value field corresponding thereto are combined.
- The similarity between the input record and each experience record is calculated as a validity, the experience records are classified by validity, and the output value fields of the experience records with high validity are output as output values, thereby approximating the processing of a Bayesian network.
- When the Bayesian network approximation processing device is implemented in hardware, the processing speed is increased, and processing similar to that of a Bayesian network becomes possible even on small-scale hardware resources.
- The invention according to claim 1 is a Bayesian network approximation processing device for performing Bayesian network approximation processing, comprising: sensitivity data input accepting means for accepting input of sensitivity data, which is data for assigning a weight to each field; validity reference data input receiving means for receiving input of validity reference data; input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input receiving means for acquiring a preset experience record consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding thereto; validity calculating means for calculating, for each experience record, the sum of the products of the input record and that experience record, weighted by the sensitivity data given for each field, as the validity of that experience record; validity classifying means for classifying the experience records by validity based on the validity reference data and calculating output data from the values of the output value fields in the classified experience records; and output means for receiving the output data from the validity classifying means and outputting it.
- The invention according to claim 2 is a Bayesian network approximation processing device for performing Bayesian network approximation processing, comprising: sensitivity data input accepting means for accepting input of sensitivity data, which is data for assigning a weight to each field; validity reference data input receiving means for receiving input of validity reference data; input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input receiving means for acquiring a preset experience record consisting of an input value field and an output value field corresponding thereto; validity calculating means for calculating the validity of each experience record based on Equation 1, where NF is the number of input value fields in the experience record and NI is the number of elements in each input value field; validity classifying means for classifying the experience records by validity based on the validity reference data and calculating output data from the values of the output value fields in the classified experience records; and output means for receiving the output data from the validity classifying means and outputting it.
- The Bayesian network approximation processing device has at least data storage means for storing the sensitivity data, the validity reference data, and the input record; the validity calculating means stores the calculated validity of each experience record in the data storage means; and the validity classifying means, when classifying the experience records for each item of validity reference data, acquires each validity stored in the data storage means and then classifies the experience records.
- The invention according to claim 10 is a Bayesian network approximation processing device for performing Bayesian network approximation processing, comprising: sensitivity data input accepting means for accepting input of sensitivity data, which is data for assigning a weight to each field; input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input receiving means for acquiring a preset experience record consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding thereto; validity calculating means for calculating, for each experience record, the sum of the products of the input record and that experience record, weighted by the sensitivity data given for those fields, as the validity of that experience record; validity classifying means for classifying the experience records by validity based on a predetermined criterion and calculating output data from the values of the output value fields in the classified experience records; and output means for receiving the output data from the validity classifying means and outputting it.
- The invention of claim 11 is a Bayesian network approximation processing device for performing Bayesian network approximation processing, comprising: sensitivity data input receiving means for receiving input of sensitivity data, which is data for assigning a weight to each field; input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device; and, with a preset input value field to the Bayesian network approximation processing device and an output value field corresponding thereto,
- Experience record input receiving means for acquiring an experience record
- NF is the number of input value fields in the experience record
- NI is the number of elements of the input value field
- validity calculating means for calculating the validity of each experience record based on Equation 1; validity classifying means for classifying the experience records by validity based on a predetermined criterion and calculating output data from the values of the output value fields in the classified experience records; and output means for receiving the output data from the validity classifying means and outputting it.
- The invention according to claim 17 is characterized in that the Bayesian network approximation processing device has at least data storage means for storing the sensitivity data and the input record; the validity calculating means stores the validity of each experience record in the data storage means; and the validity classifying means, when classifying the experience records, acquires each validity stored in the data storage means and then classifies the experience records.
- According to the present invention, it is not necessary to store all the values of all the probability functions constituting the Bayesian network, as in the related art; by calculating the validity of each experience record, an experience record can be selected automatically and output data calculated from it.
- Only the appropriate output values for past events need to be held, so the required resources are reduced, and the same processing as a Bayesian network can be performed even with smaller resources than before.
- The average value of the output value fields in the experience records whose validity, as calculated by the validity calculating means, is higher than the validity reference data can be used as the output data.
- Alternatively, the average value of the output value fields in the experience records whose validity, as calculated by the validity calculating means, is higher than the validity reference data and which correspond to the predetermined top n ranks may be used as the output data.
- The experience records may also be classified for each item of validity reference data based on the validity calculated by the validity calculating means, and the value of the output value field of the experience record classified into the highest validity criterion may be used as the output data.
- When there are a plurality of experience records classified into the highest validity reference data, the average value of the output value fields in the experience records classified there may be used as the output data; as described in claim 7, the value of the output value field in the experience record with the highest validity among them may be used as the output data; or, as described in claim 8, the value of the output value field in an arbitrary experience record among those classified into the highest validity reference data may be used as the output data.
- Sorting by validity may be performed, and the value of the output value field in the experience record with the highest validity may be used as the output data.
- Sorting by validity may be performed, and the average value of the output value fields in the experience records corresponding to the predetermined top n ranks may be used as the output data.
- Sorting by validity may be performed, and the values of the output value fields in the experience records corresponding to the predetermined top n ranks may be used as the output data.
- Alternatively, the value of the output value field in an arbitrary experience record among the experience records corresponding to the predetermined top n ranks may be used as the output data.
- The output value field of each experience record selected as in claims 12 to 15 may also be weighted by the validity of that experience record, and the weighted average value may be used as the output data.
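As a rough sketch of the validity-weighted averaging described above (the function name and the fixed array sizes are illustrative assumptions, not taken from the patent; integer division stands in for whatever rounding a real implementation would choose):

```c
#define NSEL 3  /* number of selected experience records (illustrative) */
#define NI 4    /* elements per output value field (illustrative) */

/* Validity-weighted average of the selected records' output value fields:
   out[j] = sum_r(validity[r] * fields[r][j]) / sum_r(validity[r]). */
void weighted_output(long validity[NSEL], int fields[NSEL][NI], int out[NI])
{
    long wsum = 0;
    for (int r = 0; r < NSEL; r++)
        wsum += validity[r];
    for (int j = 0; j < NI; j++) {
        long acc = 0;
        for (int r = 0; r < NSEL; r++)
            acc += validity[r] * fields[r][j];
        out[j] = (int)(acc / (wsum ? wsum : 1));  /* guard against all-zero weights */
    }
}
```

A record with twice the validity thus contributes twice as strongly to each element of the output distribution.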
- The Bayesian network approximation processing device of the present invention is preferably implemented in hardware. In that case, expressing the probability distribution of each field as an integer array rather than a decimal array is better suited to hardware processing, so an integer array is preferable.
- When the Bayesian network approximation processing device is implemented in hardware, as in claim 19, it is preferably incorporated directly into a control chip or the like as a circuit, incorporated into an external coprocessor for an embedded CPU, incorporated into the hardware of an expansion board, built into a mobile terminal as a microcontroller, or provided as an all-in-one product for small-scale control devices.
- The effects of the invention include, but are not limited to, the following.
- According to the present invention, it is not necessary to create a table containing all the values of every event, as a Bayesian network requires.
- Instead, similar experience records are extracted from a table (the experience records) holding only a part of those values, so fewer resources are required than in the related art. Therefore, even small-scale hardware resources can perform Bayesian network approximation processing.
- Since this method conforms to burst transfer, in which multiple data are transferred at once by specifying a single address, high-speed processing can be obtained.
- When an address is sent to the memory, data are returned continuously after a fixed latency and can be received without waiting time, enabling high-speed processing.
- Because the present invention is designed so that the operation is performed in synchronization with burst transfer, the operation is performed on the spot each time one piece of data is input, and the computation completes in only the time required to read the memory; therefore, high-speed processing can be realized.
- FIG. 1 is a system configuration diagram showing an example of a system configuration of a Bayesian network approximation processing device.
- FIG. 2 is a flowchart illustrating an example of a processing flow of a Bayesian network approximation processing device.
- FIG. 3 is a conceptual diagram showing a concept of processing of a Bayesian network approximation processing device.
- FIG. 4 is a conceptual diagram in the case where a Bayesian network approximation processing device is incorporated in hardware.
- FIG. 5 is a diagram showing examples of validity reference data, sensitivity data, input records, and experience records.
- FIG. 6 is an example of fields of oil temperature and oil amount.
- FIG. 7 is an example of an experience record.
- FIG. 8 is an example of sensitivity data.
- FIG. 15 is a table showing the validity of the experience record 11E in Example 2.
- FIG. 16 is a table showing the output value fields of the experience records classified according to the validity reference data, and their average value, in Example 2.
- FIG. 20 is a table showing the validity of the experience record 11C in Example 3.
- FIG. 21 is a table showing the validity of the experience record 11D in Example 3.
- FIG. 22 is a table showing the validity of the experience record 11E in Example 3.
- FIG. 23 is a table showing the output value fields of the experience records classified according to the validity reference data, and their average value, in Example 3.
- FIG. 24 is an example of a case where S140 and S150 are represented in C language.
- FIG. 25 is an example of another system configuration of the Bayesian network approximation processing device.
- FIG. 26 is a diagram showing the concept of high-speed processing when a Bayesian network approximation processing device is implemented as hardware.
- The Bayesian network approximation processing device 1 comprises sensitivity data input receiving means 2, validity reference data input receiving means 3, input record input receiving means 4, experience record input receiving means 5, data storage means 6, validity calculating means 7, validity classifying means 8, and output means 9.
- The Bayesian network approximation processing device 1 performs its processing based on an input record 10 (described later) acquired from an external sensor or the like and preset experience records 11 (described later), and outputs the result as output data 12, which is used, for example, to control a motor or various control devices.
- A field is probability distribution data corresponding to one variable.
- For example, a field expresses the input value from a temperature sensor or the output value to a motor as a probability distribution.
- The probability distribution is an integer array in which the probability of each of a plurality of representative values is represented by an integer. For example, if the probability of “up to 20 °C” is 20%, the probability of “20 °C–30 °C” is 70%, the probability of “30 °C–40 °C” is 10%, and the probability of “40 °C and above” is 0%, then the input value field from the temperature sensor is the integer array {20, 70, 10, 0}.
- Since this array is originally a probability distribution, it would naturally be represented with decimal values. However, when the Bayesian network approximation processing device 1 is provided as a circuit in hardware such as a control chip, integer processing is more efficient, so each probability is multiplied by 100 and used as an element of the integer array above. An array of decimal numbers may of course be used instead of the integer array multiplied by 100.
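The field representation described above can be sketched in C as follows (the type name `Field`, the helper function, and the fixed element count are illustrative assumptions; only the ×100 integer scaling convention comes from the text):

```c
#define NI 4  /* number of representative value ranges per field (illustrative) */

/* A field holds one variable's probability distribution as an integer
   array: each probability is multiplied by 100, as described above.
   E.g. the temperature-sensor field {up to 20C, 20-30C, 30-40C, 40C-}
   with probabilities 20%, 70%, 10%, 0% becomes {20, 70, 10, 0}. */
typedef struct {
    int p[NI];  /* probabilities x 100; a well-formed field sums to 100 */
} Field;

/* Returns 1 if the scaled distribution sums to 100, else 0. */
int field_is_valid(const Field *f)
{
    int sum = 0;
    for (int i = 0; i < NI; i++)
        sum += f->p[i];
    return sum == 100;
}
```

For example, `Field temp = {{20, 70, 10, 0}};` models the temperature-sensor field from the text.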
- A record is a collection of a plurality of fields corresponding to one event. For example, at a certain time, the “input value field from the temperature sensor”, “input value field from the humidity sensor”, “input value field from the human sensor”, “output value field to the heater”, and so on are treated together as one record. Records consisting of the same fields can be compared with each other; in the Bayesian network approximation processing device 1 of the present invention, the current input values to the processing device form the input record 10, recorded past events form the experience records 11, and an appropriate experience record 11 is selected by comparing the records.
- The output data 12 is the final output from the Bayesian network approximation processing device 1 to the outside. Selecting an appropriate experience record 11 from the past experience records 11 (described later), calculating the output data 12 from it, and outputting it is the function of the Bayesian network approximation processing device 1.
- the input record 10 is a record indicating the current input to the Bayesian network approximation processing device 1 obtained from a sensor or the like.
- the experience record 11 is a record in which an input value field to the Bayesian network approximation processing device 1 and an appropriate output value field corresponding thereto are set in advance for various past events.
- the Bayesian network approximation processor 1 selects an appropriate experience record 11 by comparing a plurality of experience records 11 with the input record 10, and obtains an output value field. Therefore, as the number of experience records 11 increases, the result can be output with higher accuracy.
- The sensitivity data input accepting means 2 is a means for accepting (receiving; the same applies hereinafter) the input of sensitivity data to the Bayesian network approximation processing device 1 and storing the sensitivity data in the data storage means 6.
- Sensitivity data is data that specifies how much weight is given to each field when comparing records. For each field, the sensitivity data has a number of elements obtained by multiplying the number of elements of the array indicating the probability distribution of the corresponding field of the input record 10 by the number of elements of the array of the probability distribution of the corresponding field of the experience record 11.
- For example, if the input value field is a 4-element array such as {20, 70, 10, 0} and the corresponding field of the experience record 11 is also a 4-element array, the sensitivity data for that field consists of 16 (4 × 4) elements. Sensitivity data is input for each field.
- The validity reference data input receiving means 3 is a means for receiving the input of the validity reference data to the Bayesian network approximation processing device 1 and storing it in the data storage means 6.
- The validity reference data is a numerical value that specifies the validity (described later) required of the output data 12 to be obtained. The output value fields in the experience records 11 whose validity is higher than the validity reference data can thus be obtained as the output data 12.
- the experience records 11 are classified according to the validity, and an output value field is obtained for each classification.
- the validity is a numerical value indicating the degree of similarity when comparing the input record 10 with a certain experience record 11.
- The Bayesian network approximation processing device 1 collects the output value fields of the experience records 11 with high validity and calculates the output value from them.
- Since the array of the input record 10 is a 1 × NI array, the array of the experience record 11 is an NI × 1 array, and the sensitivity data is an NI × NI array, the validity is calculated by Equation 1. The calculation of the validity is performed by the validity calculating means 7, and the calculation using Equation 1 is described later.
- NF is the number of fields
- NI is the number of elements in the field
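From the array shapes given above, the validity of one experience record is the sum, over the NF fields, of (1 × NI input row) × (NI × NI sensitivity matrix) × (NI × 1 experience column). A minimal C sketch of that computation (the function name and the fixed NF/NI values are illustrative assumptions; the patent's actual Equation 1 and FIG. 24 listing are not reproduced here):

```c
#define NF 2  /* number of fields (illustrative) */
#define NI 4  /* number of elements per field (illustrative) */

/* Validity per the description of Equation 1:
   v = sum_f sum_i sum_j in[f][i] * sens[f][i][j] * rec[f][j],
   i.e. a sensitivity-weighted sum of products of the input record's
   and the experience record's probability distributions. */
long validity(int in[NF][NI],        /* input record 10      */
              int sens[NF][NI][NI],  /* sensitivity data     */
              int rec[NF][NI])       /* experience record 11 */
{
    long v = 0;
    for (int f = 0; f < NF; f++)
        for (int i = 0; i < NI; i++)
            for (int j = 0; j < NI; j++)
                v += (long)in[f][i] * sens[f][i][j] * rec[f][j];
    return v;
}
```

With an identity sensitivity matrix this reduces to the plain dot product of the two distributions per field; off-diagonal sensitivity elements implement the “broad base” weighting of Fig. 5 (a2).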
- The input record input receiving means 4 is a means for receiving, as input to the Bayesian network approximation processing device 1, the input record 10 acquired outside the device, for example from a sensor, and storing it in the data storage means 6.
- The experience record input accepting means 5 is a means for acquiring the preset experience records 11 in the Bayesian network approximation processing device 1 and sending them to the validity calculating means 7.
- The data storage means 6 is a means, such as a register, cache, or memory, for storing the sensitivity data received by the sensitivity data input receiving means 2, the validity reference data received by the validity reference data input receiving means 3, and the input record 10 received by the input record input receiving means 4.
- The present invention is not limited to these; any means capable of storing data may be used.
- The validity calculating means 7 is a means for calculating, for each experience record 11, the sum of the products of the input record 10 and that experience record 11, weighted by the sensitivity data given for each field, as the validity of that experience record 11.
- The validity classifying means 8 classifies the validities calculated by the validity calculating means 7 for each item of validity reference data and sends the value of the output value field in an experience record 11 with high validity (the higher the validity, the more appropriate the experience record 11; preferably the experience record 11 classified into the highest validity criterion) to the output means 9 as the output data 12. If a plurality of experience records 11 are classified into the highest validity criterion, the average value of the output value fields of the experience records 11 classified there may be used as the output data 12.
- FIG. 3 shows a conceptual diagram of the processing of the Bayesian network approximation processing device 1 of the present invention.
- The validity calculating means 7 calculates the validity of each experience record 11 using Equation 1 above, and the validity classifying means 8 classifies the validities for each item of validity reference data. Then, the value of the output value field in the experience record 11 classified into the highest validity criterion is sent to the output means 9 as the output data 12. The classification may also be performed by sorting by validity. When a plurality of experience records 11 are classified there, the average value of their output value fields is sent to the output means 9 as the output data 12.
- In general, the sensitivity data is often fixed, and a large number of experience records 11 are compared for each input record 10, so the frequency at which the values change increases in the order of sensitivity data, input record 10, and experience record 11.
- Therefore, the product of the sensitivity data and the input record 10 may be calculated first and stored as intermediate data in the data storage means 6 or the like, and the validity may then be calculated by multiplying the stored intermediate data by each element of the experience record 11. This is advantageous in terms of processing speed and circuit scale.
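The intermediate-data optimization just described can be sketched as follows (function names and the fixed NF/NI values are illustrative assumptions): the (input row) × (sensitivity matrix) product is computed once per input record, and each experience record then costs only one dot product per field.

```c
#define NF 2  /* number of fields (illustrative) */
#define NI 4  /* number of elements per field (illustrative) */

/* Step 1: inter[f][j] = sum_i in[f][i] * sens[f][i][j], computed once
   per input record and stored as intermediate data. */
void precompute_intermediate(int in[NF][NI], int sens[NF][NI][NI],
                             long inter[NF][NI])
{
    for (int f = 0; f < NF; f++)
        for (int j = 0; j < NI; j++) {
            inter[f][j] = 0;
            for (int i = 0; i < NI; i++)
                inter[f][j] += (long)in[f][i] * sens[f][i][j];
        }
}

/* Step 2: reuse the intermediate data for every experience record. */
long validity_from_intermediate(long inter[NF][NI], int rec[NF][NI])
{
    long v = 0;
    for (int f = 0; f < NF; f++)
        for (int j = 0; j < NI; j++)
            v += inter[f][j] * rec[f][j];
    return v;
}
```

This matches the observation in the text: the expensive NI × NI multiplication moves out of the per-experience-record loop.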
- The output means 9 is a means for outputting the output data 12 from the validity classifying means 8 to the outside of the Bayesian network approximation processing device 1 as the processing result.
- The Bayesian network approximation processing device 1 of the present invention can be implemented in either hardware or software, but is particularly effective when implemented in hardware.
- The Bayesian network approximation processing device 1 can be incorporated directly into a control chip or the like as a circuit, incorporated into an external coprocessor for an embedded CPU, incorporated into expansion board hardware, or built in as a microcontroller for the ubiquitous field (for example, in mobile terminals such as mobile phones, PHS handsets, and PDAs). It is also preferably incorporated as an all-in-one product for small-scale control devices.
- FIG. 4 shows a conceptual diagram when the Bayesian network approximation processing device 1 is incorporated in hardware.
- Fig. 4 (a) shows the case where it is used as an external coprocessor for a CPU,
- Fig. 4 (b) shows the case where it is built into an expansion board,
- Fig. 4 (c) shows the case where it is built in as a microcontroller for the ubiquitous field, and
- Fig. 4 (d) shows the case where it is incorporated as an all-in-one product for small-scale control devices.
- In FIG. 4 (a), the arithmetic unit of the Bayesian network approximation processing device 1 is connected to the CPU, and the experience records 11 and the input record 10 are received as input from the CPU. The Bayesian network approximation processing device 1 is therefore implemented on the chip of the arithmetic unit.
- In FIG. 4 (b), a controller, an SDRAM, and the arithmetic unit of the Bayesian network approximation processing device 1 are provided on an expansion board, and the host writes the experience records 11 to the SDRAM through the controller.
- The controller sets up the arithmetic unit and transfers the experience records 11 from the SDRAM to the arithmetic unit, and the arithmetic unit performs the Bayesian network approximation processing.
- Meanwhile, the host can perform other processing, and the expansion board has the advantage of being able to operate at high speed independently of the host.
- In FIG. 4 (c), the method of FIG. 4 (a) is combined with that of FIG. 4 (b) in order to speed up the configuration of FIG. 4 (a): an SDRAM is connected externally to the arithmetic unit, which is also connected to the CPU.
- The SDRAM is dedicated to the arithmetic unit and can be accessed from the CPU only through the controller. This also has the advantage that the SDRAM and the arithmetic unit can operate independently of the host.
- In FIG. 4 (d), the controller of FIG. 4 (c) is strengthened so as to function as a one-chip microcomputer and control the entire device.
- As the arithmetic unit, there is, for example, the KAC-02A.
- As the controller, there are, for example, ARM's ARM922T, ARM926EJ-S, ARM1026EJ-S, ARM1136J(F)-S, ARM7TDMI, ARM7TDMI-S, SC100, SC200, ARM7EJ-S, ARM946E-S, and ARM966E-S, as well as Renesas Technology's M32R/E series (M32102S6FP, M32104S6FP, M32121FCAWG, M32121MCB-XXXWG, etc.) and M32R/ECU series (32170 group (M32170F3VFP, M32170F4VFP, M32170F6VFP), 32171 group (M32171F2VFP, M32171F3VFP, M32171F4VFP), 32172 group (M32172F2VFP, M32172F3VFP), 32174 group (M32174F4VFP), 32176 group (M32176F2VFP, M32176F2VWG, M32176F2TFP, M32176F2TWG, M32176F3VFP, M32176F3VWG, M32176F3TFP, M32176F3TWG, M32176F4VFP, M32176F4VWG, M32176F4TFP, M32176F4TWG), 32180 group (M32180F8TFP, M32180F8UFP, M32180F8VFP), and 32182 group (M32182F3TFP, M32182F3UFP, M32182F3VFP, M32182F8TFP, M32182F8UFP, M32182F8VFP)).
- First, the Bayesian network approximation processing device 1 receives the input of sensitivity data via the sensitivity data input receiving means 2 and stores it in the data storage means 6 (S100).
- Examples of the sensitivity data accepted at this time are shown in Fig. 5 (a1) to Fig. 5 (a3).
- Fig. 5 (a1) and Fig. 5 (a2) show cases where the sensitivity data is set symmetrically, and Fig. 5 (a3) shows an example where the sensitivity data is set asymmetrically.
- The sensitivity data indicates how much the similarity of each field affects the overall similarity. For example, if the sensitivity data of field A is set to twice that of field B, field A contributes more to the validity, which specifies that the similarity of field A is more important than that of field B. Such sensitivity data can be used to emphasize fields whose values fluctuate little, or to de-emphasize fields whose values fluctuate greatly. By setting the sensitivity data with a broad base as shown in Fig. 5 (a2), it is also possible to specify that a field is valid even if its values are not exactly the same; this is effective in that data containing a large noise component can absorb fluctuations. Furthermore, when the sensitivity data is set asymmetrically as shown in Fig. 5 (a3), it can be set, for example, to be valid when the input record 10 side is larger than the experience record 11 side.
- Any numerical value (for example, a negative number) may be used as a reference.
- Next, the input of the validity reference data for classifying the experience records 11 is received by the validity reference data input receiving means 3 and stored in the data storage means 6 (S110).
- An example of the validity reference data is shown in Fig. 5 (b).
- Either the input of the sensitivity data (S100) or the input of the validity reference data (S110) may be performed first.
- After receiving the input of the sensitivity data and the validity reference data, the Bayesian network approximation processing device 1 receives the input record 10 acquired from outside the device, such as from a sensor, via the input record input receiving means 4 and stores it in the data storage means 6 (S120).
- FIG. 5 (c) shows an example of the input record 10
- FIG. 5 (d) shows an example of the experience record 11.
- In the input record 10 of Fig. 5 (c), for example, “the probability that the value is 1 is 70% for temperature, 10% for humidity, and 10% for wind speed”, “the probability that the value is 2 is 10% for temperature, 20% for humidity, and 0% for wind speed”, and “the probability that the value is 3 is 20% for temperature, 70% for humidity, and 0% for wind speed”.
- Next, the experience record input receiving means 5 acquires the experience records 11 one by one (S130) and sends them one by one to the validity calculating means 7. The validity calculating means 7 then calculates the validity of each experience record 11 from the input record 10 and the sensitivity data stored in the data storage means 6, using Equation 1 (S140).
- Then, the validity classifying means 8 classifies the validity of each experience record 11, calculated with Equation 1, into the validity reference data received in S110, and sets the value of the output value field in the experience record 11 classified as the highest validity reference data as the output data 12 (S150). If a plurality of experience records 11 are classified as the highest validity reference data, the average value of their output value fields is calculated and used as the output data 12.
- The output data 12 determined by the validity classifying means 8 in S150 is sent to the output means 9, and the output means 9 outputs it as the output value (S160).
- the processing flow of the Bayesian network approximation processing device 1 will be described using a specific example. It is assumed that a failure prediction is performed as a model to be processed by the Bayesian network approximation processing device 1, and that two input values are an oil temperature and an oil amount. That is, there are two input records 10 of oil temperature and oil quantity, and the output is output data 12 that is useful for failure prediction.
- As shown in Fig. 6(a), the oil temperature field takes the value 0 from -20 °C to 20 °C, the value 1 from 20 °C to 40 °C, the value 2 from 40 °C to 60 °C, and the value 3 above 60 °C.
- The oil amount field takes the value 0 above 800 ml, the value 1 from 600 ml to 800 ml, the value 2 from 400 ml to 600 ml, and the value 3 below 400 ml.
- Here the values are assigned so that a larger value indicates a more abnormal state, but they may be assigned arbitrarily.
- The failure prediction field takes the value 0 for normal operation, the value 1 for caution, the value 2 for inspection required, and the value 3 for emergency stop.
- FIGS. 7(a) to 7(e) show examples of the experience record 11.
- Figure 7(a) is a sample of the experience record 11, consisting of an input record 10 and its output value field, for a normal event.
- Figure 7(b) is sample 1 of the experience record 11, consisting of an input record 10 and its output value field, for a caution event.
- Fig. 7(c) is sample 2 of the experience record 11, consisting of an input record 10 and its output value field, for a caution event.
- Fig. 7(d) and Fig. 7(e) show the experience records 11 for the remaining events.
- First, the Bayesian network approximation processing device 1 receives the sensitivity data of the oil temperature field and the sensitivity data of the oil amount field shown in FIG. 8, and stores them in the data storage means 6 (S100).
- the validity reference data input receiving means 3 of the Bayesian network approximation processing device 1 receives the input of the validity reference data having the data shown in FIG. 9, and stores it in the data storage means 6 (S 110).
- In this embodiment, a failure of the device is predicted from the oil temperature and the oil amount. Therefore, after accepting the input of the sensitivity data and the validity reference data, the Bayesian network approximation processing device 1 receives the input record 10 shown in Fig. 10, obtained from the oil temperature sensor that detects the oil temperature and the oil amount sensor that detects the oil amount, via the input record input receiving means 4, and stores it in the data storage means 6 (S120).
- the experience record input receiving means 5 acquires each experience record 11 one by one (S 130), and sends the acquired experience records 11 to the effectiveness calculating means 7 one by one.
- the validity calculating means 7 acquires the input record 10 and the sensitivity data stored in the data storage means 6.
- Next, the validity calculating means 7 calculates, for each experience record 11, the sum of the per-field comparisons of the input record 10 and that experience record 11, each multiplied by the sensitivity data given for the field, as that record's validity (that is, it calculates Equation 1) (S140).
- Then, the validity classifying means 8 classifies the validity of each experience record 11 into the validity reference data based on the validity calculated by the validity calculating means 7, and sets the value of the output value field in the experience record 11 classified as the highest validity reference data as the output data 12 to be output by the output means 9 (S150). If a plurality of experience records 11 are classified as the highest validity reference data, the average value of their output value fields is used as the output data 12.
- the processing of S140 and S150 can be expressed as shown in FIG. 24 when expressed in C language.
- First, the input record 10 is compared with the experience record 11A, and the validity is calculated by the validity calculating means 7. The validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in Fig. 11. In this case, the validity of the oil temperature field is 680,000 and the validity of the oil amount field is 300,000. Therefore, the validity of the comparison with the experience record 11A is 980,000.
- the validity of the experience record 11A is stored in the data storage means 6.
- the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
- In this case, the validity of the oil temperature field is 680,000 and the validity of the oil amount field is 50,000. Therefore, the validity of the comparison with the experience record 11B is 730,000.
- the validity of the experience record 11B is stored in the data storage means 6.
- The validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG. In this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 300,000. Therefore, the validity of the comparison with the experience record 11C is 620,000.
- the validity of the experience record 11C is stored in the data storage means 6.
- the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
- In this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 50,000. Therefore, the validity of the comparison with the experience record 11D is 370,000.
- the validity of the experience record 11D is stored in the data storage means 6.
- the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
- the validity of the oil temperature finoled is 40,000 and the validity of the oil amount field is 300,000. Therefore, the effectiveness when compared with the experience record 11E is 3400000.
- the validity of the experience record 11E is stored in the data storage means 6.
- Next, the validity classifying means 8 extracts the validity of each of the experience records 11A to 11E and the validity reference data accepted in S110 from the data storage means 6, classifies each validity into the validity reference data, and sends the value of the output value field in the experience record 11 classified as the highest validity reference data to the output means 9 as the output data 12 (S150). In this case, since only one experience record 11 is classified as the highest validity reference data, it is not necessary to calculate the average value of the output value field.
- The validity of the experience record 11A is 980,000, which corresponds to the value 2 (800,001–1,000,000) of the validity reference data.
- FIG. 16 schematically shows the validities classified for each validity reference data as described above. Since the experience record 11A, classified under the reference value "2", falls under the highest validity reference data, referring to the value of the output value field (the failure prediction value) in this experience record 11A (the experience record 11 of Fig. 7(a)) gives "100, 0, 0, 0". Since this is the only experience record 11 classified under the reference value "2", the value of the output data 12 is also "100, 0, 0, 0". Therefore, the validity classifying means 8 sends "100, 0, 0, 0" to the output means 9 as the output data 12 to be output by the output means 9.
- the experience data, sensitivity data, and effectiveness reference data are the same as those in Example 2 (that is, the experience data is FIG. 7, the sensitivity data is FIG. 8, and the effectiveness reference data is FIG. 9).
- The input record 10 received by the input record input receiving means 4 (S120) and stored in the data storage means 6 is as shown in FIG. Note that S100 and S110 are the same as in the second embodiment.
- the experience record input receiving means 5 acquires each experience record 11 one by one (S130), and sends the acquired experience records 11 one by one to the effectiveness calculation means 7. Send out.
- the validity calculation means 7 acquires the input record 10 and the sensitivity data stored in the data storage means 6.
- Next, the validity calculating means 7 calculates, for each experience record 11, the sum of the per-field comparisons of the input record 10 and that experience record 11, each multiplied by the sensitivity data given for the field, as that record's validity (that is, it calculates Equation 1) (S140).
- Then, the validity classifying means 8 classifies the validity of each experience record 11 into the validity reference data based on the validity calculated by the validity calculating means 7, and sets the value of the output value field in the experience record 11 classified as the highest validity reference data as the output data 12 to be output by the output means 9 (S150). If a plurality of experience records 11 are classified as the highest validity reference data, the average value of their output value fields is calculated and used as the output data 12.
- First, the input record 10 is compared with the experience record 11A, and the validity is calculated by the validity calculating means 7. The validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in Fig. 18. In this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 20,000. Therefore, the validity of the comparison with the experience record 11A is 340,000.
- the validity of the experience record 11A is stored in the data storage means 6.
- the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
- In this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 340,000. Therefore, the validity of the comparison with the experience record 11B is 660,000.
- the validity of the experience record 11B is stored in the data storage means 6.
- the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
- In this case, the validity of the oil temperature field is 680,000 and the validity of the oil amount field is 160,000. Therefore, the validity of the comparison with the experience record 11C is 840,000.
- the validity of the experience record 11C is stored in the data storage means 6.
- the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
- In this case, the validity of the oil temperature field is 680,000 and the validity of the oil amount field is 340,000. Therefore, the validity of the comparison with the experience record 11D is 1,020,000.
- the validity of the experience record 11D is stored in the data storage means 6.
- the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
- In this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 20,000. Therefore, the validity of the comparison with the experience record 11E is 340,000.
- the validity of the experience record 11E is stored in the data storage means 6.
- Next, the validity classifying means 8 extracts the validity of each of the experience records 11A to 11E and the validity reference data received in S110 from the data storage means 6, classifies each validity into the validity reference data, and sends the value of the output value field in the experience record 11 classified as the highest validity reference data to the output means 9 as the output data 12 (S150). In this case, since only one experience record 11 is classified as the highest validity reference data, it is not necessary to calculate the average value of the output value field.
- The validity of the experience record 11A is 340,000, which corresponds to the value 5 (0–200,000) of the validity reference data.
- FIG. 23 schematically shows the validities classified for each validity reference data as described above. Since the experience record 11D, classified under the reference value "1", falls under the highest validity reference data, referring to the value of the output value field (the failure prediction value) in this experience record 11D (the experience record 11 of Fig. 7(d)) gives "0, 0, 100, 0". Since this is the only experience record 11 classified under the reference value "1", the value of the output data 12 is also "0, 0, 100, 0". Therefore, the validity classifying means 8 sends "0, 0, 100, 0" to the output means 9 as the output data 12.
- When the output means 9 receives the output data 12 "0, 0, 100, 0" from the validity classifying means 8, it outputs it. From this output result it can therefore be determined that an inspection is currently required.
- Alternatively, the validity classifying means 8 may classify the validities calculated by the validity calculating means 7 into the validity reference data, and send the value of the output value field in the experience record 11 having the highest validity among the experience records 11 belonging to the highest validity reference data to the output means 9 as the output data 12.
- Alternatively, the validity classifying means 8 may classify the validities calculated by the validity calculating means 7 into the validity reference data, and send the value of the output value field in any one of the plurality of experience records 11 belonging to the highest validity reference data to the output means 9 as the output data 12.
- Further, the validity classifying means 8 may classify the validities calculated by the validity calculating means 7 into the validity reference data, and send, as the output data 12, the average value of the output value fields of a predetermined number n of the highest-validity experience records 11 among the plurality of experience records 11 belonging to the highest validity reference data to the output means 9.
- FIG. 25 shows a case where the Bayesian network approximation processing device 1 is not provided with the validity reference data input receiving means 3.
- The processing other than that of the validity classifying means 8 is the same as in the first to fourth embodiments; accordingly, the processing of S110 in FIG. 2 is unnecessary.
- In this case, the validity classifying means 8 sorts the validities calculated by the validity calculating means 7, and uses the value of the output value field in the experience record 11 with the highest validity as the output data 12.
- Alternatively, instead of using the value of the output value field of the experience record 11 with the highest validity, the validity classifying means 8 may use as the output data 12 the average value of the output value fields of the experience records 11 having the predetermined top n validities, or the value of the output value field of any one of those top n experience records 11. Further, when a plurality of output data 12 are required, the values of the output value fields of the top n experience records 11 may each be used as output data 12.
- In Example 6, when generating the output data 12 in the first to fifth embodiments, the output value field of each experience record 11 may further be weighted according to the validity of that experience record 11 and averaged.
- That is, where xo[r] is the value of the output value field and val[r] the validity of the experience record r, the validity classifying means 8 calculates Σ(xo[r] × val[r]) ÷ Σ(val[r]) and sets it as the output data 12. The validity classifying means 8 then sends the output data 12 to the output means 9, and the output means 9 outputs it.
- The weighted average may also be calculated using the square of the validity rather than the validity itself. This makes it possible to obtain output data 12 that emphasizes the experience records 11 with particularly high validity.
- The functions of the present embodiment may also be realized by supplying a storage medium storing a software program to a system and having a computer of the system read and execute the program stored in the storage medium. In this case, the program itself read from the storage medium implements the functions of the above-described embodiments, and the storage medium storing the program naturally constitutes the present invention.
- Alternatively, each function of the Bayesian network approximation processing device 1 of the present invention may be written into a programmable LSI and operated there. For example, each function of the Bayesian network approximation processing device 1, described in a circuit design language such as HDL, may be written into a programmable logic device such as an FPGA (Field Programmable Gate Array), which then performs the calculations.
- As a storage medium for supplying the program, for example, a magnetic disk, a hard disk, an optical disk, a magneto-optical disk, a magnetic tape, or a nonvolatile memory card can be used.
- The program may be obtained not only from a storage medium but also by downloading it from a network.
- It goes without saying that the functions of the above-described embodiments are realized not only when the computer executes the read program, but also when the operating system or the like running on the computer performs some or all of the actual processing based on the instructions of the program, and that processing realizes the functions of the above-described embodiments.
- Since an SDRAM (Synchronous DRAM) conforms to a scheme of transferring multiple data items at once from a single specified address, using it yields a high-speed processing effect.
- With an SDRAM that performs burst transfer, data can be returned continuously after a fixed latency, and for consecutive addresses the second and subsequent data items can be received without waiting time, enabling high-speed processing. Since the present invention is designed to operate in synchronization with the burst transfer, the calculation is performed on the spot each time one data item is input, and it completes in no more than the time required to read the memory; therefore, high-speed processing can be realized.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-403039 | 2003-12-02 | ||
JP2003403039A JP2005165624A (en) | 2003-12-02 | 2003-12-02 | Bayesian network approximation processor |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005055135A1 true WO2005055135A1 (en) | 2005-06-16 |
Family
ID=34650051
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/017842 WO2005055135A1 (en) | 2003-12-02 | 2004-12-01 | Bayesian network approximation device |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2005165624A (en) |
WO (1) | WO2005055135A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2593757B (en) | 2020-04-02 | 2022-04-06 | Graphcore Ltd | Control of processing node operations |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08129488A (en) * | 1994-09-05 | 1996-05-21 | Toshiba Corp | Inference method and inference system |
JP2001075808A (en) * | 1999-07-14 | 2001-03-23 | Hewlett Packard Co <Hp> | Bayesian network |
- 2003-12-02: JP application JP2003403039A filed; published as JP2005165624A; status: withdrawn (not active)
- 2004-12-01: WO application PCT/JP2004/017842 filed; published as WO2005055135A1; status: application discontinued (not active)
Non-Patent Citations (1)
Title |
---|
KIMURA Y. ET AL: "Bayesian Net Gakushu no Chishiki System heno Oyo", KEISOKU TO SEIGYO, vol. 38, no. 7, 10 July 1999 (1999-07-10), pages 468 - 473, XP002990224 * |
Also Published As
Publication number | Publication date |
---|---|
JP2005165624A (en) | 2005-06-23 |
Legal Events
- AK (Designated states): kind code of ref document: A1; designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW
- AL (Designated countries for regional patents): kind code of ref document: A1; designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG
- NENP (Non-entry into the national phase): ref country code: DE
- WWW (WIPO information: withdrawn in national office): country of ref document: DE
- 121: the EPO has been informed by WIPO that EP was designated in this application
- 122 (PCT application non-entry in European phase): ref document number: 04819849; country of ref document: EP; kind code of ref document: A1
- WWW (WIPO information: withdrawn in national office): ref document number: 4819849; country of ref document: EP