WO2005055135A1 - Bayesian network approximation device - Google Patents

Bayesian network approximation device

Info

Publication number
WO2005055135A1
Authority
WO
WIPO (PCT)
Prior art keywords
validity
experience
record
input
bayesian network
Prior art date
Application number
PCT/JP2004/017842
Other languages
French (fr)
Japanese (ja)
Inventor
Toichiro Yamada
Original Assignee
Inter-Db Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inter-Db Co., Ltd. filed Critical Inter-Db Co., Ltd.
Publication of WO2005055135A1 publication Critical patent/WO2005055135A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00Computing arrangements based on specific mathematical models
    • G06N7/01Probabilistic graphical models, e.g. probabilistic networks

Definitions

  • the present invention relates to a Bayesian network approximation processing device that performs Bayesian network approximation processing.
  • a processing method called a Bayesian network may be used to perform processing for various events that occur in a machine, an apparatus, a system, or the like.
  • a Bayesian network is a probabilistic model with a graph structure in which random variables are represented by nodes and links are drawn between variables that have dependencies such as causal relationships or correlations. The links are directed along the direction of the dependency, and no path followed along the links forms a cycle, so the model is represented by a directed acyclic graph (Bayesian networks are described in Non-Patent Document 1 below).
  • Patent Documents 1 and 2 below disclose inventions using a Bayesian network for an automatic diagnosis system.
  • Patent Document 1 JP 2001-75808 A
  • Patent Document 2 JP-A-2002-318691
  • Non-Patent Document 1: Yoichi Motomura, "Probability Network and Its Application to Knowledge Information Processing", [online], January 24, 2001, Internet <URL: http://staif.aist.go.jp/y.motomura/DS/DS.html/>
  • the present inventor therefore devised a Bayesian network approximation processing device that approximates the processing of a Bayesian network as follows: for a record composed of the current input values (the input record) and records that combine, for various past events, an input value field to the system with an appropriate output value field corresponding to it (experience records), the similarity between the input record and each experience record is calculated as a validity, the experience records are classified by validity, and the output value field of an experience record with high validity is output as the output value.
  • furthermore, by implementing the Bayesian network approximation processing device in hardware, the processing speed is increased and processing that approximates a Bayesian network becomes possible even with small-scale hardware resources.
  • the invention according to claim 1 is a Bayesian network approximation processing device for performing Bayesian network approximation processing, comprising: sensitivity data input receiving means for receiving input of sensitivity data, which is data giving a weight to each field; validity reference data input receiving means for receiving input of validity reference data; input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input receiving means for acquiring preset experience records, each consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding to it; validity calculating means for calculating, for each experience record, the sum of the multiplications of the input record and that experience record weighted by the sensitivity data given for each field, as the validity of that experience record; validity classifying means for classifying the experience records by validity based on the validity reference data and calculating output data from the values of the output value fields in the classified experience records; and output means for receiving the output data from the validity classifying means and outputting it.
  • the invention according to claim 2 is a Bayesian network approximation processing device for performing Bayesian network approximation processing, comprising: sensitivity data input receiving means for receiving input of sensitivity data, which is data giving a weight to each field; validity reference data input receiving means for receiving input of validity reference data; input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input receiving means for acquiring preset experience records, each consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding to it; validity calculating means for calculating the validity of each experience record based on Equation 1, where NF is the number of input value fields in the experience record and NI is the number of elements in each input value field; validity classifying means for classifying the experience records by validity based on the validity reference data and calculating output data from the values of the output value fields in the classified experience records; and output means for receiving the output data from the validity classifying means and outputting it.
  • the Bayesian network approximation processing device may further have at least data storage means for storing the sensitivity data, the validity reference data, and the input record, in which case the validity calculating means stores the calculated validity of each experience record in the data storage means, and the validity classifying means, when classifying the experience records for each item of validity reference data, classifies the experience records after acquiring each validity stored in the data storage means.
  • the invention according to claim 10 is a Bayesian network approximation processing device for performing Bayesian network approximation processing, comprising: sensitivity data input receiving means for receiving input of sensitivity data, which is data giving a weight to each field; input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input receiving means for acquiring preset experience records, each consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding to it; validity calculating means for calculating, for each experience record, the sum of the multiplications of the input record and that experience record weighted by the sensitivity data given for each field, as the validity of that experience record; validity classifying means for classifying the experience records by validity based on a predetermined criterion and calculating output data from the values of the output value fields in the classified experience records; and output means for receiving the output data from the validity classifying means and outputting it.
  • the invention of claim 11 is a Bayesian network approximation processing device for performing Bayesian network approximation processing, comprising: sensitivity data input receiving means for receiving input of sensitivity data, which is data giving a weight to each field; input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input receiving means for acquiring preset experience records, each consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding to it; validity calculating means for calculating the validity of each experience record based on Equation 1, where NF is the number of input value fields in the experience record and NI is the number of elements of the input value field; and validity classifying means for classifying the experience records by validity based on a predetermined criterion and determining the output data from the values of the output value fields in the classified experience records.
  • the invention according to claim 17 is characterized in that the Bayesian network approximation processing device has at least data storage means for storing the sensitivity data and the input record, the validity calculating means stores the validity of each experience record in the data storage means, and the validity classifying means, when classifying the experience records, classifies them after acquiring each validity stored in the data storage means.
  • unlike the related art, it is not necessary to store all values of all Bayes functions constituting the Bayesian network; by calculating a validity for each experience record as in the present invention, an experience record can be selected automatically and output data can be derived from it.
  • as a result, the number of output values that must be held for past events can be reduced, and processing equivalent to a Bayesian network can be performed even with smaller resources than before.
  • an average value of an output value field in an experience record whose validity calculated by the validity calculating means is higher than the validity reference data can be used as the output data.
  • the average value of the output value fields in the experience records whose validity calculated by the validity calculating means is higher than the validity reference data and which correspond to the predetermined top n validities may be used as the output data.
  • the experience records may also be classified for each item of validity reference data based on the validity calculated by the validity calculating means, and the value of the output value field of the experience record classified into the highest validity criterion may be used as the output data.
  • when a plurality of experience records are classified into the highest validity reference data, the average value of the output value fields of the experience records classified there may be used as the output data; as described in claim 7, the value of the output value field in the experience record with the highest validity among them may be used as the output data; or, as described in claim 8, the value of the output value field in an arbitrary experience record among the experience records classified into the highest validity reference data may be used as the output data.
  • the validities may be sorted, and the value of the output value field in the experience record having the highest validity used as the output data.
  • the validities may be sorted, and the average value of the output value fields in the experience records corresponding to the predetermined top n validities used as the output data.
  • the validities may be sorted, and the values of the output value fields in the experience records corresponding to the predetermined top n validities used as the output data.
  • the value of the output value field in an arbitrary experience record among the predetermined top n experience records may be used as the output data.
  • the output value fields of the experience records selected as in claims 12 to 15 may be weighted with the validity of each experience record, and the weighted average used as the output data.
  • the Bayesian network approximation processing device of the present invention is preferably implemented in hardware. In that case, representing the probability distribution of each field as an integer array rather than a decimal array is better suited to hardware processing, so an integer array is preferable.
  • when the Bayesian network approximation processing device is implemented in hardware, as in claim 19, it is preferably incorporated directly in a control chip or the like as a circuit, as an external coprocessor for an embedded CPU, in the hardware of an expansion board, in a mobile terminal as a microcontroller, or as an all-in-one product for small-scale control devices.
  • the effects of the invention include the following:
  • according to the present invention, it is not necessary to create a table holding all values of every event required in a Bayesian network.
  • instead, similar experience records are extracted from a table (the experience records) that holds only some of the values, so fewer resources are required than in the related art; even small-scale hardware resources can therefore perform Bayesian network approximation processing.
  • since this method conforms to the burst-transfer scheme of transferring multiple data at once by specifying a single address, the effect of high-speed processing is obtained.
  • when an address is sent to the memory, data are returned continuously after a fixed latency, and for consecutive addresses the second and subsequent data can be received without waiting time, enabling high-speed processing.
  • since the present invention is designed so that the operation is performed in synchronization with the burst transfer, the calculation is carried out on the spot each time one data item is input, and the calculation completes in only the time required to read the memory, so high-speed processing can be realized.
  • FIG. 1 is a system configuration diagram showing an example of a system configuration of a Bayesian network approximation processing device.
  • FIG. 2 is a flowchart illustrating an example of a processing flow of a Bayesian network approximation processing device.
  • FIG. 3 is a conceptual diagram showing a concept of processing of a Bayesian network approximation processing device.
  • FIG. 4 is a conceptual diagram in the case where a Bayesian network approximation processing device is incorporated in hardware.
  • FIG. 5 is a diagram showing an example of effectiveness reference data, sensitivity data, input records, and experience records.
  • FIG. 6 is an example of fields of oil temperature and oil amount.
  • FIG. 7 is an example of an experience record.
  • FIG. 8 is an example of sensitivity data.
  • FIG. 15 is a table showing the effectiveness of the experience record 11E in the second embodiment.
  • FIG. 16 is a table showing output value fields of each experience record classified according to effectiveness reference data and an average value thereof in Example 2.
  • FIG. 20 is a table showing the validity of the experience record 11C in the third embodiment.
  • FIG. 21 is a table showing the effectiveness of experience records 11D in Example 3.
  • FIG. 22 is a table showing the effectiveness of experience records 11E in Example 3.
  • FIG. 23 is a table showing output value fields of each experience record classified according to effectiveness reference data and an average value thereof in Example 3.
  • FIG. 24 is an example of a case where S140 and S150 are represented in C language.
  • FIG. 25 is an example of another system configuration of the Bayesian network approximation processing device.
  • FIG. 26 is a diagram showing the concept of high-speed processing when a Bayesian network approximation processing device is implemented as hardware.
  • the Bayesian network approximation processing device 1 comprises sensitivity data input receiving means 2, validity reference data input receiving means 3, input record input receiving means 4, experience record input receiving means 5, data storage means 6, validity calculating means 7, validity classifying means 8, and output means 9.
  • the Bayesian network approximation processing device 1 performs its processing based on an input record 10 (described later) acquired from an external sensor or the like and preset experience records 11 (described later), and outputs the result of the processing as output data 12, which is used, for example, to control a motor or various control devices.
  • a field is probability distribution data corresponding to one variable. For example, a field represents the input value from a temperature sensor or the output value to a motor as a probability distribution.
  • the probability distribution is an integer array in which the respective probabilities for a plurality of representative values are represented by integers. For example, if the probability of "-20 °C" is 20%, the probability of "20 °C-30 °C" is 70%, the probability of "30 °C-40 °C" is 10%, and the probability of "40 °C-" is 0%, the input value field from the temperature sensor is the integer array {20, 70, 10, 0}.
  • since this array is originally a probability distribution, it would naturally be represented with decimals; however, because the Bayesian network approximation processing device 1 is provided as a circuit in hardware such as a control chip, integer processing is more efficient, so each probability is multiplied by 100 and used as an element of the array (an array of decimals may of course be used instead of the integer array multiplied by 100). A brief sketch of this representation is given below.
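As a concrete illustration of this integer-array representation, the following minimal C sketch — an illustrative assumption, not code taken from the patent — stores the temperature field described above and checks that its elements sum to 100%:

```c
#include <stdio.h>

#define NI 4 /* number of elements (representative value ranges) in the field */

int main(void)
{
    /* Probability distribution of the temperature input field, with each
       probability multiplied by 100 so that only integers are handled:
       {-20 C, 20-30 C, 30-40 C, 40 C-} -> {20%, 70%, 10%, 0%} */
    int temperature_field[NI] = {20, 70, 10, 0};

    int sum = 0;
    for (int i = 0; i < NI; i++)
        sum += temperature_field[i];

    /* The elements of a well-formed field sum to 100 (i.e. 100%). */
    printf("total probability = %d%%\n", sum);
    return 0;
}
```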
  • a record is a collection of a plurality of fields corresponding to one event. For example, at a certain time, "the input value field from the temperature sensor", "the input value field from the humidity sensor", "the input value field from the human-presence sensor", "the output value field to the heater", and so on are treated together as one record. Records consisting of the same fields can be compared with each other; in the Bayesian network approximation processing device 1 of the present invention, the record of the current input values to the processing device is used as the input record 10, recorded past events are used as the experience records 11, and an appropriate experience record 11 is selected by comparing the records.
  • the output data 12 is a field output from the Bayesian network approximation processing device 1 to the outside. Selecting an appropriate experience record 11 from the past experience records 11 (described later), calculating the output data 12 from it, and outputting the output data 12 is therefore the function of the Bayesian network approximation processing device 1.
  • the input record 10 is a record indicating the current input to the Bayesian network approximation processing device 1 obtained from a sensor or the like.
  • the experience record 11 is a record in which an input value field to the Bayesian network approximation processing device 1 and an appropriate output value field corresponding thereto are set in advance for various past events.
  • the Bayesian network approximation processor 1 selects an appropriate experience record 11 by comparing a plurality of experience records 11 with the input record 10, and obtains an output value field. Therefore, as the number of experience records 11 increases, the result can be output with higher accuracy.
  • the sensitivity data input receiving means 2 is a means for accepting (receiving; the same applies hereinafter) the input of sensitivity data in the Bayesian network approximation processing device 1 and storing it in the data storage means 6.
  • sensitivity data is data that specifies how much weight is given to each field when comparing records. For each field, the sensitivity data has a number of elements obtained by multiplying the number of elements of the array indicating the probability distribution of that field of the input record 10 by the number of elements of the array of the probability distribution of the corresponding field of the experience record 11.
  • for example, if the input value field is a 4-element array such as {20, 70, 10, 0} and the corresponding field of the experience record is also a 4-element array, the sensitivity data for that field consists of 16 elements (4 x 4). The sensitivity data is input for each field.
  • the validity reference data input receiving means 3 is a means for receiving the input of the validity reference data in the Bayesian network approximation processing device 1 and storing it in the data storage means 6.
  • the validity reference data is a numerical value that specifies the validity (described later) of the output data 12 to be obtained. Therefore, the output value field in the experience record 11 having a higher validity than the validity reference data can be obtained as the output data 12.
  • the experience records 11 are classified according to the validity, and an output value field is obtained for each classification.
  • the validity is a numerical value indicating the degree of similarity when comparing the input record 10 with a certain experience record 11.
  • the Bayesian network approximation processing device 1 collects the output value fields of the experience record 11 with high effectiveness and calculates the output value.
  • for each field, the array of the input record 10 is a 1 x NI array, the array of the experience record 11 is an NI x 1 array, and the sensitivity data is an NI x NI array, so the validity is calculated by Equation 1, where NF is the number of fields and NI is the number of elements in a field (a hedged reconstruction of Equation 1 is sketched below). The calculation of the validity is performed by the validity calculating means 7 and is described later.
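Equation 1 itself is not reproduced in this text. Based only on the array shapes just stated (each field of the input record 10 is a 1 × NI array, the sensitivity data for the field is an NI × NI array, and the corresponding field of the experience record 11 is an NI × 1 array, summed over the NF fields), a plausible reconstruction of the validity — an assumption, not the patent's verbatim formula — is:

$$\mathrm{validity} \;=\; \sum_{f=1}^{N_F} \sum_{i=1}^{N_I} \sum_{j=1}^{N_I} x_f[i]\, S_f[i][j]\, e_f[j]$$

where $x_f$ denotes field $f$ of the input record 10, $S_f$ the sensitivity data for that field, and $e_f$ the corresponding field of the experience record 11.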
  • the input record input receiving means 4 is a means for receiving, as input to the Bayesian network approximation processing device 1, an input record 10 acquired outside the device, for example from a sensor, and storing it in the data storage means 6.
  • the experience record input receiving means 5 is a means for acquiring the preset experience records 11 in the Bayesian network approximation processing device 1 and sending them to the validity calculating means 7.
  • the data storage means 6 stores the sensitivity data received by the sensitivity data input receiving means 2, the validity reference data received by the validity reference data input receiving means 3, and the input record 10 received by the input record input receiving means 4.
  • It is a means for storing data, such as a register, a cache, or a memory.
  • the present invention is not limited to these, and any means may be used as long as data can be stored.
  • the validity calculating means 7 is a means that, for each experience record 11, calculates as its validity the sum of the multiplications of the input record 10 and that experience record 11 weighted by the sensitivity data given for each field.
  • the validity classifying means 8 is a means that classifies the validities calculated by the validity calculating means 7 for each item of validity reference data and sends the value of the output value field in an experience record 11 with high validity (the higher the validity, the more appropriate that experience record 11 is likely to be, so preferably an experience record 11 classified into the highest validity criterion) to the output means 9 as the output data 12. If a plurality of experience records 11 are classified into that validity criterion, the average value of the output value fields of the experience records 11 classified there may be used as the output data 12.
  • FIG. 3 shows a conceptual diagram of the processing of the Bayesian network approximation processing device 1 of the present invention.
  • the validity calculating means 7 calculates the validity of each experience record 11 using Equation 1 above, and the validity classifying means 8 classifies the validities for each item of validity reference data. Then, the value of the output value field in the experience record 11 classified into the highest validity criterion is sent to the output means 9 as the output data 12. The classification of the validities may also be performed by sorting them. When a plurality of experience records 11 are classified there, the average value of the output value fields of those experience records 11 is sent to the output means 9 as the output data 12.
  • the sensitivity data is often fixed, and a large number of experience records 11 are input for each input record 10, so the frequency with which the values change increases in the order of sensitivity data, input record 10, and experience record 11.
  • therefore, the multiplication of the sensitivity data and the input record 10 may be calculated first and stored as intermediate data in the data storage means 6 or the like.
  • the validity may then be calculated by multiplying the intermediate data stored in the data storage means 6 or the like by each element of the experience record 11, which is advantageous in terms of processing speed and circuit scale (a brief sketch of this precomputation follows below).
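The following C sketch illustrates this reordering: the sensitivity-weighted input record is computed once as intermediate data and then reused for every experience record. The function and variable names and the fixed NF × NI layout are assumptions for illustration only; they are not taken from the patent.

```c
#define NF 2   /* number of fields (e.g. oil temperature and oil amount) */
#define NI 4   /* number of elements per field */

/* Precompute intermediate[f][j] = sum_i input[f][i] * sensitivity[f][i][j].
   This depends only on the sensitivity data and the input record 10, so it
   is done once per input record. */
void precompute_intermediate(const int input[NF][NI],
                             const int sensitivity[NF][NI][NI],
                             long intermediate[NF][NI])
{
    for (int f = 0; f < NF; f++)
        for (int j = 0; j < NI; j++) {
            long acc = 0;
            for (int i = 0; i < NI; i++)
                acc += (long)input[f][i] * sensitivity[f][i][j];
            intermediate[f][j] = acc;
        }
}

/* Validity of one experience record 11: multiply the intermediate data and the
   experience record element by element and sum over all fields. */
long validity_from_intermediate(const long intermediate[NF][NI],
                                const int experience[NF][NI])
{
    long validity = 0;
    for (int f = 0; f < NF; f++)
        for (int j = 0; j < NI; j++)
            validity += intermediate[f][j] * experience[f][j];
    return validity;
}
```

Only the second function has to be executed per experience record, which is what makes the precomputation advantageous for speed and circuit scale.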
  • the output unit 9 is a unit that outputs the output data 12 classified by the effectiveness classifying unit 8 to the outside of the Bayesian network approximation processing device 1 as a processing result.
  • the Bayesian network approximation processing device 1 of the present invention can be implemented in either hardware or software, but it is particularly effective when implemented in hardware.
  • the Bayesian network approximation processing device 1 can be incorporated directly into a control chip or the like as a circuit, incorporated into an external coprocessor for an embedded CPU, incorporated into expansion board hardware, built as a microcontroller into mobile terminals in the ubiquitous field (for example, telephones, PHS, and PDAs), or preferably incorporated as an all-in-one product for small-scale control devices.
  • FIG. 4 shows a conceptual diagram when the Bayesian network approximation processing device 1 is incorporated in hardware.
  • Fig. 4 (a) shows the case where an external coprocessor is used for the CPU, and
  • Fig. 4 (b) shows the case where it is built into an expansion board.
  • Fig. 4 (c) shows the case where it is built in as a microcontroller for the ubiquitous field, and
  • Fig. 4 (d) shows the case where it is incorporated as an all-in-one product for small-scale control devices.
  • FIG. 4 (a) shows a case where the arithmetic unit of the Bayesian network approximation processing device 1 is connected to the CPU, and the experience records 11 and the input record 10 are received as input from the CPU. The Bayesian network approximation processing device 1 is thus incorporated in the chip of the arithmetic unit.
  • a controller, an SDRAM, and an arithmetic unit of the Bayesian network approximation processor 1 are provided on an expansion board, and the host writes the experience record 11 to the SDRAM through the controller.
  • the controller sets the arithmetic unit, transfers the experience record 11 to the arithmetic unit from the SDRAM, and the arithmetic unit performs Bayesian network approximation processing.
  • the host can perform other processing, and the expansion board has the advantage of being able to operate at high speed independently of the host.
  • in FIG. 4 (c), the method of FIG. 4 (a) is combined with the method of FIG. 4 (b) in order to speed up the configuration of FIG. 4 (a).
  • an SDRAM is connected externally to the arithmetic unit, which is also connected to the CPU.
  • the SDRAM is dedicated to the arithmetic unit and can be accessed from the CPU only through the controller. This also has the advantage that the SDRAM and the arithmetic unit can operate independently of the host.
  • the controller of FIG. 4 (c) is strengthened so as to have a function as a one-chip microcomputer and to control the entire device.
  • as the arithmetic unit, for example, there is the KAC-02A.
  • as the controller, for example, there are ARM's ARM922T, ARM926EJ-S, ARM1026EJ-S, ARM1136J(F)-S, ARM7TDMI, ARM7TDMI-S, SC100, SC200, ARM7EJ-S, ARM946E-S, and ARM966E-S, and Renesas Technology's M32R/E series (M32102S6FP, M32104S6FP, M32121FCAWG, M32121MCB-XXXWG, etc.) and M32R/ECU series (32170 group (M32170F3VFP, M32170F4VFP, M32170F6VFP), 32171 group (M32171F2VFP, M32171F3VFP, M32171F4VFP), 32172 group (M32172F2VFP), 321732 group (M32172F3??, M32174F4VFP), 32176 group (432176-2??, M32176F2VWG, M32176F2TFP, M32176F2TWG, M32176F3VFP, M32176F3VWG, M32176F3TFP, M32176F3TWG, M32176F4VFP, M32176FVWG, M32176F4TFP, M32176F4TWG), 32180 group (M32180F8TFP, M32180F8UFP, M32180F8VFP), 32182 group (M32182F3TFP, M32182F3UFP, M32182F3VFP, M32182F8TFP, M32182F8UFP, M32182F8VFP)).
  • the Bayesian network approximation processing device 1 receives sensitivity data input by the sensitivity data input receiving means 2, and stores it in the data storage means 6 (S100).
  • examples of the sensitivity data whose input is accepted at this time are shown in Fig. 5 (a1) to Fig. 5 (a3).
  • Fig. 5 (a1) and Fig. 5 (a2) show cases where the sensitivity data is set symmetrically, and
  • Fig. 5 (a3) shows an example where the sensitivity data is set asymmetrically.
  • the sensitivity data indicates how much the similarity of a field affects the overall similarity. For example, if the sensitivity data of field A is set to twice that of field B, field A contributes more to the validity, so the similarity of field A is treated as more important than that of field B. By setting sensitivity data in this way, it can be used to emphasize fields whose values fluctuate little or to de-emphasize fields whose values fluctuate greatly. On the other hand, by setting the sensitivity data broadly as shown in Fig. 5 (a2), it is possible to specify that a field counts as valid even if the values are not exactly the same, which is effective for absorbing fluctuations in data containing large noise components. Also, as shown in Fig. 5 (a3), when the sensitivity data is set asymmetrically, it can, for example, be set so that validity is produced when the input record 10 is larger than the experience record 11 side (a brief sketch of such sensitivity matrices follows below).
  • any numerical value (for example, a negative number) may be used here.
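The following C arrays sketch the three shapes of sensitivity data described above for one 4-element field. The particular numbers are assumptions chosen only to illustrate the shapes (symmetric and narrow, symmetric and broad, asymmetric); they are not the values of FIG. 5.

```c
#define NI 4

/* Symmetric, narrow: validity is produced only when the input record and the
   experience record fall in the same element (cf. Fig. 5 (a1)). */
static const int sens_narrow[NI][NI] = {
    {100,   0,   0,   0},
    {  0, 100,   0,   0},
    {  0,   0, 100,   0},
    {  0,   0,   0, 100},
};

/* Symmetric, broad: neighbouring elements also contribute, which helps absorb
   fluctuations in noisy data (cf. Fig. 5 (a2)). */
static const int sens_broad[NI][NI] = {
    {100,  50,   0,   0},
    { 50, 100,  50,   0},
    {  0,  50, 100,  50},
    {  0,   0,  50, 100},
};

/* Asymmetric: with rows indexed by the input record 10 and columns by the
   experience record 11, this lower-triangular shape produces validity mainly
   when the input record side is larger (cf. Fig. 5 (a3)). */
static const int sens_asym[NI][NI] = {
    {100,   0,   0,   0},
    { 50, 100,   0,   0},
    {  0,  50, 100,   0},
    {  0,   0,  50, 100},
};
```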
  • the input of the validity reference data for classifying the experience records 11 is received by the validity reference data input receiving means 3, and is stored in the data storage means 6 (S110).
  • An example of the effectiveness reference data is shown in Fig. 5 (b).
  • Either of the input of the sensitivity data and the input of the validity reference data in S100 and S110 may be performed first.
  • after receiving the input of the sensitivity data and the validity reference data, the Bayesian network approximation processing device 1 accepts the input record 10 acquired from outside the device, for example from a sensor, by the input record input receiving means 4 and stores it in the data storage means 6 (S120).
  • FIG. 5 (c) shows an example of the input record 10
  • FIG. 5 (d) shows an example of the experience record 11.
  • these fields indicate, for example, that "the probability of the value 1 is 70% for temperature, 10% for humidity, and 10% for wind speed",
  • "the probability of the value 2 is 10% for temperature, 20% for humidity, and 0% for wind speed", and "the probability of the value 3 is ¾% for temperature, 70% for humidity, and 0% for wind speed".
  • the experience record input receiving means 5 acquires the experience records 11 one by one (S130) and sends the acquired experience records 11 to the validity calculating means 7 one by one. Then, the validity calculating means 7 acquires the input record 10 and the sensitivity data stored in the data storage means 6 and calculates the validity of each experience record 11.
  • the validity classifying means 8 classifies the validity of each experience record 11 calculated with Equation 1 for each item of validity reference data received in S110, and
  • sets the value of the output value field in the experience record 11 corresponding to the highest validity reference data as the output data 12 (S150).
  • if a plurality of experience records 11 are classified there, the average value of the output value fields of the classified experience records 11 is calculated and used as the output data 12.
  • the output data 12 processed by the effectiveness classifying means 8 in S150 is sent to the output means 9 and the output means 9 outputs the output data 12 as an output value (S160).
  • the processing flow of the Bayesian network approximation processing device 1 will be described using a specific example. It is assumed that a failure prediction is performed as a model to be processed by the Bayesian network approximation processing device 1, and that two input values are an oil temperature and an oil amount. That is, there are two input records 10 of oil temperature and oil quantity, and the output is output data 12 that is useful for failure prediction.
  • as shown in Fig. 6 (a), the oil temperature field is a field that takes the value 0 up to 20 °C, the value 1 at 20 °C-40 °C, the value 2 at 40 °C-60 °C, and the value 3 at 60 °C and above.
  • the oil amount field is a field that takes the value 1 for 600 ml-800 ml and the value 2 for 400 ml-600 ml, with the remaining ranges taking the remaining values.
  • here the values are assigned in order of magnitude, but they may be assigned arbitrarily.
  • the failure prediction field is a field that takes "the value 0 for normal operation", "the value 1 for caution", "the value 2 for inspection required", and "the value 3 for emergency stop".
  • FIGS. 7 (a) to 7 (e) show examples of the experience record 11.
  • Figure 7 (a) is a sample of the experience record 11, consisting of the input value fields and the output value field, for the case where the event is normal.
  • Figure 7 (b) is sample 1 of the experience record 11, consisting of the input value fields and the output value field, for the case where the event requires caution.
  • Fig. 7 (c) is sample 2 of the experience record 11 for the case where the event requires caution.
  • Fig. 7 (d) is a sample of the experience record 11 for the case where the event requires inspection.
  • the Bayesian network approximation processing device 1 receives the sensitivity data of the oil temperature field and the sensitivity data of the oil amount field shown in FIG. 8 by the sensitivity data input receiving means 2 and stores them in the data storage means 6 (S100).
  • the validity reference data input receiving means 3 of the Bayesian network approximation processing device 1 receives the input of the validity reference data having the data shown in FIG. 9, and stores it in the data storage means 6 (S 110).
  • after accepting the input of the sensitivity data and the validity reference data, the Bayesian network approximation processing device 1, which in this embodiment predicts a failure of the device from the oil temperature and the oil amount, receives the input record 10 shown in Fig. 10, obtained from an oil temperature sensor that detects the oil temperature and an oil level sensor that detects the oil amount, by the input record input receiving means 4 and stores it in the data storage means 6 (S120).
  • the experience record input receiving means 5 acquires each experience record 11 one by one (S 130), and sends the acquired experience records 11 to the effectiveness calculating means 7 one by one.
  • the validity calculating means 7 acquires the input record 10 and the sensitivity data stored in the data storage means 6.
  • the validity calculating means 7 calculates, for the input record 10 and each experience record 11, the sum of the multiplications weighted by the sensitivity data given for each field as the respective validity (that is, it calculates Equation 1 for each experience record 11) (S140).
  • the validity classifying means 8 classifies the validity of each experience record 11 for each item of validity reference data based on the validity calculated by the validity calculating means 7, and sets the value of the output value field in the experience record 11 classified into the high (preferably the highest) validity reference data as the output data 12 output by the output means 9 (S150). If there are a plurality of experience records 11 classified into the highest validity reference data, the average value of the output value fields of the experience records 11 classified there is used as the output data 12.
  • the processing of S140 and S150 can be expressed in C language as shown in FIG. 24; a hedged sketch of such code is also given below.
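FIG. 24 itself is not reproduced in this text. The following C sketch is only a guess at what the processing of S140 (validity calculation) and S150 (classification and output selection) could look like; all names, the data layout, and the simplified selection rule (taking the record with the largest validity rather than classifying against the validity reference data) are assumptions for illustration and are not taken from FIG. 24.

```c
#define NF   2   /* fields per record (oil temperature, oil amount) */
#define NI   4   /* elements per field */
#define NREC 5   /* number of experience records (11A to 11E) */
#define NOUT 4   /* elements of the output value field (failure prediction) */

/* S140: validity of one experience record, i.e. the sensitivity-weighted sum
   of multiplications of the input record and the experience record. */
long calc_validity(const int input[NF][NI],
                   const int sensitivity[NF][NI][NI],
                   const int experience_in[NF][NI])
{
    long validity = 0;
    for (int f = 0; f < NF; f++)
        for (int i = 0; i < NI; i++)
            for (int j = 0; j < NI; j++)
                validity += (long)input[f][i] * sensitivity[f][i][j]
                                              * experience_in[f][j];
    return validity;
}

/* S150 (simplified): pick the experience record with the highest validity and
   copy its output value field into the output data. */
void classify_and_output(const long validity[NREC],
                         const int experience_out[NREC][NOUT],
                         int output_data[NOUT])
{
    int best = 0;
    for (int r = 1; r < NREC; r++)
        if (validity[r] > validity[best])
            best = r;
    for (int k = 0; k < NOUT; k++)
        output_data[k] = experience_out[best][k];
}
```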
  • the input record 10 is compared with the experience record 11A, and the validity is calculated by the validity calculating means.
  • the validity of the oil temperature field and the validity of the oil quantity field can be calculated as shown in Fig. 11.
  • in this case, the validity of the oil temperature field is 680,000 and the validity of the oil amount field is 300,000, so the validity of the comparison with the experience record 11A is 980,000.
  • the validity of the experience record 11A is stored in the data storage means 6.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
  • in this case, the validity of the oil temperature field is 680,000 and the validity of the oil amount field is 50,000, so the validity of the comparison with the experience record 11B is 730,000.
  • the validity of the experience record 11B is stored in the data storage means 6.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in the corresponding figure. In this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 300,000, so the validity of the comparison with the experience record 11C is 620,000.
  • the validity of the experience record 11C is stored in the data storage means 6.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
  • in this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 50,000, so the validity of the comparison with the experience record 11D is 370,000.
  • the validity of the experience record 11D is stored in the data storage means 6.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
  • in this case, the validity of the oil temperature field is 40,000 and the validity of the oil amount field is 300,000, so the validity of the comparison with the experience record 11E is 340,000.
  • the validity of the experience record 11E is stored in the data storage means 6.
  • the validity classifying means 8 extracts the validities of the experience records 11A to 11E and the validity reference data accepted in S110 from the data storage means 6, classifies them for each item of validity reference data, and sends the value of the output value field in the experience record 11 classified into the highest validity reference data to the output means 9 as the output data 12 (S150). In this case, since only one experience record 11 is classified into the highest validity reference data, it is not necessary to calculate the average value of the output value field.
  • the validity of the experience record 11A is 980,000, which corresponds to the value 2 (800,001-1,000,000) of the validity reference data.
  • FIG. 16 schematically shows the data classified for each item of validity reference data in this way. Since the experience record 11A with the reference value "2" was classified into the highest validity reference data, the value of the output value field (the failure prediction field) in this experience record 11A (the experience record 11 of FIG. 7 (a)) is referred to, and it is "100, 0, 0, 0". Since this is the only experience record 11 classified into the reference value "2", the value of the output data 12 is also "100, 0, 0, 0". Therefore, the validity classifying means 8 sends "100, 0, 0, 0" to the output means 9 as the output data 12 to be output by the output means 9.
  • the experience records, sensitivity data, and validity reference data are the same as those in Example 2 (that is, the experience records are those of FIG. 7, the sensitivity data is that of FIG. 8, and the validity reference data is that of FIG. 9).
  • the input record 10 received by the input record input receiving means 4 (S120) and stored in the data storage means 6 is as shown in the corresponding figure. Note that S100 and S110 are the same as in the second embodiment.
  • the experience record input receiving means 5 acquires each experience record 11 one by one (S130), and sends the acquired experience records 11 one by one to the effectiveness calculation means 7. Send out.
  • the validity calculation means 7 acquires the input record 10 and the sensitivity data stored in the data storage means 6.
  • the validity calculating means 7 calculates, for the input record 10 and each experience record 11, the sum of the multiplications weighted by the sensitivity data given for each field as the respective validity for each experience record 11 (that is, it calculates Equation 1) (S140).
  • the effectiveness classifying means 8 classifies the effectiveness of each of the experience records 11 for each of the effectiveness reference data based on the effectiveness calculated by the effectiveness calculation means 7, and outputs the highest effectiveness reference data.
  • the value of the output value field in the experience record 11 classified into the data is used as the output data output by the output means 9 (S150).
  • when there are a plurality of experience records 11 classified into the highest validity reference data, the average value of the output value fields in the experience records 11 classified there is calculated and used as the output data 12.
  • the input record 10 is compared with the experience record 11A, and the validity is calculated by the validity calculating means 7.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in Fig. 18.
  • in this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 20,000, so the validity of the comparison with the experience record 11A is 340,000.
  • the validity of the experience record 11A is stored in the data storage means 6.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
  • in this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 340,000, so the validity of the comparison with the experience record 11B is 660,000.
  • the validity of the experience record 11B is stored in the data storage means 6.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
  • in this case, the validity of the oil temperature field is 680,000 and the validity of the oil amount field is 160,000, so the validity of the comparison with the experience record 11C is 840,000.
  • the validity of the experience record 11C is stored in the data storage means 6.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
  • in this case, the validity of the oil temperature field is 680,000 and the validity of the oil amount field is 340,000, so the validity of the comparison with the experience record 11D is 1,020,000.
  • the validity of the experience record 11D is stored in the data storage means 6.
  • the validity of the oil temperature field and the validity of the oil amount field can be calculated as shown in FIG.
  • in this case, the validity of the oil temperature field is 320,000 and the validity of the oil amount field is 20,000, so the validity of the comparison with the experience record 11E is 340,000.
  • the validity of the experience record 11E is stored in the data storage means 6.
  • the validity classifying means 8 extracts the validities of the experience records 11A to 11E and the validity reference data received in S110 from the data storage means 6, classifies them for each item of validity reference data, and sends the value of the output value field in the experience record 11 classified into the highest validity reference data to the output means 9 as the output data 12 (S150). In this case, since only one experience record 11 is classified into the highest validity reference data, it is not necessary to calculate the average value of the output value field.
  • the validity of the experience record 11A is 340,000, which corresponds to the value 5 (0-200,000) of the validity reference data.
  • FIG. 23 schematically shows the data classified for each item of validity reference data in this way. Since the experience record 11D with the reference value "1" was classified into the highest validity reference data, the value of the output value field (the failure prediction field) in this experience record 11D (the experience record 11 of FIG. 7 (d)) is referred to, and it is "0, 0, 100, 0". Since this is the only experience record 11 classified into the reference value "1", the value of the output data 12 is also "0, 0, 100, 0". Therefore, the validity classifying means 8 sends "0, 0, 100, 0" to the output means 9 as the output data to be output by the output means 9.
  • when the output means 9 receives the output data 12 "0, 0, 100, 0" from the validity classifying means 8, it outputs it. From this output result, it can be determined that the device currently requires inspection.
  • alternatively, the validity classifying means 8 may classify the validities calculated by the validity calculating means 7 for each item of validity reference data and send, as the output data 12, the value of the output value field in the experience record 11 having the highest validity among the experience records 11 classified into the highest validity reference data to the output means 9.
  • the validity classifying means 8 may also classify the validities calculated by the validity calculating means 7 for each item of validity reference data and send, as the output data 12, the value of the output value field in an arbitrary experience record 11 among the plurality of experience records 11 belonging to the highest validity reference data to the output means 9.
  • the validity classifying means 8 may also classify the validities calculated by the validity calculating means 7 for each item of validity reference data and send, as the output data 12, the average value of the output value fields in the predetermined top n experience records 11 among the plurality of experience records 11 belonging to the highest validity reference data to the output means 9.
  • FIG. 25 shows a case where the Bayesian network approximation processing device 1 is not provided with the validity reference data input receiving means 3.
  • processing other than that of the validity classifying means 8 is the same as in the first to fourth embodiments, and the processing of S110 in FIG. 2 becomes unnecessary.
  • in this case, the validity classifying means 8 sorts the validities calculated by the validity calculating means 7 and
  • uses the value of the output value field in the experience record 11 with the highest validity as the output data 12.
  • instead of using the value of the output value field of the experience record 11 with the highest validity, the validity classifying means 8 may use the average value of the output value fields of the experience records 11 whose validities correspond to the predetermined top n,
  • may use the value of the output value field of any one of those top n experience records 11 as the output data 12, or, when a plurality of output data 12 are to be produced, may use the values of the output value fields of the top n experience records 11 as the output data 12.
  • as Example 6, when generating the output data 12 in the first to fifth embodiments, the output value field of each selected experience record 11 may further be weighted according to the validity of that experience record 11, and the weighted average may be used as the output data 12.
  • specifically, where xo[r] is the output value field of the r-th selected experience record 11 and val[r] is its validity, the validity classifying means 8 calculates Σ(xo[r] × val[r]) ÷ Σ(val[r]) and sets it as the output data 12 (a brief sketch follows below). Then, the validity classifying means 8 transmits the output data 12 to the output means 9, and the output means 9 outputs the output data 12.
  • the weighted average may be calculated using the square of the effectiveness rather than the effectiveness itself. As a result, it is possible to obtain the output data 12 which emphasizes the experience records 11 with particularly high effectiveness.
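A minimal C sketch of this validity-weighted averaging follows, assuming xo[r] is the output value field of the r-th selected experience record 11 and val[r] its validity; the names come from the formula in the text, while everything else (array sizes, integer arithmetic) is an illustrative assumption.

```c
#define NSEL 3   /* number of selected experience records */
#define NOUT 4   /* elements of the output value field */

/* output[k] = sum_r(xo[r][k] * val[r]) / sum_r(val[r]).
   To emphasise records with especially high validity, val[r] could be
   replaced by val[r] * val[r] in both sums, as suggested in the text. */
void weighted_output(const int xo[NSEL][NOUT], const long val[NSEL],
                     int output[NOUT])
{
    long denom = 0;
    for (int r = 0; r < NSEL; r++)
        denom += val[r];
    if (denom == 0)
        return;                 /* nothing selected; leave output unchanged */

    for (int k = 0; k < NOUT; k++) {
        long num = 0;
        for (int r = 0; r < NSEL; r++)
            num += (long)xo[r][k] * val[r];
        output[k] = (int)(num / denom);
    }
}
```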
  • a storage medium storing a program of software for realizing the functions of the present embodiment is supplied to the system, and a computer of the system reads and executes the program stored in the storage medium.
  • the program itself read from the storage medium implements the functions of the above-described embodiments, and the storage medium storing the program naturally constitutes the present invention.
  • each function of the Bayesian network approximation processing device 1 of the present invention may be written and operated in a programmable LSI.
  • each function of the Bayesian network approximation processing device 1 may be described in a circuit design language such as HDL, written to a programmable logic device called an FPGA (Field Programmable Gate Array), and the calculations performed there.
  • As a storage medium for supplying the program for example, a magnetic disk, a hard disk, an optical disk, a magneto-optical disk, a magnetic tape, a nonvolatile memory card, and the like can be used.
  • the program may be stored not only in a storage medium but also obtained by downloading from a network.
  • the functions of the above-described embodiments are not only realized by the computer executing the readout program, but the operating system or the like running on the computer performs the actual processing based on the instructions of the program. It goes without saying that some or all of the above may be performed, and the processing may realize the functions of the above-described embodiments.
  • SDRAM (Synchronous DRAM)
  • since this method conforms to the burst-transfer scheme of transferring multiple data at once by specifying a single address, the effect of high-speed processing is obtained.
  • SDRAM that performs burst transfer
  • when an address is sent to the memory, data are returned continuously after a fixed latency, and for consecutive addresses the second and subsequent data can be received without waiting time, enabling high-speed processing. Since the present invention is designed so that the operation is performed in synchronization with the burst transfer, the calculation is carried out on the spot each time one data item is input and completes in only the time required to read the memory, so high-speed processing can be realized.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Pure & Applied Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Algebra (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Complex Calculations (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

It is possible to provide a Bayesian network approximation device for performing a Bayesian network approximation. The Bayesian network approximation device includes: sensitivity data input reception means for receiving input of sensitivity data; validity reference data input reception means for receiving input of validity reference data; input record input reception means for receiving input of an input record; experience record input reception means for acquiring an experience record; calculation means for calculating, as validity for each experience record, the total of multiplications performed by weighting the input record and experience record with the sensitivity data given for each of their fields, classifying it according to the validity reference data, and outputting a value of output value field in the experience record classified into a high validity reference; and output means for receiving the output data from the calculation means and outputting it.

Description

明 細 書  Specification
ベイジアンネットワーク近似処理装置  Bayesian network approximation processor
技術分野  Technical field
[0001] 本発明は、ベイジアンネットワークの近似処理を行うベイジアンネットワーク近似処 理装置に関する。  The present invention relates to a Bayesian network approximation processing device that performs Bayesian network approximation processing.
背景技術  Background art
[0002] 機械、装置、システム等に発生する様々な事象に対する処理を行う為に、ベイジァ ンネットワークと呼ばれる処理方法を用いることがある。ベイジアンネットワークとは、 確率変数をノードで表し、因果関係や相関関係のような依存関係を示す変数の間に リンクを張ったグラフ構造による確率モデルであって、このリンクが因果関係の方向に 有向性を有し、そのリンクを迪つたパスが循環しない非循環有向グラフで表されるモ デルである(ベイジアンネットワークは下記の非特許文献 1に詳しい)。  [0002] A processing method called a Bayesian network may be used to perform processing for various events that occur in a machine, an apparatus, a system, or the like. A Bayesian network is a probabilistic model with a graph structure in which random variables are represented by nodes and variables that show dependencies such as causal relationships and correlations are linked to each other. This is a model represented by an acyclic directed graph that has directionality and does not circulate the path through the link (a Bayesian network is described in Non-Patent Document 1 below).
[0003] そして、下記特許文献 1及び特許文献 2には、ベイジアンネットワークを自動診断シ ステムに用いた発明が開示されている。  [0003] Patent Documents 1 and 2 below disclose inventions using a Bayesian network for an automatic diagnosis system.
[0004] 特許文献 1 :特開 2001— 75808号公報  [0004] Patent Document 1: JP 2001-75808 A
特許文献 2 :特開 2002 - 318691号公報  Patent Document 2: JP-A-2002-318691
[0005] 非特許文献 1:本村陽一、 "確率ネットワークと知識情報処理への応用"、 [online],平 成 13年 1月 24日、インターネットく URL:  [0005] Non-Patent Document 1: Yoichi Motomura, "Probability Network and Its Application to Knowledge Information Processing", [online], January 24, 2001, Internet URL:
http:// staif.aist.go.Jp/y.motomura/DS/DS.html/ >  http: // staif.aist.go.Jp/y.motomura/DS/DS.html/>
Disclosure of the Invention

Problems to Be Solved by the Invention

[0006] In processing by the Bayesian network described above, a table based on experience (a table of correspondences between the input values to the system and the output values for various past events) is prepared in advance, and when a certain input value is received, the output value corresponding to that event is looked up in the table and output. Consequently, the input values to the system for an event and the output values for every possible case (that is, all output values of all Bayes functions) must be stored in the table beforehand.

[0007] For example, if ten events are assumed to occur in the system and four outcomes are assumed for each event (four values, ten inputs), the table requires 4^10 entries (approximately one million entries).

[0008] Preparing such a table in advance is itself very difficult, and even if such a table could be created, its data volume would be so large that enormous hardware resources would be required. There is therefore a problem that processing using a Bayesian network cannot be performed where only small-scale hardware resources are available, such as in the ubiquitous computing field.

[0009] It is also difficult to update such a table frequently.
Means for Solving the Problems

[0010] The present inventor therefore invented a Bayesian network approximation processing device that performs Bayesian network approximation processing by calculating, as a validity, the similarity between a record consisting of input values (an input record) and records that combine, for various past events, the input value fields to the system with the appropriate corresponding output value fields (experience records), classifying the experience records by validity, and outputting the output value fields of the experience records with high validity as the output values.

[0011] Furthermore, to solve the second problem of using a Bayesian network, the Bayesian network approximation processing device is implemented in hardware, which increases its processing speed and makes processing approximating a Bayesian network possible even with small-scale hardware resources.

[0012] The invention of claim 1 is a Bayesian network approximation processing device that performs Bayesian network approximation processing, comprising: sensitivity data input reception means for receiving input of sensitivity data, which is data giving a weight to each field; validity reference data input reception means for receiving input of validity reference data; input record input reception means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input reception means for acquiring preset experience records each consisting of input value fields to the Bayesian network approximation processing device and a corresponding output value field; validity calculation means for calculating, for each experience record, the sum of the products of the input record and that experience record weighted by the sensitivity data given for each of their fields, as the validity of that experience record; validity classification means for classifying the experience records by validity on the basis of the validity reference data and calculating output data from the values of the output value fields of the classified experience records; and output means for receiving the output data from the validity classification means and outputting it.
[0013] The invention of claim 2 is a Bayesian network approximation processing device that performs Bayesian network approximation processing, comprising: sensitivity data input reception means for receiving input of sensitivity data, which is data giving a weight to each field; validity reference data input reception means for receiving input of validity reference data; input record input reception means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input reception means for acquiring preset experience records each consisting of input value fields to the Bayesian network approximation processing device and a corresponding output value field; validity calculation means for calculating the validity of each experience record on the basis of Equation 1, where NF is the number of input value fields in the experience record and NI is the number of elements of an input value field; validity classification means for classifying the experience records by validity on the basis of the validity reference data and calculating output data from the values of the output value fields of the classified experience records; and output means for receiving the output data from the validity classification means and outputting it.

[0014] In the invention of claim 9, the Bayesian network approximation processing device has data storage means for storing at least the sensitivity data, the validity reference data, and the input record; the validity calculation means stores the calculated validity of each experience record in the data storage means; and, when classifying the experience records for each item of validity reference data, the validity classification means performs the classification after acquiring the validities stored in the data storage means.

[0015] With these inventions, a validity is calculated for each experience record, so experience records are selected automatically and output data can be produced from them without storing in advance all values of all Bayes functions constituting the Bayesian network as in the prior art. This reduces the number of output values for past events that must be held, so processing equivalent to a Bayesian network becomes possible even with smaller resources than before.
[0016] The invention of claim 10 is a Bayesian network approximation processing device that performs Bayesian network approximation processing, comprising: sensitivity data input reception means for receiving input of sensitivity data, which is data giving a weight to each field; input record input reception means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input reception means for acquiring preset experience records each consisting of input value fields to the Bayesian network approximation processing device and a corresponding output value field; validity calculation means for calculating, for each experience record, the sum of the products of the input record and that experience record weighted by the sensitivity data given for each of their fields, as the validity of that experience record; validity classification means for classifying the experience records by validity on the basis of a predetermined criterion and calculating output data from the values of the output value fields of the classified experience records; and output means for receiving the output data from the validity classification means and outputting it.

[0017] The invention of claim 11 is a Bayesian network approximation processing device that performs Bayesian network approximation processing, comprising: sensitivity data input reception means for receiving input of sensitivity data, which is data giving a weight to each field; input record input reception means for receiving input of an input record acquired outside the Bayesian network approximation processing device; experience record input reception means for acquiring preset experience records each consisting of input value fields to the Bayesian network approximation processing device and a corresponding output value field; validity calculation means for calculating the validity of each experience record on the basis of Equation 1, where NF is the number of input value fields in the experience record and NI is the number of elements of an input value field; validity classification means for classifying the experience records by validity on the basis of a predetermined criterion and calculating output data from the values of the output value fields of the classified experience records; and output means for receiving the output data from the validity classification means and outputting it.

[0018] In the invention of claim 17, the Bayesian network approximation processing device has data storage means for storing at least the sensitivity data and the input record; the validity calculation means stores the calculated validity of each experience record in the data storage means; and, when classifying the experience records, the validity classification means performs the classification after acquiring the validities stored in the data storage means.

[0019] With these inventions as well, a validity is calculated for each experience record, so experience records are selected automatically and output data can be produced from them without storing in advance all values of all Bayes functions constituting the Bayesian network as in the prior art. This reduces the number of output values for past events that must be held, so processing equivalent to a Bayesian network becomes possible even with smaller resources than before. Moreover, unlike claims 1 and 2, there is no need to use validity reference data.
[0020] Various methods can be used when the validity classification means classifies the validities.

[0021] For example, the average of the output value fields of the experience records whose validity calculated by the validity calculation means is higher than the validity reference data can be used as the output data.

[0022] Alternatively, the average of the output value fields of the experience records whose validity calculated by the validity calculation means is higher than the validity reference data and which fall within a predetermined top n may be used as the output data.

[0023] Further, the experience records may be classified for each item of validity reference data on the basis of the validity calculated by the validity calculation means, and the values of the output value fields of the experience records classified under the highest validity criterion may be used as the output data.

[0024] In this case, as described in claim 6, when a plurality of experience records are classified under the highest validity reference data, the average of the output value fields of those experience records may be used as the output data; as described in claim 7, when a plurality of experience records are classified under the highest validity reference data, the value of the output value field of the experience record with the highest validity among them may be used as the output data; or, as described in claim 8, when a plurality of experience records are classified under the highest validity reference data, the value of the output value field of an arbitrary experience record among them may be used as the output data.

[0025] When validity reference data is not used (claims 10 and 11), the validities can likewise be classified by various methods.

[0026] For example, the validities may be sorted and the value of the output value field of the experience record with the highest validity used as the output data.

[0027] Alternatively, the validities may be sorted and the average of the output value fields of the experience records falling within a predetermined top n used as the output data.

[0028] Further, the validities may be sorted and the values of the output value fields of the experience records falling within a predetermined top n each used as output data.

[0029] In addition, the value of the output value field of an arbitrary experience record among the experience records falling within a predetermined top n may be used as the output data.

[0030] Alternatively, the output value fields of the experience records selected as in claims 12 to 15 may be weighted by the validities of those experience records, and the weighted average used as the output data.
[0031] The Bayesian network approximation processing device of the present invention is preferably implemented in hardware. In that case, the probability distributions constituting the fields are better represented as integer arrays than as decimal arrays, since integer arrays suit hardware processing.

[0032] When the Bayesian network approximation processing device is implemented as hardware, it is preferably incorporated, as described in claims 19 to 23, directly as a circuit in a control chip or the like, in an external coprocessor attached to an embedded CPU, in the hardware of an expansion board, in a mobile terminal as a microcontroller, or as an all-in-one product for small-scale control devices.

Effects of the Invention

[0033] The present invention makes it unnecessary to create the table containing every value of every event that was previously indispensable for a Bayesian network. Because the present invention extracts similar experience records from a table (the experience records) holding only some of the values, it requires fewer resources than before. Bayesian network approximation processing therefore becomes possible even with small-scale hardware resources.

[0034] Conventionally it was also difficult to reflect the latest experience immediately because table generation takes time; since the present invention uses no such table, the latest experience can be reflected immediately.

[0035] Furthermore, when implemented in hardware, the device is suited to SDRAM (Synchronous DRAM) burst transfer (a method of transferring a series of data items with a single addressing operation), so a high-speed processing effect is obtained. That is, with an SDRAM that performs burst transfer, once an address is sent to the memory, data is returned continuously after a fixed delay, and for consecutive addresses the second and subsequent data items are received without waiting, enabling high-speed processing. Because the present invention is designed to perform its calculation in synchronization with the burst transfer, a calculation is performed on the spot each time one data item arrives, and the computation completes in only the time needed to read the memory, so high-speed processing can be realized.
Brief Description of the Drawings

[0036] [FIG. 1] A system configuration diagram showing an example of the system configuration of the Bayesian network approximation processing device.
[FIG. 2] A flowchart showing an example of the processing flow of the Bayesian network approximation processing device.
[FIG. 3] A conceptual diagram showing the concept of the processing of the Bayesian network approximation processing device.
[FIG. 4] Conceptual diagrams of cases where the Bayesian network approximation processing device is incorporated in hardware.
[FIG. 5] A diagram showing examples of validity reference data, sensitivity data, an input record, and an experience record.
[FIG. 6] An example of the oil temperature and oil amount fields.
[FIG. 7] An example of experience records.
[FIG. 8] An example of sensitivity data.
[FIG. 9] An example of validity reference data.
[FIG. 10] The input record in the case of Example 2.
[FIG. 11] A table showing the validity of experience record 11A in Example 2.
[FIG. 12] A table showing the validity of experience record 11B in Example 2.
[FIG. 13] A table showing the validity of experience record 11C in Example 2.
[FIG. 14] A table showing the validity of experience record 11D in Example 2.
[FIG. 15] A table showing the validity of experience record 11E in Example 2.
[FIG. 16] A table showing the output value field of each experience record classified by validity reference data, and their average values, in Example 2.
[FIG. 17] The input record in the case of Example 3.
[FIG. 18] A table showing the validity of experience record 11A in Example 3.
[FIG. 19] A table showing the validity of experience record 11B in Example 3.
[FIG. 20] A table showing the validity of experience record 11C in Example 3.
[FIG. 21] A table showing the validity of experience record 11D in Example 3.
[FIG. 22] A table showing the validity of experience record 11E in Example 3.
[FIG. 23] A table showing the output value field of each experience record classified by validity reference data, and their average values, in Example 3.
[FIG. 24] An example in which S140 and S150 are written in the C language.
[FIG. 25] An example of another system configuration of the Bayesian network approximation processing device.
[FIG. 26] A diagram showing the concept of high-speed processing when the Bayesian network approximation processing device is implemented in hardware.
Explanation of Reference Numerals

1: Bayesian network approximation processing device
2: Sensitivity data input reception means
3: Validity reference data input reception means
4: Input record input reception means
5: Experience record input reception means
6: Data storage means
7: Validity calculation means
8: Validity classification means
9: Output means
10: Input record
11: Experience record
12: Output data
Best Mode for Carrying Out the Invention

[0038] An example of the system configuration of the Bayesian network approximation processing device 1 of the present invention will be described with reference to the system configuration diagram of FIG. 1.

[0039] The Bayesian network approximation processing device 1 comprises sensitivity data input reception means 2, validity reference data input reception means 3, input record input reception means 4, experience record input reception means 5, data storage means 6, validity calculation means 7, validity classification means 8, and output means 9.

[0040] Based on an input record 10 (described later) acquired from an external sensor or the like and preset experience records 11 (described later), the Bayesian network approximation processing device 1 outputs the result of its processing as output data 12 and thereby controls, for example, motors and various control devices.
[0041] A field is probability distribution data corresponding to one variable. For example, the input value from a temperature sensor or the output value to a motor expressed as a probability distribution is a field. In this embodiment of the Bayesian network approximation processing device 1, the probability distribution is an integer array in which the probability of each of a number of representative values is given as an integer. For example, as shown in FIG. 5(c), if for the input value from a temperature sensor the probability of "below 20°C" is 20%, of "20°C to 30°C" is 70%, of "30°C to 40°C" is 10%, and of "40°C and above" is 0%, then the input value field from the temperature sensor is the integer array {20, 70, 10, 0}.

[0042] Since this array is in essence a probability distribution, it would normally be written with decimals ({0.2, 0.7, 0.1, 0.0} for the integer array above); however, integer processing is more effective when the Bayesian network approximation processing device 1 is provided as a circuit in hardware such as a control chip, so the probabilities are multiplied by 100 and used as the array elements. A decimal array may also be used instead of an integer array multiplied by 100.
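Purely as an illustration of the integer-array representation described above, a field such as the temperature input of FIG. 5(c) could be held in C as follows; the names NI, Field, and temperature_input are assumptions introduced here and do not appear in the patent.

    /* Sketch only: a field as an integer probability array with NI = 4
     * representative values; probabilities are stored x100 so that only
     * integer arithmetic is needed, e.g. {0.2, 0.7, 0.1, 0.0} -> {20, 70, 10, 0}. */
    #define NI 4                      /* number of elements per field */

    typedef int Field[NI];            /* probability distribution x100 */

    static const Field temperature_input = { 20, 70, 10, 0 };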
[0043] A record is a collection of fields corresponding to one event. For example, the "input value field from the temperature sensor", the "input value field from the humidity sensor", the "input value field from the occupancy sensor", the "output value field to the heater", and so on at a certain time are handled together as one record. Records composed of the same fields can be compared with each other; in the Bayesian network approximation processing device 1 of the present invention, the current input values to the device form the input record 10 and past events form the experience records 11, and an appropriate experience record 11 is selected by comparing these records.

[0044] The output data 12 is the field that the Bayesian network approximation processing device 1 outputs to the outside. The function of the Bayesian network approximation processing device 1 is therefore to select an appropriate experience record 11 in light of the past experience records 11 (described later), calculate the output data 12 from it, and output the output data 12.

[0045] The input record 10 is a record, acquired from a sensor or the like, representing the current input to the Bayesian network approximation processing device 1.

[0046] The experience records 11 are preset records that combine, for various past events, the input value fields to the Bayesian network approximation processing device 1 with the appropriate corresponding output value fields. The Bayesian network approximation processing device 1 selects an appropriate experience record 11 by comparing a plurality of experience records 11 with the input record 10 and obtains its output value field. The larger the number of experience records 11, therefore, the more accurate the output can be.
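Continuing the illustrative sketch above, the input record 10 and an experience record 11 could be represented as plain structures; the number of input value fields NF, and all names, are assumptions for illustration only, not definitions taken from the patent.

    /* Sketch only: an input record groups NF input value fields, and an
     * experience record pairs past input value fields with the output
     * value field that was appropriate for that event. */
    #define NF 2                      /* number of input value fields (assumed) */

    typedef struct {
        Field in[NF];                 /* current input value fields */
    } InputRecord;

    typedef struct {
        Field in[NF];                 /* past input value fields */
        Field out;                    /* appropriate output value field */
    } ExperienceRecord;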
[0047] The sensitivity data input reception means 2 is means for receiving (that is, accepting as input; the same applies hereinafter) sensitivity data into the Bayesian network approximation processing device 1 and storing it in the data storage means 6. Sensitivity data specifies how much weight each field is given when records are compared. For each field, the sensitivity data has a number of elements equal to the number of elements of the array expressing the probability distribution of the corresponding field of the input record 10 multiplied by the number of elements of the array expressing the probability distribution of the corresponding field of the experience record 11. For example, if an input value field is a four-element array such as {20, 70, 10, 0} and the corresponding field of the experience record is likewise a four-element array, the sensitivity data for that field consists of 16 elements (4 x 4). Sensitivity data is input for each field.

[0048] The validity reference data input reception means 3 is means for receiving validity reference data into the Bayesian network approximation processing device 1 and storing it in the data storage means 6. Validity reference data is a numerical value specifying how high a validity the output data 12 must be based on. Accordingly, the output value fields of experience records 11 whose validity is higher than the validity reference data are obtained as the output data 12. When a plurality of validity reference data are specified, the experience records 11 are classified by validity and an output value field is obtained for each class.

[0049] The validity is a numerical value indicating the degree of similarity between the input record 10 and an experience record 11. By comparing the input record 10 with an experience record 11 on the basis of the sensitivity data, how closely that experience record 11 resembles the input record 10 can be expressed as its validity. The Bayesian network approximation processing device 1 collects the output value fields of the experience records 11 with high validity and calculates the output value.

[0050] When a field of the input record 10 has NI elements, the field of the input record 10 is a 1 x NI array, the corresponding field of the experience record 11 is an NI x 1 array, and the sensitivity data for the field is an NI x NI array, so the validity is calculated by Equation 1. This calculation is performed by the validity calculation means 7; the calculation of the validity using Equation 1 is described later.
[Equation 1]

Validity = Σ_{f=1..NF} Σ_{i=1..NI} Σ_{j=1..NI} (probability value of element i of field f of the input record) × (probability value of element j of field f of the experience record) × (sensitivity data element (i, j) of field f)

where NF is the number of fields and NI is the number of elements of a field.
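As a concrete reading of Equation 1, the following C sketch computes the validity of one experience record against the input record; it builds on the illustrative Field, InputRecord, and ExperienceRecord definitions above and assumes an NI x NI integer sensitivity matrix per field. It is an illustration only, not the code of FIG. 24.

    /* Sketch of Equation 1: validity = sum over fields f and elements i, j of
     * (input probability i) x (experience probability j) x (sensitivity (i, j)). */
    typedef int Sensitivity[NI][NI];

    long validity(const InputRecord *in,
                  const ExperienceRecord *er,
                  const Sensitivity sens[NF])
    {
        long v = 0;
        for (int f = 0; f < NF; f++)          /* over the fields              */
            for (int i = 0; i < NI; i++)      /* input record element i       */
                for (int j = 0; j < NI; j++)  /* experience record element j  */
                    v += (long)in->in[f][i] * er->in[f][j] * sens[f][i][j];
        return v;
    }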
[0051] The input record input reception means 4 is means for receiving, as input to the Bayesian network approximation processing device 1, an input record 10 acquired outside the device, for example from a sensor, and storing it in the data storage means 6.

[0052] The experience record input reception means 5 is means for acquiring the preset experience records 11 into the Bayesian network approximation processing device 1 and sending them to the validity calculation means 7.

[0053] The data storage means 6 is means for storing data such as the sensitivity data received by the sensitivity data input reception means 2, the validity reference data received by the validity reference data input reception means 3, and the input record 10 received by the input record input reception means 4; it may be, for example, a register, a cache, or a memory, but is not limited to these and may be anything capable of storing data.

[0054] The validity calculation means 7 is means for calculating, for each experience record 11, the sum of the products of the input record 10 and that experience record 11 weighted by the sensitivity data given for each field, as the validity of that experience record.

[0055] The validity classification means 8 is means for classifying the validities calculated by the validity calculation means 7 according to the validity reference data and sending, as the output data 12, the values of the output value fields of experience records 11 with high validity to the output means 9 (a high validity means the experience record 11 is likely to be similar, so preferably the values of the output value fields of the experience records 11 classified under the highest validity criterion are used). When a plurality of experience records 11 fall into that validity class, the average of the output value fields of those experience records 11 may be used as the output data 12.

[0056] FIG. 3 shows a conceptual diagram of the processing of the Bayesian network approximation processing device 1 of the present invention.

[0057] Specifically, the validity calculation means 7 calculates a validity for each experience record 11 using Equation 1 above, and the validity classification means 8 classifies the records according to the validity reference data. The values of the output value field of the experience records 11 classified under the highest validity criterion are then sent to the output means 9 as the output data 12. The classification by validity may be done by sorting the validities. When a plurality of experience records 11 are classified there, the average of the output value fields of those experience records 11 is sent to the output means 9 as the output data 12.
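A minimal sketch of this classification step, under the same illustrative assumptions as above, might compute the validity of every experience record, keep those reaching the highest validity reference value, and average their output value fields element by element; the function and parameter names are hypothetical.

    /* Sketch only: classify experience records by validity and average the
     * output value fields of those in the highest validity class. */
    void classify_and_average(const InputRecord *in,
                              const ExperienceRecord recs[], int nrec,
                              const Sensitivity sens[NF],
                              long highest_reference,  /* highest validity reference data */
                              int out[NI])             /* resulting output data           */
    {
        long sum[NI] = { 0 };
        int hits = 0;

        for (int r = 0; r < nrec; r++) {
            if (validity(in, &recs[r], sens) >= highest_reference) {
                for (int i = 0; i < NI; i++)
                    sum[i] += recs[r].out[i];
                hits++;
            }
        }
        for (int i = 0; i < NI; i++)          /* average; 0 if no record qualified */
            out[i] = hits ? (int)(sum[i] / hits) : 0;
    }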
[0058] When the validity calculation means 7 performs the validity calculation described above, the sensitivity data is generally fixed and many experience records 11 are processed for each single input record 10, so the frequency with which values change is sensitivity data < input record 10 < experience records 11. Taking advantage of this, the product of the sensitivity data and the input record 10 may be computed first, at the moment the input record 10 is received, and stored as intermediate data in the data storage means 6 or the like; when an experience record 11 is then read for the validity calculation, the validity can be computed by multiplying the stored intermediate data by each element of the experience record 11. This is advantageous in terms of processing speed and circuit scale.

[0059] That is, instead of computing "validity = sensitivity data x input record 10 x experience record 11" in one pass, "intermediate data = sensitivity data x input record 10" is computed in advance and stored in the data storage means 6, and "validity = intermediate data x experience record 11" is computed when each experience record 11 is read, so a large number of experience records 11 can be processed with even fewer resources.
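A sketch of this intermediate-data optimisation, again using the hypothetical helpers above: the product of the sensitivity data and the input record is computed once, after which each experience record costs only NF x NI multiply-accumulate steps.

    /* Sketch only: precompute intermediate = sensitivity x input record. */
    void make_intermediate(const InputRecord *in,
                           const Sensitivity sens[NF],
                           long mid[NF][NI])
    {
        for (int f = 0; f < NF; f++)
            for (int j = 0; j < NI; j++) {
                mid[f][j] = 0;
                for (int i = 0; i < NI; i++)
                    mid[f][j] += (long)in->in[f][i] * sens[f][i][j];
            }
    }

    /* Sketch only: validity = intermediate x experience record, which equals the
     * triple sum of Equation 1 but with far fewer multiplications per record. */
    long validity_from_intermediate(const long mid[NF][NI],
                                    const ExperienceRecord *er)
    {
        long v = 0;
        for (int f = 0; f < NF; f++)
            for (int j = 0; j < NI; j++)
                v += mid[f][j] * er->in[f][j];
        return v;
    }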
[0060] The output means 9 is means for outputting the output data 12 classified by the validity classification means 8 to the outside of the Bayesian network approximation processing device 1 as the processing result.

[0061] The Bayesian network approximation processing device 1 of the present invention can be realized either in hardware or in software, but the effect is particularly pronounced when it is realized in hardware.

[0062] In conventional software that processes a Bayesian network, as shown in FIG. 26(a), data is read from memory, the CPU then performs the calculation, and only after the CPU calculation finishes is the next memory read performed.

[0063] By contrast, when the Bayesian network approximation processing device 1 of the present invention is implemented in hardware, as shown in FIG. 26(b), it is suited to SDRAM burst transfer, so the memory is read continuously while the hardware performs the calculation. In other words, pipeline processing runs in parallel between the memory and the hardware, giving a higher processing speed than before.

[0064] The Bayesian network approximation processing device 1 is preferably incorporated directly as a circuit in a control chip or the like, in an external coprocessor attached to an embedded CPU, in the hardware of an expansion board, as a microcontroller built into a device for the ubiquitous field (for example, a mobile terminal such as a mobile phone, PHS, or PDA), or as an all-in-one product for small-scale control devices.

[0065] FIG. 4 shows conceptual diagrams of cases where the Bayesian network approximation processing device 1 is incorporated in hardware: FIG. 4(a) the case of an external coprocessor for a CPU, FIG. 4(b) the case of incorporation in an expansion board, FIG. 4(c) the case of a built-in microcontroller for the ubiquitous field, and FIG. 4(d) the case of incorporation as an all-in-one product for small-scale control devices.
[0066] FIG. 4(a) shows the case where the arithmetic unit of the Bayesian network approximation processing device 1 is connected to a CPU; the experience records 11 and the input record 10 are received from the CPU. The Bayesian network approximation processing device 1 is thus implemented on the chip of the arithmetic unit.

[0067] In the case of FIG. 4(b), a controller, an SDRAM, and the arithmetic unit of the Bayesian network approximation processing device 1 are provided on an expansion board, and the host writes the experience records 11 to the SDRAM through the controller in advance. When the host gives an input record 10 to the controller, the controller configures the arithmetic unit and transfers the experience records 11 from the SDRAM to the arithmetic unit, and the arithmetic unit performs the Bayesian network approximation processing. During this time the host can perform other processing, and the expansion board has the advantage of operating at high speed independently of the host.

[0068] The case of FIG. 4(c) combines FIG. 4(a) with the approach of FIG. 4(b) in order to speed it up: the arithmetic unit and controller of the Bayesian network approximation processing device 1 are integrated, an SDRAM is connected externally, and the unit is also connected to the CPU. A feature here is that the SDRAM is dedicated to the arithmetic unit and can be accessed from the CPU only via the controller. This too has the advantage that the SDRAM and the arithmetic unit can operate independently of the host.

[0069] The case of FIG. 4(d) strengthens the controller of FIG. 4(c), giving it the functions of a one-chip microcontroller so that it can control the entire device.
[0070] Examples of the above expansion board include Gidel's PROCStar25-1C, PROCStar25-1, PROCStar25-2, and PROCStar25-3, and Mitsubishi Electric Engineering's KAC-02A. Examples of the controller include ARM's ARM922T, ARM926EJ-S, ARM1026EJ-S, ARM1136J(F)-S, ARM7TDMI, ARM7TDMI-S, SC100, SC200, ARM7EJ-S, ARM946E-S, and ARM966E-S, and Renesas Technology's M32R/E series (M32102S6FP, M32104S6FP, M32121FCAWG, M32121MCB-XXXWG, etc.), M32R/ECU series (32170 group (M32170F3VFP, M32170F4VFP, M32170F6VFP), 32171 group (M32171F2VFP, M32171F3VFP, M32171F4VFP), 32172 group (M32172F2VFP), 32173 group (M32173F2VFP), 32174 group (M32174F3VFP, M32174F4VFP), 32176 group (M32176F2VFP, M32176F2VWG, M32176F2TFP, M32176F2TWG, M32176F3VFP, M32176F3VWG, M32176F3TFP, M32176F3TWG, M32176F4VFP, M32176F4VWG, M32176F4TFP, M32176F4TWG), 32180 group (M32180F8TFP, M32180F8UFP, M32180F8VFP), 32182 group (M32182F3TFP, M32182F3UFP, M32182F3VFP, M32182F8TFP, M32182F8UFP, M32182F8VFP)), and the M32R/I series.
Example 1

[0071] Next, the flow of processing in the Bayesian network approximation processing device 1 will be described in detail with reference to the flowchart of FIG. 2 and the system configuration diagram of FIG. 1.

[0072] It is assumed that at least one experience record 11 used by the Bayesian network approximation processing device 1 has been set in advance.

[0073] First, the Bayesian network approximation processing device 1 receives input of sensitivity data via the sensitivity data input reception means 2 and stores it in the data storage means 6 (S100). Examples of the sensitivity data received here are shown in FIGS. 5(a1) to 5(a3): FIGS. 5(a1) and 5(a2) show cases where the sensitivity data is set symmetrically, and FIG. 5(a3) shows an example where it is set asymmetrically.

[0074] As described above, the sensitivity data indicates how much the similarity of a field affects the overall similarity. For example, if the sensitivity data of field A is set to twice that of field B, field A raises the validity more, which amounts to specifying that the similarity of field A matters more than that of field B. Setting the sensitivity data in this way can also be used to emphasize fields whose values vary little or to weaken fields whose values vary widely. On the other hand, giving the sensitivity data a broad base as in FIG. 5(a2) specifies that a field counts as matching even when the values are not exactly identical, which is useful for noisy data because variations are absorbed. Further, when the sensitivity data is set asymmetrically as in FIG. 5(a3), it can be specified, for example, that the comparison counts only when the input record 10 side is larger than the experience record 11 side.
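Since the numerical values of FIGS. 5(a1) to 5(a3) are not reproduced in this text, the two 4 x 4 matrices below are invented purely to illustrate the distinction just described: a sharp diagonal rewards only exact matches between input and experience elements, while a broadened one also credits neighbouring elements and so absorbs noise.

    /* Hypothetical sensitivity matrices (values assumed, not taken from FIG. 5). */
    static const Sensitivity sharp_sens = {      /* exact matches only            */
        { 100,   0,   0,   0 },
        {   0, 100,   0,   0 },
        {   0,   0, 100,   0 },
        {   0,   0,   0, 100 },
    };

    static const Sensitivity broad_sens = {      /* partial credit for neighbours */
        { 100,  50,   0,   0 },
        {  50, 100,  50,   0 },
        {   0,  50, 100,  50 },
        {   0,   0,  50, 100 },
    };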
[0075] Since the sensitivity data is a weighting, any numerical scale (for example, including negative numbers) may be used.

[0076] Next, input of the validity reference data used to classify the experience records 11 is received by the validity reference data input reception means 3 and stored in the data storage means 6 (S110). An example of the validity reference data is shown in FIG. 5(b).

[0077] Either of S100 and S110, the reception of the sensitivity data and of the validity reference data, may be performed first.

[0078] After receiving the sensitivity data and the validity reference data, the Bayesian network approximation processing device 1 receives, via the input record input reception means 4, an input record 10 acquired from outside the device, for example from a sensor, and stores it in the data storage means 6 (S120).

[0079] FIG. 5(c) shows an example of the input record 10, and FIG. 5(d) an example of an experience record 11. In the example of FIG. 5(c), the input record 10 represents temperature data: "below 20°C" is value 0, "20°C to 30°C" is value 1, "30°C to 40°C" is value 2, and "40°C and above" is value 3, and the probability of value 0 is 20%, of value 1 is 70%, of value 2 is 10%, and of value 3 is 0%. In the example of FIG. 5(d), the experience record 11 shows the relationship among temperature, humidity, and wind speed: the probability that the value is 0 is 20% for temperature, 0% for humidity, and 90% for wind speed; the probability that the value is 1 is 70% for temperature, 10% for humidity, and 10% for wind speed; the probability that the value is 2 is 10% for temperature, 20% for humidity, and 0% for wind speed; and the probability that the value is 3 is 0% for temperature, 70% for humidity, and 0% for wind speed.
[0080] The experience record input reception means 5 acquires the experience records 11 one at a time (S130) and sends each acquired experience record 11 to the validity calculation means 7. The validity calculation means 7 then acquires the input record 10 and the sensitivity data stored in the data storage means 6 and, for each experience record 11, calculates as its validity the sum of the products of the input record 10 and that experience record 11 weighted by the sensitivity data given for each field, that is, Equation 1 (S140).

[Paragraphs [0081] to [0083], which work through this calculation field by field for the example of FIG. 5, are largely illegible in this copy; the recoverable portion shows the weighted sum for the second field ending in ... + (20 x 0 x 0) + (80 x 0 x 0) + (0 x 0 x 20) + (0 x 0 x 100) = 744000, the first field's sum being 840000 as used in paragraph [0084] below.]
[0084] Now that a validity has been obtained for each field, the per-field values are added to obtain the validity for the experience record 11. In the example above, the validities obtained for the fields in S140 are added, so the validity in this case is 840000 + 744000 = 1584000.

[0085] The validity classification means 8 classifies the validity of each experience record 11 calculated with Equation 1 in this way for each item of validity reference data received in S110, and takes the values of the output value field of the experience records 11 falling under the highest validity reference data as the output data 12 (S150). When a plurality of experience records 11 are classified under that validity reference data, the average of the output value fields of those experience records 11 is calculated and used as the output data.

[0086] With the validity reference data shown in FIG. 5(b) above (assumed to be the highest validity reference data), the validity of 1584000 calculated above meets the criterion of 0, and this experience record 11 is the highest (here there is only one experience record 11, so its output value field is always used); its output value field is therefore sent to the output means 9 as the output value.

[0087] The processing of S140 and S150 is written out in FIG. 24 in C, one of the computer languages. Needless to say, when the Bayesian network approximation processing device 1 is realized as hardware such as a control chip, this processing is realized on hardware resources.
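FIG. 24 itself is not reproduced in this text. Purely as an indication of the shape such C code could take, the loop below reads each experience record once, computes its validity (S140), and bins it against the validity reference data (S150), reusing the hypothetical helpers sketched earlier; it is not the patent's own listing.

    /* Sketch only: one possible S140/S150 loop.  reference[] holds the validity
     * reference data sorted from lowest to highest. */
    void s140_s150(const InputRecord *in,
                   const ExperienceRecord recs[], int nrec,
                   const Sensitivity sens[NF],
                   const long reference[], int nref,
                   long val[],            /* validity per experience record (S140) */
                   int cls[])             /* index of the class reached (S150)     */
    {
        for (int r = 0; r < nrec; r++) {
            val[r] = validity(in, &recs[r], sens);    /* S140 */
            cls[r] = -1;
            for (int k = 0; k < nref; k++)            /* S150 */
                if (val[r] >= reference[k])
                    cls[r] = k;                       /* highest criterion met */
        }
    }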
[0088] S150で有効度分類手段 8が処理した出力データ 12を、有効度分類手段 8は出力 手段 9に送出し、出力手段 9が出力値として、出力データ 12を出力する(S160)。 実施例 2  The output data 12 processed by the effectiveness classifying means 8 in S150 is sent to the output means 9 and the output means 9 outputs the output data 12 as an output value (S160). Example 2
[0089] 次に具体的な例を用いてベイジアンネットワーク近似処理装置 1の処理の流れを説 明する。尚、ベイジアンネットワーク近似処理装置 1で処理するモデルとして故障予 測とし、入力されるのは油温と油量の 2つであるとする。即ち入力レコード 10として油 温と油量の 2つがあり、出力されるのは故障予測に力かる出力データ 12である。 [0090] 油温フィールドは図 6 (a)に示すように、「一 20°Cの時には値 0」、「20°C— 40°Cの 時には値 1」、「40°C— 60°Cの時には値 2」、「60°C—の時には値 3」の 4値を取るフィ 一ルドとし、油量フィールドは図 6 (b)に示すように、「800ml—の時には値 0」、「600 ml 800mlの時には値 1」、「400ml 600mlの時には値 2」、「一 400mlの時には 値 3」を取るフィールドとする。ここでは、説明を容易にする為に、値が大きければ異 常であるように割り当てたが、任意に割り当てて良い。又、故障予測フィールドは図 6 (c)に示すように、「正常の時には値 0」、「注意の時には値 1」、「要点検の時には値 2 」、「非常停止の時には値 3」を取るフィーノレドとする。 Next, the processing flow of the Bayesian network approximation processing device 1 will be described using a specific example. It is assumed that a failure prediction is performed as a model to be processed by the Bayesian network approximation processing device 1, and that two input values are an oil temperature and an oil amount. That is, there are two input records 10 of oil temperature and oil quantity, and the output is output data 12 that is useful for failure prediction. [0090] As shown in Fig. 6 (a), the oil temperature field is "Value 0 at one 20 ° C", "Value 1 at 20 ° C-40 ° C", "40 ° C-60 ° C". The value of the oil volume field is 4 when the value is 2 and the value is 3 when the temperature is 60 ° C. As shown in Fig. 6 (b), the oil volume field is A field that takes the value 1 for 600 ml 800 ml, the value 2 for 400 ml 600 ml, and the value 3 for 400 ml 600 ml. Here, for the sake of simplicity of explanation, the assignment is made as if the value is large, but may be assigned arbitrarily. As shown in Fig. 6 (c), the failure prediction field contains "Value 0 for normal operation", "Value 1 for caution", "Value 2 for inspection required", and "Value 3 for emergency stop". Take Fino Redo.
[0091] Further, as the experience records 11, past input records 10 and the output value field that should be output for each of those events are set in advance. This example assumes the five experience records 11 of Fig. 7(a) to Fig. 7(e): Fig. 7(a) is a sample experience record 11, consisting of an input record 10 and its output value field, for a normal event; Fig. 7(b) is sample 1 and Fig. 7(c) is sample 2 of experience records 11 for caution events; Fig. 7(d) is a sample experience record 11 for an event requiring inspection; and Fig. 7(e) is a sample experience record 11 for an emergency-stop event.
[0092] The Bayesian network approximation processing device 1 receives the sensitivity data of the oil temperature field and the sensitivity data of the oil quantity field shown in Fig. 8 with its sensitivity data input receiving means 2 and stores them in the data storage means 6 (S100).
[0093] Next, the validity reference data input receiving means 3 of the Bayesian network approximation processing device 1 receives the input of the validity reference data having the values shown in Fig. 9 and stores it in the data storage means 6 (S110).
[0094] After the sensitivity data and the validity reference data have been accepted, since this example predicts device failure from oil temperature and oil quantity, the Bayesian network approximation processing device 1 receives, with the input record input receiving means 4, the input record 10 of Fig. 10 acquired by an oil temperature sensor that detects the oil temperature and an oil quantity sensor that detects the oil quantity, and stores it in the data storage means 6 (S120).
[0095] Next, the experience record input receiving means 5 acquires the experience records 11 one by one (S130) and sends each acquired experience record 11 to the validity calculating means 7. The validity calculating means 7 acquires the input record 10 and the sensitivity data stored in the data storage means 6 and calculates, for each experience record 11, the sum of the multiplications of the input record 10 and that experience record 11 weighted by the sensitivity data given for each field as its validity (that is, it evaluates Equation 1) (S140).
[0096] Based on the validity calculated by the validity calculating means 7, the validity classifying means 8 then classifies the validity of each experience record 11 according to the validity reference data, and the value of the output value field of the experience record 11 classified into high validity reference data (preferably the highest validity reference data) becomes the output data 12 output by the output means 9 (S150). If more than one experience record 11 is classified into the highest validity reference data, the average of the output value fields of those experience records 11 becomes the output data 12. The processing of S140 and S150, expressed in C, is as shown in Fig. 24.
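As with the earlier Equation 1 sketch, the following is only an illustration and not the listing of Fig. 24. It shows one possible shape of the S150 step; representing the validity reference data as an array of lower bounds ordered from the highest class downward is an assumption about a data layout the patent does not specify.

```c
#define NREC 5   /* number of experience records (Fig. 7(a)-(e)) */
#define NO   4   /* number of elements in the output value field */
#define NREF 6   /* number of validity reference classes         */

/* S150: classify each record's validity into a reference class and average the
 * output value fields of the records that fall into the highest occupied class.
 * val[r]     : validity of experience record r computed in S140
 * ref_low[c] : lower bound of reference class c, ordered from the highest class (c = 0)
 * out[r][k]  : output value field of experience record r
 * result[k]  : output data 12
 */
static void classify_and_output(const long val[NREC],
                                const long ref_low[NREF],
                                const int  out[NREC][NO],
                                int result[NO])
{
    for (int c = 0; c < NREF; c++) {           /* scan classes from highest down */
        long sum[NO] = {0};
        int  hits = 0;
        for (int r = 0; r < NREC; r++) {
            if (val[r] >= ref_low[c] &&
                (c == 0 || val[r] < ref_low[c - 1])) {
                for (int k = 0; k < NO; k++)
                    sum[k] += out[r][k];
                hits++;
            }
        }
        if (hits > 0) {                        /* highest occupied class found   */
            for (int k = 0; k < NO; k++)
                result[k] = (int)(sum[k] / hits);
            return;
        }
    }
    for (int k = 0; k < NO; k++)               /* defensive default              */
        result[k] = 0;
}
```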
[0097] Applying the processing of S140 and S150 to the concrete example of Example 2 gives the following calculations.
[0098] First, the input record 10 is compared with experience record 11A and the validity is calculated by the validity calculating means 7; the validity of the oil temperature field and of the oil quantity field can be calculated as shown in Fig. 11. In this case the validity of the oil temperature field is 680000 and the validity of the oil quantity field is 300000, so the validity for the comparison with experience record 11A is 980000. This validity for experience record 11A is stored in the data storage means 6.
[0099] Comparing the input record 10 with experience record 11B and calculating the validity with the validity calculating means 7 gives the field validities shown in Fig. 12. In this case the validity of the oil temperature field is 680000 and the validity of the oil quantity field is 50000, so the validity for the comparison with experience record 11B is 730000. This validity for experience record 11B is stored in the data storage means 6.

[0100] Comparing the input record 10 with experience record 11C and calculating the validity with the validity calculating means 7 gives the field validities shown in Fig. 13. In this case the validity of the oil temperature field is 320000 and the validity of the oil quantity field is 300000, so the validity for the comparison with experience record 11C is 620000. This validity for experience record 11C is stored in the data storage means 6.
[0101] Comparing the input record 10 with experience record 11D and calculating the validity with the validity calculating means 7 gives the field validities shown in Fig. 14. In this case the validity of the oil temperature field is 320000 and the validity of the oil quantity field is 50000, so the validity for the comparison with experience record 11D is 370000. This validity for experience record 11D is stored in the data storage means 6.
[0102] Comparing the input record 10 with experience record 11E and calculating the validity with the validity calculating means 7 gives the field validities shown in Fig. 15. In this case the validity of the oil temperature field is 40000 and the validity of the oil quantity field is 300000, so the validity for the comparison with experience record 11E is 340000. This validity for experience record 11E is stored in the data storage means 6.
[0103] The validity classifying means 8 extracts the validities of experience records 11A to 11E and the validity reference data accepted in S110 from the data storage means 6, classifies them for each validity reference data, and sends the value of the output value field of the experience record 11 classified into the highest validity reference data to the output means 9 as the output data 12 (S150). Since only one experience record 11 is classified into the highest validity reference data here, the average of the output value fields need not be calculated.
[0104] Accordingly, in the above case, since the validity of experience record 11A is 980000, it falls under the value 2 of the validity reference data (800001 to 1000000).
[0105] Since the validity of experience record 11B is 730000, it falls under the value 3 of the validity reference data (600001 to 800000).
[0106] Since the validity of experience record 11C is 620000, it falls under the value 3 of the validity reference data (600001 to 800000), like experience record 11B.
[0107] Since the validity of experience record 11D is 370000, it falls under the value 5 of the validity reference data (200001 to 400000).
[0108] Since the validity of experience record 11E is 340000, it falls under the value 5 of the validity reference data (200001 to 400000), like experience record 11D.
[0109] Fig. 16 schematically shows the classification by validity reference data described above. The record classified into the highest validity reference data is experience record 11A, with reference value "2", so referring to the output value field of experience record 11A (the failure prediction values of the experience record 11 in Fig. 7(a)), the value is "100, 0, 0, 0". Since this is the only experience record 11 classified into reference value "2", the value of the output data 12 is also "100, 0, 0, 0". The validity classifying means 8 therefore sends "100, 0, 0, 0" to the output means 9 as the output data 12 to be output by the output means 9.
[0110] On receiving the output data 12 "100, 0, 0, 0" from the validity classifying means 8, the output means 9 outputs it. The current output result can therefore be judged to indicate normal operation.

Example 3
[0111] Next, assume that the experience data, sensitivity data and validity reference data are the same as in Example 2 (that is, the experience data of Fig. 7, the sensitivity data of Fig. 8 and the validity reference data of Fig. 9), and that the input record 10 received by the input record input receiving means 4 (S120) and stored in the data storage means 6 is that shown in Fig. 17. S100 and S110 are the same as in Example 2.
[0112] As in Example 2, the experience record input receiving means 5 acquires the experience records 11 one by one (S130) and sends each acquired experience record 11 to the validity calculating means 7. The validity calculating means 7 acquires the input record 10 and the sensitivity data stored in the data storage means 6 and calculates, for each experience record 11, the sum of the multiplications of the input record 10 and that experience record 11 weighted by the sensitivity data given for each field as its validity (that is, it evaluates Equation 1) (S140).
[0113] Based on the validity calculated by the validity calculating means 7, the validity classifying means 8 classifies the validity of each experience record 11 according to the validity reference data, and the value of the output value field of the experience record 11 classified into the highest validity reference data becomes the output data output by the output means 9 (S150). If more than one experience record 11 is classified into the highest validity reference data, the average of the output value fields of those experience records 11 is calculated and used as the output data 12.
[0114] Applying the processing of S140 and S150 to the concrete example of Example 3 gives the following calculations.
[0115] First, the input record 10 is compared with experience record 11A and the validity is calculated by the validity calculating means 7; the field validities can be calculated as shown in Fig. 18. In this case the validity of the oil temperature field is 320000 and the validity of the oil quantity field is 20000, so the validity for the comparison with experience record 11A is 340000. This validity for experience record 11A is stored in the data storage means 6.
[0116] Comparing the input record 10 with experience record 11B and calculating the validity with the validity calculating means 7 gives the field validities shown in Fig. 19. In this case the validity of the oil temperature field is 320000 and the validity of the oil quantity field is 340000, so the validity for the comparison with experience record 11B is 660000. This validity for experience record 11B is stored in the data storage means 6.
[0117] Comparing the input record 10 with experience record 11C and calculating the validity with the validity calculating means 7 gives the field validities shown in Fig. 20. In this case the validity of the oil temperature field is 680000 and the validity of the oil quantity field is 160000, so the validity for the comparison with experience record 11C is 840000. This validity for experience record 11C is stored in the data storage means 6.
[0118] Comparing the input record 10 with experience record 11D and calculating the validity with the validity calculating means 7 gives the field validities shown in Fig. 21. In this case the validity of the oil temperature field is 680000 and the validity of the oil quantity field is 340000, so the validity for the comparison with experience record 11D is 1020000. This validity for experience record 11D is stored in the data storage means 6.
[0119] Comparing the input record 10 with experience record 11E and calculating the validity with the validity calculating means 7 gives the field validities shown in Fig. 22. In this case the validity of the oil temperature field is 320000 and the validity of the oil quantity field is 20000, so the validity for the comparison with experience record 11E is 340000. This validity for experience record 11E is stored in the data storage means 6.
[0120] The validity classifying means 8 extracts the validities of experience records 11A to 11E and the validity reference data accepted in S110 from the data storage means 6, classifies them for each validity reference data, and sends the value of the output value field of the experience record 11 classified into the highest validity reference data to the output means 9 as the output data 12 (S150). Since only one experience record 11 is classified into the highest validity reference data here, the average of the output value fields need not be calculated.
[0121] Accordingly, in the above case, since the validity of experience record 11A is 340000, it falls under the value 5 of the validity reference data (200001 to 400000).
[0122] Since the validity of experience record 11B is 660000, it falls under the value 3 of the validity reference data (600001 to 800000).
[0123] Since the validity of experience record 11C is 840000, it falls under the value 2 of the validity reference data (800001 to 1000000).
[0124] Since the validity of experience record 11D is 1020000, it falls under the value 1 of the validity reference data (1000001 to 1200000).
[0125] Since the validity of experience record 11E is 340000, it falls under the value 5 of the validity reference data (200001 to 400000), like experience record 11A.
[0126] Fig. 23 schematically shows the classification by validity reference data described above. The record classified into the highest validity reference data is experience record 11D, with reference value "1", so referring to the output value field of experience record 11D (the failure prediction values of the experience record 11 in Fig. 7(d)), the value is "0, 0, 100, 0". Since this is the only experience record 11 classified into reference value "1", the value of the output data 12 is also "0, 0, 100, 0". The validity classifying means 8 therefore sends "0, 0, 100, 0" to the output means 9 as the output data to be output by the output means 9.
[0127] On receiving the output data 12 "0, 0, 100, 0" from the validity classifying means 8, the output means 9 outputs it. The current output result can therefore be judged to indicate that inspection is required.

Example 4

[0128] In the Bayesian network approximation processing device 1, the validity classifying means 8 may also classify the validities calculated by the validity calculating means 7 for each validity reference data and send to the output means 9, as the output data 12, the value of the output value field of the experience record 11 with the highest validity among the plural experience records 11 belonging to the highest validity reference data.
[0129] Alternatively, the validity classifying means 8 may classify the validities calculated by the validity calculating means 7 for each validity reference data and send to the output means 9, as the output data 12, the value of the output value field of an arbitrary experience record 11 among the plural experience records 11 belonging to the highest validity reference data.
[0130] Furthermore, the validity classifying means 8 may classify the validities calculated by the validity calculating means 7 for each validity reference data and send to the output means 9, as the output data 12, the average value of the output value fields of a predetermined top n experience records 11 among the plural experience records 11 belonging to the highest validity reference data.
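A minimal sketch of the first of these variants, picking the single record with the greatest validity inside the highest occupied class, might look like the following; the members/val arrays and their meaning are assumptions of the sketch, not part of the patent text.

```c
/* Example 4, first variant: among the experience records classified into the
 * highest validity reference class, return the index of the one whose validity
 * is greatest.  members[] holds the record indices in that class (n >= 1 of
 * them), val[] the validities computed in S140.                                */
static int best_in_class(const int members[], int n, const long val[])
{
    int best = members[0];
    for (int m = 1; m < n; m++)
        if (val[members[m]] > val[best])
            best = members[m];
    return best;
}
```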
Example 5
[0131] Next, Fig. 25 shows the case where the Bayesian network approximation processing device 1 is not provided with the validity reference data input receiving means 3. In this case, the processing other than that of the validity classifying means 8 is the same as in Examples 1 to 4, and the processing of S110 in Fig. 2 is therefore also unnecessary.
[0132] In this example, based on the validities calculated by the validity calculating means 7, the validity classifying means 8 sorts those validities and sends to the output means 9, as the output data 12, the value of the output value field of the experience record 11 with the highest validity (preferably the value of the output value field of the experience record 11 classified under the highest validity criterion).
[0133] Also, instead of using the value of the output value field of the experience record 11 with the highest validity, the validity classifying means 8 may use as the output data 12 the average of the output value fields of the experience records 11 whose validities fall within a predetermined top n, or the value of the output value field of any one of those top n experience records 11. Furthermore, when a plurality of output data 12 is allowed, the values of the output value fields of the top n experience records 11 may each be used as output data 12.
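The sorting of paragraph [0132] and the top-n averaging of paragraph [0133] could be sketched as follows, using qsort from the C standard library; the record count, output field size and structure layout are assumptions of the sketch.

```c
#include <stdlib.h>

#define NREC 5   /* number of experience records                 */
#define NO   4   /* number of elements in the output value field */

struct scored { long val; const int *out; };   /* validity plus its output field */

static int by_val_desc(const void *a, const void *b)
{
    long va = ((const struct scored *)a)->val;
    long vb = ((const struct scored *)b)->val;
    return (va < vb) - (va > vb);              /* descending order               */
}

/* Sort the records by validity and average the output value fields of the top n. */
static void top_n_average(struct scored recs[NREC], int n, int result[NO])
{
    if (n <= 0) return;
    if (n > NREC) n = NREC;
    qsort(recs, NREC, sizeof recs[0], by_val_desc);
    for (int k = 0; k < NO; k++) {
        long sum = 0;
        for (int r = 0; r < n; r++)
            sum += recs[r].out[k];
        result[k] = (int)(sum / n);
    }
}
```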
Example 6

[0134] As described above, when generating the output data 12 in Examples 1 to 5, the output value field of each experience record 11 may further be turned into an average weighted by the validity of that experience record 11.
[0135] Specifically, when the output value field of experience record 11r is written xo[r] and the validity of that experience record 11 calculated by the validity calculating means 7 is written val[r], the validity classifying means 8 calculates ∑ (xo[r] × val[r]) ÷ ∑ (val[r]) and takes the result as the output data 12. The validity classifying means 8 then transmits that output data 12 to the output means 9, and the output means 9 outputs it.
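Using the notation of the paragraph above, the weighted average ∑ (xo[r] × val[r]) ÷ ∑ (val[r]) can be computed per output element as in the following sketch; the array sizes are assumptions, and squaring val[r] before use would give the variant described in paragraph [0137] below.

```c
#define NREC 5   /* number of experience records                 */
#define NO   4   /* number of elements in the output value field */

/* Example 6: output data 12 as the validity-weighted average of the output
 * value fields, i.e. sum(xo[r] * val[r]) / sum(val[r]) for each element k.    */
static void weighted_output(const int xo[NREC][NO], const long val[NREC],
                            int result[NO])
{
    long wsum = 0;
    for (int r = 0; r < NREC; r++)
        wsum += val[r];
    for (int k = 0; k < NO; k++) {
        long acc = 0;
        for (int r = 0; r < NREC; r++)
            acc += (long)xo[r][k] * val[r];
        result[k] = (int)(wsum != 0 ? acc / wsum : 0);
    }
}
```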
[0136] By weighting by the validity in this way, output data 12 can be obtained that emphasizes highly valid experience records 11 while still taking in experience records 11 whose validity is not so high.
[0137] The weighted average may also be calculated using the square of the validity rather than the validity itself. This gives output data 12 that emphasizes experience records 11 with particularly high validity.
[0138] The means of the present invention are distinguished only logically by their functions, and may physically or practically share the same area.
[0139] In carrying out the present invention, it is of course possible to realize it by supplying the system with a storage medium on which a software program implementing the functions of this embodiment is recorded, and having the computer of that system read and execute the program stored on the storage medium.
[0140] In this case, the program itself read from the storage medium realizes the functions of the embodiment described above, and the storage medium storing that program naturally constitutes the present invention.
[0141] Alternatively, by using a circuit design language such as HDL, the functions of the Bayesian network approximation processing device 1 of the present invention may be written into a programmable LSI and operated there. For example, the functions of the Bayesian network approximation processing device 1 described in a circuit design language such as HDL are written into a programmable logic device called an FPGA (Field Programmable Gate Array), which then performs the calculation.

[0142] As the storage medium for supplying the program, for example, a magnetic disk, a hard disk, an optical disk, a magneto-optical disk, a magnetic tape, a nonvolatile memory card or the like can be used.
[0143] Furthermore, the program may be obtained not only by storing it on a storage medium but also by downloading it from a network.
[0144] It also goes without saying that the functions of the embodiments described above are realized not only by the computer executing the read program; cases are included in which an operating system or the like running on the computer performs part or all of the actual processing based on the instructions of the program, and that processing realizes the functions of the embodiments described above.
[0145] Naturally, cases are also included in which, after the program read from the storage medium is written into nonvolatile or volatile storage means provided on a function expansion board inserted into the computer or a function expansion unit connected to the computer, an arithmetic processing unit or the like provided on that function expansion board or function expansion unit performs part or all of the actual processing based on the instructions of the program, and that processing realizes the functions of the embodiments described above.
Industrial Applicability
[0146] According to the present invention, it is no longer necessary to create the table having all values of every event as its elements, which was previously indispensable for a Bayesian network. In the present invention, similar experience records are extracted from a table (the experience records) that holds only some of the values, so fewer resources are required than before. Bayesian network approximation processing is therefore possible even with small-scale hardware resources.
[0147] Also, because generating the table conventionally takes time, it was difficult to reflect the latest experience immediately; since the present invention uses no such table, the latest experience can be reflected immediately.
[0148] Furthermore, for a hardware implementation, the invention fits the burst transfer of SDRAM (Synchronous DRAM), a method of transferring several pieces of data consecutively with a single addressing, so the benefit of high-speed processing is obtained. With SDRAM performing burst transfer, once an address is sent to the memory, data is returned continuously after a fixed delay, and for consecutive addresses the second and subsequent data are received without waiting time, which enables high-speed processing. Since the present invention is designed so that the calculation can proceed in synchronization with the burst transfer, the calculation is performed on the spot each time one piece of data arrives and completes in only the time needed to read the memory, so high-speed processing can be realized.
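As a software-level illustration only of this "one operation per arriving word" idea, the Equation 1 accumulation can be folded into a loop that consumes the experience record's values in the order a burst read would deliver them; the flattened data ordering and all identifiers are assumptions, and this sketch says nothing about the actual hardware design.

```c
#define NF 2
#define NI 4

/* Accumulate the validity term by term as the experience record's probability
 * values stream in (one value per burst beat, fields and elements in order),
 * so the sum is complete as soon as the last word has been read.              */
static long streaming_validity(const int in_rec[NF][NI],
                               const int exp_stream[NF * NI],
                               const long sens[NF][NI][NI])
{
    long v = 0;
    for (int w = 0; w < NF * NI; w++) {        /* one arriving word per step */
        int f = w / NI;                        /* field of this word         */
        int j = w % NI;                        /* element index of this word */
        for (int i = 0; i < NI; i++)
            v += (long)in_rec[f][i] * exp_stream[w] * sens[f][i][j];
    }
    return v;
}
```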

Claims

[1] A Bayesian network approximation processing device that performs Bayesian network approximation processing, comprising:
sensitivity data input receiving means for receiving input of sensitivity data, which is data that gives a weight to each field;
validity reference data input receiving means for receiving input of validity reference data;
input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device;
experience record input receiving means for acquiring a preset experience record consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding to it;
validity calculating means for calculating, for each experience record, the sum of the multiplications of the input record and that experience record weighted by the sensitivity data given for each of their fields, as the validity for that experience record;
validity classifying means for classifying the experience records by the validity on the basis of the validity reference data and calculating output data from the values of the output value fields of the classified experience records; and
output means for receiving the output data from the validity classifying means and outputting it.
[2] A Bayesian network approximation processing device that performs Bayesian network approximation processing, comprising:
sensitivity data input receiving means for receiving input of sensitivity data, which is data that gives a weight to each field;
validity reference data input receiving means for receiving input of validity reference data;
input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device;
experience record input receiving means for acquiring a preset experience record consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding to it;
validity calculating means for calculating the validity of each experience record on the basis of Equation 1, where NF is the number of input value fields in the experience record and NI is the number of elements of the input value field;
validity classifying means for classifying the experience records by the validity on the basis of the validity reference data and calculating output data from the values of the output value fields of the classified experience records; and
output means for receiving the output data from the validity classifying means and outputting it.

[Equation 1]
validity = Σ_{f=1..NF} Σ_{i=1..NI} Σ_{j=1..NI} (probability value of element i of field f of the input record) × (probability value of element j of field f of the experience record) × (sensitivity data of element (i, j) of field f)
where NF is the number of fields and NI is the number of elements in a field.
[3] The Bayesian network approximation processing device according to claim 1 or claim 2, wherein the validity classifying means uses, as the output data, the average value of the output value fields of the experience records whose validity calculated by the validity calculating means is higher than the validity reference data.

[4] The Bayesian network approximation processing device according to claim 1 or claim 2, wherein the validity classifying means uses, as the output data, the average value of the output value fields of the experience records whose validity calculated by the validity calculating means is higher than the validity reference data and which fall within a predetermined top n.

[5] The Bayesian network approximation processing device according to claim 1 or claim 2, wherein the validity classifying means classifies the experience records into the validity reference data on the basis of the validity calculated by the validity calculating means and uses, as the output data, the value of the output value field of the experience record classified under the highest validity criterion.

[6] The Bayesian network approximation processing device according to claim 5, wherein, when there is a plurality of experience records classified into the highest validity reference data, the validity classifying means uses, as the output data, the average value of the output value fields of the experience records classified there.

[7] The Bayesian network approximation processing device according to claim 5, wherein, when there is a plurality of experience records classified into the highest validity reference data, the validity classifying means uses, as the output data, the value of the output value field of the experience record with the highest validity among the experience records classified there.

[8] The Bayesian network approximation processing device according to claim 5, wherein, when there is a plurality of experience records classified into the highest validity reference data, the validity classifying means uses, as the output data, the value of the output value field of an arbitrary experience record among the experience records classified there.

[9] The Bayesian network approximation processing device according to claim 1 or claim 2, wherein the Bayesian network approximation processing device has data storage means for storing at least the sensitivity data, the validity reference data and the input record, the validity calculating means stores the calculated validity of each experience record in the data storage means, and the validity classifying means, when classifying the experience records for each validity reference data, classifies the experience records after acquiring each validity stored in the data storage means.
[10] A Bayesian network approximation processing device that performs Bayesian network approximation processing, comprising:
sensitivity data input receiving means for receiving input of sensitivity data, which is data that gives a weight to each field;
input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device;
experience record input receiving means for acquiring a preset experience record consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding to it;
validity calculating means for calculating, for each experience record, the sum of the multiplications of the input record and that experience record weighted by the sensitivity data given for each of their fields, as the validity for that experience record;
validity classifying means for classifying the experience records by the validity on the basis of a predetermined criterion and calculating output data from the values of the output value fields of the classified experience records; and
output means for receiving the output data from the validity classifying means and outputting it.

[11] A Bayesian network approximation processing device that performs Bayesian network approximation processing, comprising:
sensitivity data input receiving means for receiving input of sensitivity data, which is data that gives a weight to each field;
input record input receiving means for receiving input of an input record acquired outside the Bayesian network approximation processing device;
experience record input receiving means for acquiring a preset experience record consisting of an input value field to the Bayesian network approximation processing device and an output value field corresponding to it;
validity calculating means for calculating the validity of each experience record on the basis of Equation 1, where NF is the number of input value fields in the experience record and NI is the number of elements of the input value field;
validity classifying means for classifying the experience records by the validity on the basis of a predetermined criterion and calculating output data from the values of the output value fields of the classified experience records; and
output means for receiving the output data from the validity classifying means and outputting it.

[Equation 1]
validity = Σ_{f=1..NF} Σ_{i=1..NI} Σ_{j=1..NI} (probability value of element i of field f of the input record) × (probability value of element j of field f of the experience record) × (sensitivity data of element (i, j) of field f)
where NF is the number of fields and NI is the number of elements in a field.
[12] The Bayesian network approximation processing device according to claim 10 or claim 11, wherein the validity classifying means sorts the validities and uses, as the output data, the value of the output value field of the experience record with the highest validity.

[13] The Bayesian network approximation processing device according to claim 10 or claim 11, wherein the validity classifying means sorts the validities and uses, as the output data, the average value of the output value fields of the experience records falling within a predetermined top n.

[14] The Bayesian network approximation processing device according to claim 10 or claim 11, wherein the validity classifying means sorts the validities and uses the values of the output value fields of the experience records falling within a predetermined top n each as output data.

[15] The Bayesian network approximation processing device according to claim 10 or claim 11, wherein the validity classifying means uses, as the output data, the value of the output value field of an arbitrary experience record among the experience records falling within a predetermined top n.

[16] The Bayesian network approximation processing device according to any one of claims 12 to 15, wherein the validity classifying means weights the values of the output value fields by the validities of the selected experience records and uses the weighted average as the output data.

[17] The Bayesian network approximation processing device according to claim 10 or claim 11, wherein the Bayesian network approximation processing device has data storage means for storing at least the sensitivity data and the input record, the validity calculating means stores the calculated validity of each experience record in the data storage means, and the validity classifying means, when classifying the experience records, classifies the experience records after acquiring each validity stored in the data storage means.

[18] The Bayesian network approximation processing device according to any one of claim 1, claim 2, claim 10 and claim 11, wherein the probability distribution of each field is an integer array.
[19] The Bayesian network approximation processing device according to any one of claims 1 to 18, wherein the Bayesian network approximation processing device is incorporated as a circuit in a control chip.

[20] The Bayesian network approximation processing device according to any one of claims 1 to 18, wherein the Bayesian network approximation processing device is incorporated in a coprocessor external to an embedded CPU.

[21] The Bayesian network approximation processing device according to any one of claims 1 to 18, wherein the Bayesian network approximation processing device is incorporated as hardware on an expansion board.

[22] The Bayesian network approximation processing device according to any one of claims 1 to 18, wherein the Bayesian network approximation processing device is built into a mobile terminal as a microcontroller.

[23] The Bayesian network approximation processing device according to any one of claims 1 to 18, wherein the Bayesian network approximation processing device is incorporated in a small-scale control device.