CN110993004B - Naive Bayes classification method, engine and system based on resistive random access memory array - Google Patents
- Publication number
- CN110993004B (application CN201911100579.8A)
- Authority
- CN
- China
- Prior art keywords
- voltage
- current
- bit line
- memory cell
- ref
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C13/00—Digital stores characterised by the use of storage elements not covered by groups G11C11/00, G11C23/00, or G11C25/00
- G11C13/0002—Digital stores characterised by the use of storage elements not covered by groups G11C11/00, G11C23/00, or G11C25/00 using resistive RAM [RRAM] elements
- G11C13/0021—Auxiliary circuits
- G11C13/0023—Address circuits or decoders
- G11C13/0026—Bit-line or column circuits
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N7/00—Computing arrangements based on specific mathematical models
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C13/00—Digital stores characterised by the use of storage elements not covered by groups G11C11/00, G11C23/00, or G11C25/00
- G11C13/0002—Digital stores characterised by the use of storage elements not covered by groups G11C11/00, G11C23/00, or G11C25/00 using resistive RAM [RRAM] elements
- G11C13/0021—Auxiliary circuits
- G11C13/0023—Address circuits or decoders
- G11C13/0028—Word-line or row circuits
Abstract
The invention discloses a naive Bayes classification method, engine and system based on a resistive random access memory array, belonging to the field of storage and calculation fusion, and comprising the following steps: transforming naive Bayes with a monotonically decreasing function f to convert the continued product into a dot product, whereby the prior probability P(c) and the conditional probabilities P(a_i|c) become f(P(c)) and f(P(a_i|c)) respectively; allocating one bit line to each category, and on each bit line allocating one memory cell to the prior probability f(P(c)) and one memory cell to each conditional probability f(P(a_i|c)); setting the conductance of each memory cell so that it is related to the corresponding prior or conditional probability; according to the value of each attribute in the instance to be classified, applying a voltage on the corresponding word line, and simultaneously applying a voltage on the word line corresponding to the prior probability, so that the current of each memory cell correspondingly equals f(P(c)) or w_i·f(P(a_i|c)); and collecting the current on each bit line and determining the category corresponding to the minimum current as the final classification result. The invention can reduce the delay and power consumption of the naive Bayes classification method.
Description
Technical Field
The invention belongs to the field of storage and calculation fusion, and particularly relates to a naive Bayes classification method and an engine based on a resistive random access memory array.
Background
A Resistive Random-Access Memory (ReRAM) is a novel nonvolatile passive two-terminal memory device. It can be composed of a specific resistive switching material (e.g., certain metal oxides), and the stored information is represented by the magnitude of the resistance of the cell. Generally, a cell in the high resistance state represents a logic 0, and a cell in the low resistance state represents a logic 1. Further subdivision of the cell resistance values may be used to represent multi-bit values. As shown in fig. 1(a), in the conductive filament model of the ReRAM cell, a filament connecting the upper and lower electrodes yields the lowest resistance value; otherwise the cell exhibits the highest resistance value. Applying forward and reverse pulses to the cell changes its resistance (by changing the conductive filament length). As shown in fig. 1(b), the resistance of the cell varies with the forward and reverse pulses, and the variation of the cell resistance under the pulses is reflected in the magnitude of the current.
As shown in FIG. 1(c), ReRAM can be used to construct a memory array in the form of a cross-point array, where the ReRAM device is directly sandwiched between the word line and the bit line and the cell area can reach the theoretical minimum of 4F^2 (F is the feature size); this is one of the effective schemes for constructing mass storage. The array organization of ReRAM can directly complete matrix-vector multiplication, so ReRAM can well realize storage and calculation fusion. One bit line in a ReRAM array can simulate a vector dot-product operation, specifically I = V·G = Σ_i V_i·G_i, where V and G respectively represent the voltage vector applied on the word lines and the conductance vector formed by the ReRAM cells of that column; the current flowing on one bit line is the sum of the products of the voltages and the cell conductances, which exactly completes the analog dot-product (inner-product) operation of the voltage vector V and the ReRAM cell conductance vector G. Extending the dot-product operation of a single ReRAM bit line to the whole array, the currents of the multiple bit lines are the result of a matrix-vector multiplication, where the matrix is formed by the conductance values of the ReRAM cells in the array and the vector by the voltage values applied to the word lines. The ReRAM cells can also be organized as a memory array in the one-transistor-one-resistor (1T1R) or one-transistor-n-resistor (1TnR) form, which can likewise perform matrix-vector multiplication.
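The analog dot product that a bit line performs can be sketched in a few lines of Python; the voltages and conductances below are illustrative values, not ones from the patent:

```python
def bitline_current(voltages, conductances):
    """Current on one bit line: I = sum_i V_i * G_i (Kirchhoff's current law)."""
    return sum(v * g for v, g in zip(voltages, conductances))

def array_matvec(voltages, conductance_matrix):
    """Currents of all bit lines together form the matrix-vector product G^T * V."""
    return [bitline_current(voltages, column) for column in zip(*conductance_matrix)]

V = [1.0, 0.5, 2.0]          # word-line voltages (illustrative)
G = [[0.1, 0.2],             # G[i][j]: conductance of the cell at word line i, bit line j
     [0.3, 0.1],
     [0.2, 0.4]]
currents = array_matvec(V, G)  # one current per bit line
```

Each bit line thus computes one inner product in a single analog step; reading all bit lines at once yields the full matrix-vector product.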
The Naive Bayes (NB) algorithm is a widely used classification algorithm; given an instance to be classified x = (a_1, a_2, …, a_m), its goal is to find the class c with the maximum posterior probability, thereby completing the classification, where the instance x contains m attributes in total, A_i denotes the set of all possible values of the i-th (i = 1, 2, …, m) attribute of instance x, i.e. a_i ∈ A_i, and S denotes the set of all classes, i.e. c ∈ S. Naive Bayes assumes that the attributes are conditionally independent of each other; the prior probabilities P(c) and conditional probabilities P(a_i|c) are obtained by learning from a given training set in the training process, and then, after an instance x to be classified is input in the actual classification process, the class that maximizes the posterior probability is computed as follows:

c* = argmax_{c∈S} P(c)·∏_{i=1}^{m} P(a_i|c)
In the equation, the prior probability P(c) is the proportion of each class in the training set, the conditional probability P(a_i|c) is the proportion of each attribute value within each class, and argmax_c(E(c)) denotes the value of c that maximizes the expression E(c). The prior probabilities P(c) and conditional probabilities P(a_i|c) obtained by training constitute the classification model used to classify the instances to be classified. In order to alleviate the loss of model accuracy caused by the complete conditional-independence assumption on the attributes, existing NB algorithms assign different weights w_i to different attributes, hence the name Weighted Naive Bayes (WNB) algorithm; accordingly, the prediction formula of the WNB algorithm is:

c* = argmax_{c∈S} P(c)·∏_{i=1}^{m} P(a_i|c)^{w_i}
the NB algorithm can be seen as a special WNB algorithm, which is characterized by the weight w of each attributeiAre all 1.
At present, the naive Bayes algorithm is mostly implemented in software, and storage and computation are separated under the traditional von Neumann architecture; therefore, when the naive Bayes algorithm is used for classification, the memory is accessed frequently and data migrates frequently between the CPU and the memory, resulting in low solution speed, high delay and high power consumption.
Disclosure of Invention
Aiming at the defects and improvement requirements of the prior art, the invention provides a naive Bayes classification method and an engine based on a resistive random access memory array, and aims to reduce the delay and power consumption of the naive Bayes classification method.
To achieve the above object, according to a first aspect of the present invention, there is provided a naive bayes classification method based on a resistive random access memory array, comprising:
obtaining the prior probabilities P(c) and the conditional probabilities P(a_i|c) corresponding to the values of the attributes through the training process of the WNB algorithm, and using a monotonically decreasing function f to transform the prior probabilities P(c) and conditional probabilities P(a_i|c) into f(P(c)) and f(P(a_i|c)) respectively;
for each class c ∈ S, allocating one bit line in the resistive random access memory array, and on that bit line allocating one memory cell to the prior probability f(P(c)) and one memory cell to each conditional probability f(P(a_i|c)); setting the conductance of each memory cell so that it is related to the corresponding prior probability or conditional probability; across the different categories, the memory cells corresponding to the prior probabilities are located on the same word line, and the memory cells corresponding to the same attribute value are located on the same word line;
according to the values of the attributes in the instance to be classified x = (a_1, a_2, …, a_m), applying a voltage on the corresponding word lines, and simultaneously applying a voltage on the word line corresponding to the prior probability, so that after the voltages are applied the current of the memory cell corresponding to the prior probability is f(P(c)) and the current of the memory cell corresponding to each attribute value is w_i·f(P(a_i|c));
collecting the current on each bit line, obtaining the minimum current among them, and determining the category corresponding to the minimum current as the final classification result;
wherein w_i denotes the weight of the i-th attribute, a_i ∈ A_i denotes the value of the i-th attribute, A_i denotes the set of all values of the i-th attribute, i ∈ {1, 2, …, m}, and m denotes the total number of attributes; S denotes the set of all categories;
since the function f satisfies f(y·z) = f(y) + f(z), where y and z denote function variables, the continued product in the prediction formula of the WNB algorithm can be transformed by the function f into a dot product, thereby transforming the prediction formula into:

c* = argmin_{c∈S} [ f(P(c)) + Σ_{i=1}^{m} w_i·f(P(a_i|c)) ]

where argmin_c(E(c)) denotes the value of c that minimizes the expression E(c); because the function f is monotonically decreasing, the prediction formulas before and after the transformation are equivalent, that is, the classification results for the same instance to be classified are consistent before and after the transformation of the prediction formula.
The continued product in the prediction formula of the WNB algorithm can thus be converted into a dot product using a monotonically decreasing function while ensuring that the classification results for the same instance to be classified are consistent before and after the conversion; the dot product is then completed by a bit line in the resistive random access memory array, so that the naive Bayes algorithm can be completed within the resistive random access memory array.
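The equivalence claimed here can be checked numerically; the sketch below uses f(p) = -log(p) and hypothetical toy probabilities and weights:

```python
import math

f = lambda p: -math.log(p)   # monotonically decreasing, f(y*z) = f(y) + f(z)

prior = {'A': 0.6, 'B': 0.4}
cond = {'A': [{0: 0.8, 1: 0.2}, {0: 0.7, 1: 0.3}],
        'B': [{0: 0.3, 1: 0.7}, {0: 0.4, 1: 0.6}]}
w = [2.0, 0.5]               # hypothetical attribute weights
classes = ['A', 'B']

def by_product(x):
    """Original WNB rule: argmax of the weighted product."""
    return max(classes, key=lambda c: prior[c] * math.prod(
        cond[c][i][a] ** w[i] for i, a in enumerate(x)))

def by_f_sum(x):
    """Transformed rule: argmin of the f-transformed weighted sum."""
    return min(classes, key=lambda c: f(prior[c]) + sum(
        w[i] * f(cond[c][i][a]) for i, a in enumerate(x)))

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert by_product(x) == by_f_sum(x)   # identical classification either way
```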
Further, the function f is the negative logarithm function f(p) = −log_b(p) with base b greater than 1.
Further, after the conductances of the memory cells are set, the conductance of the memory cell corresponding to the prior probability is f(P(c)) and the conductance of the memory cell corresponding to each attribute value is f(P(a_i|c));

and when the voltages are applied, a unit voltage is applied on the word line corresponding to the prior probability, while the voltage applied on the word line corresponding to the i-th attribute value is the corresponding attribute weight w_i.
Since the current of a memory cell is the product of the voltage applied to its word line and its conductance, the above conductance setting and voltage application can ensure that the current of the memory cell corresponding to the prior probability is f(P(c)) and that the current of the memory cell corresponding to the specific value of each attribute is w_i·f(P(a_i|c)), i.e., the product of the weight corresponding to the attribute value and the transformed conditional probability.
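A numeric check of this scheme, with hypothetical probability and weight values:

```python
import math

f = lambda p: -math.log(p)               # the transform of the preferred embodiment

prior_p, cond_p, w_i = 0.6, 0.25, 2.0    # hypothetical P(c), P(a_i|c), weight
g_prior, g_attr = f(prior_p), f(cond_p)  # programmed conductances
i_prior = 1.0 * g_prior                  # prior cell: unit voltage, I = V * G
i_attr = w_i * g_attr                    # attribute cell: word-line voltage w_i
```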
Further, after the conductances of the memory cells are set, the conductance of the memory cell corresponding to the prior probability is f(P(c)) and the conductance of the memory cell corresponding to each attribute value is w_i·f(P(a_i|c));

and when the voltages are applied, a unit voltage is applied both on the word line corresponding to the prior probability and on the word lines corresponding to the attribute values.
Because the current of a memory cell is the product of the voltage applied to its word line and its conductance, this conductance setting and voltage application likewise ensure that the current of the memory cell corresponding to the prior probability is f(P(c)) and that the current of the memory cell corresponding to the specific value of each attribute is w_i·f(P(a_i|c)), i.e., the product of the attribute weight and the transformed conditional probability; moreover, due to the nonlinear resistance characteristic of the ReRAM cell, applying the same unit voltage to all memory cells can achieve better test accuracy than applying different voltages to the ReRAM cells.
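The two schemes produce the same cell currents, which the following sketch (hypothetical values) makes explicit; only where the weight w_i enters differs:

```python
import math

f = lambda p: -math.log(p)

cond_p, w_i = 0.25, 2.0                 # hypothetical P(a_i|c) and weight
i_scheme_a = w_i * f(cond_p)            # Fig. 3(a): V = w_i, G = f(P(a_i|c))
i_scheme_b = 1.0 * (w_i * f(cond_p))    # Fig. 3(b): V = 1,   G = w_i * f(P(a_i|c))
assert abs(i_scheme_a - i_scheme_b) < 1e-12
```

Scheme (b) folds the weight into the programmed conductance so that every word line carries the same unit voltage, which is what sidesteps the cell's nonlinear resistance.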
According to a second aspect of the present invention, there is provided a naive bayes engine based on a resistive memory array, comprising: the device comprises a training module, a conductance setting module, a voltage applying module and a minimum current detecting module;
the training module is used for obtaining the prior probabilities P(c) and the conditional probabilities P(a_i|c) corresponding to the values of the attributes through the training process of the WNB algorithm, and using a monotonically decreasing function f to transform the prior probabilities P(c) and conditional probabilities P(a_i|c) into f(P(c)) and f(P(a_i|c)) respectively;
the conductance setting module is used for allocating one bit line for each class c ∈ S in the resistive random access memory array, allocating on that bit line one memory cell to the prior probability f(P(c)) and one memory cell to each conditional probability f(P(a_i|c)), and setting the conductance of each memory cell so that it is related to the corresponding prior probability or conditional probability; across the different categories, the memory cells corresponding to the prior probabilities are located on the same word line, and the memory cells corresponding to the same attribute value are located on the same word line;
the voltage application module is used for applying, according to the values of the attributes in the instance to be classified x = (a_1, a_2, …, a_m), a voltage on the corresponding word lines, and simultaneously applying a voltage on the word line corresponding to the prior probability, so that the current of the memory cell corresponding to the prior probability is f(P(c)) and the current of the memory cell corresponding to each attribute value is w_i·f(P(a_i|c));
The minimum current detection module is used for collecting the current on each bit line, obtaining the minimum current in the current, and determining the category corresponding to the minimum current as a final classification result;
wherein w_i denotes the weight of the i-th attribute, a_i ∈ A_i denotes the value of the i-th attribute, A_i denotes the set of all values of the i-th attribute, i ∈ {1, 2, …, m}, and m denotes the total number of attributes; S denotes the set of all categories;
further, the minimum current detection module comprises a voltage controller, a one-hot code unit, a plurality of operational amplifiers and a plurality of voltage comparators;
the voltage controller is used for generating a reference voltage V_ref;
the inverting input of each operational amplifier is connected to one bit line of the resistive random access memory array, and the non-inverting input is held at 0 V; the operational amplifier converts the current on the corresponding bit line into a bit line voltage, thereby realizing the acquisition of the bit line current;
the inverting input of each voltage comparator is connected to the output of one operational amplifier, and the non-inverting input of each voltage comparator is connected to the output of the voltage controller; the voltage comparator compares the reference voltage V_ref with the corresponding bit line voltage and outputs the comparison result;
the input of the one-hot code unit is connected to the outputs of all the voltage comparators; the one-hot code unit detects the comparison results of all the voltage comparators and, when the comparison results form a one-hot code, identifies the minimum bit line voltage and hence the minimum bit line current, and determines the category corresponding to the minimum current as the final classification result to be output;
the voltage controller is also used for adjusting the reference voltage V_ref when the comparison results of all the voltage comparators do not form a one-hot code.
In a traditional engine for performing calculation based on the resistive random access memory, a device for performing analog-to-digital conversion is required, so that the chip area is large. The minimum current detection module only comprises basic devices such as an operational amplifier, a voltage comparator, a one-hot code unit and a voltage controller, and can detect the minimum current in all currents at the same time, so that the resistive memory array-based naive Bayes engine provided by the invention has the advantages of small chip area and high calculation speed.
Further, the voltage controller adjusts the reference voltage when the comparison results of all the voltage comparators do not form the one-hot code, and the method comprises the following steps:
(S1) initializing the reference voltage V_ref to a preset minimum voltage V_lowest;

(S2) detecting whether the comparator outputs form a one-hot code at the one-hot code unit; if so, judging that the classification succeeds and ending the operation; otherwise, going to step (S3);

(S3) if V_ref ≤ V_highest, increasing the reference voltage V_ref by a preset increment ΔV and going to step (S2); otherwise, judging that the classification fails and ending the operation;

wherein V_highest is a preset maximum voltage.
In the voltage comparator, when the reference voltage V_ref is greater than the voltage obtained by the current conversion, the voltage comparator outputs a high level; otherwise it outputs a low level. This method of adjusting the reference voltage is a stepwise-increase method: by raising the reference voltage V_ref step by step within the range [V_lowest, V_highest], when V_ref becomes greater than exactly one bit line voltage, the comparison results of all the voltage comparators form a one-hot code, and finally the corresponding category can be output through the one-hot code unit.
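The stepwise search can be sketched as follows; the voltage values and step size are hypothetical:

```python
def find_min_by_step(bitline_voltages, v_lowest, v_highest, dv):
    """Raise V_ref from v_lowest until exactly one comparator fires
    (one bit line voltage below V_ref), i.e. the outputs form a one-hot code."""
    v_ref = v_lowest
    while v_ref <= v_highest:
        fired = [v < v_ref for v in bitline_voltages]
        if sum(fired) == 1:
            return fired.index(True)   # index of the minimum bit line voltage
        v_ref += dv
    return None                        # no one-hot code found: classification fails
```

The first bit line voltage that V_ref crosses on the way up is necessarily the minimum, which is why a one-hot pattern identifies the winning class.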
Further, the voltage controller adjusts the reference voltage when the comparison results of all the voltage comparators do not form the one-hot code, and the method comprises the following steps:
(T1) initializing the upper bound of the value range of the reference voltage V_ref to V_1 = V_highest, and initializing the lower bound of the value range of the reference voltage V_ref to V_2 = V_lowest;

(T2) setting the reference voltage to V_ref = (V_1 + V_2)/2, and detecting whether the comparator outputs form a one-hot code at the one-hot code unit; if so, judging that the classification succeeds and ending the operation; otherwise, going to step (T3);

(T3) if the number of bit line voltages less than the reference voltage V_ref is 0, updating the lower bound of the value range of the reference voltage to V_2 = V_ref and then going to step (T2); if the number of bit line voltages less than the reference voltage V_ref is greater than 1, updating the upper bound of the value range of the reference voltage to V_1 = V_ref and then going to step (T2);

wherein V_highest and V_lowest are respectively the preset maximum voltage and the preset minimum voltage.
This method of adjusting the reference voltage is a bisection method; each adjustment halves the variation range of the reference voltage, so the reference voltage can approach the target voltage value more quickly, thereby accelerating the classification process.
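A sketch of the bisection search under the same hypothetical setup:

```python
def find_min_by_bisection(bitline_voltages, v_lowest, v_highest, max_iters=64):
    """Halve the V_ref range until exactly one bit line voltage lies below V_ref."""
    lo, hi = v_lowest, v_highest
    for _ in range(max_iters):
        v_ref = (lo + hi) / 2
        below = [v < v_ref for v in bitline_voltages]
        n = sum(below)
        if n == 1:
            return below.index(True)   # the minimum bit line voltage is identified
        if n == 0:
            lo = v_ref                 # V_ref below every bit line voltage: raise lower bound
        else:
            hi = v_ref                 # V_ref above two or more: lower upper bound
    return None
```

Compared with the linear number of steps of the stepwise method, the interval shrinks geometrically, so the number of comparisons is logarithmic in the required voltage resolution.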
According to a third aspect of the present invention, there is provided a classification system including the naive bayes engine based on a resistive memory array provided in the second aspect of the present invention, the system further including a resistive memory array;
the conductance setting module is connected with the word lines and bit lines of the resistive random access memory array; the voltage application module is connected with the word lines of the resistive random access memory array; and the minimum current detection module is connected with the bit lines of the resistive random access memory array.
In the classification system provided by the invention, the resistive random access memory array serves both as the memory device and as the core operation device, achieving storage and calculation fusion, so that the naive Bayes algorithm can be completed with low delay, low energy consumption and low chip-area overhead.
Generally, by the above technical solution conceived by the present invention, the following beneficial effects can be obtained:
(1) In the naive Bayes classification method, engine and system based on the resistive random access memory array provided by the invention, the continued product in the prediction formula of the WNB algorithm is transformed into a dot product using a monotonically decreasing function while ensuring that the classification results for the same instance to be classified are consistent before and after the transformation, and the dot product is then completed by the bit lines of the resistive random access memory array, so that the naive Bayes algorithm can be completed within the resistive random access memory array and the delay and power consumption of the naive Bayes classification method can be reduced.
(2) In the preferred scheme of the naive Bayes classification method, engine and system based on the resistive random access memory array, by setting the conductance of the memory cell corresponding to the prior probability to f(P(c)), setting the conductance of the memory cell corresponding to the specific value of each attribute to the product w_i·f(P(a_i|c)) of the corresponding weight w_i and the transformed conditional probability, and applying the same unit voltage to all memory cells, it is ensured that the current of the memory cell corresponding to the prior probability is f(P(c)) and the current of the memory cell corresponding to the specific value of each attribute is w_i·f(P(a_i|c)), while better test accuracy can be obtained.
(3) According to the naive Bayes engine based on the resistive random access memory array, the minimum current detection module only comprises basic devices such as an operational amplifier, a voltage comparator, a one-hot code unit and a voltage controller, the chip area is small, and the calculation speed is high.
(4) In the preferred scheme of the naive Bayes engine based on the resistive random access memory array, the voltage controller adjusts the reference voltage by a bisection method, halving the variation range of the reference voltage at each adjustment, so the reference voltage can approach the target voltage value more quickly, thereby accelerating the classification process.
Drawings
Fig. 1(a) is a schematic diagram of a conductive wire model of a conventional resistive random access memory cell;
fig. 1(b) is a schematic diagram of a current variation with pulse number in a conventional resistive random access memory cell;
fig. 1(c) is a schematic diagram of a cross-point array of the resistive memory cells and a matrix vector calculation thereof;
fig. 2 is a schematic diagram of a naive bayes classification method based on a resistive random access memory array according to an embodiment of the present invention;
FIG. 3(a) is a schematic diagram of conductance setting and voltage application according to an embodiment of the present invention;
FIG. 3(b) is a schematic diagram of another conductance setting and voltage application method provided by the embodiment of the present invention;
FIG. 4 is a schematic diagram of a minimum current detection module according to an embodiment of the present invention;
fig. 5 is a schematic diagram illustrating a method for adjusting a reference voltage by a voltage controller according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
In the present application, the terms "first," "second," and the like (if any) in the description and the drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Before explaining the technical scheme of the invention in detail, the naive Bayes algorithm is briefly introduced as follows:
The Naive Bayes (NB) algorithm is a widely used classification algorithm; given an instance to be classified x = (a_1, a_2, …, a_m), its goal is to find the class c with the maximum posterior probability, thereby completing the classification. Naive Bayes assumes that the attributes are conditionally independent of each other; the prior probabilities and conditional probabilities are obtained by learning from a given training set in the training process, and then, after an instance to be classified x = (a_1, a_2, …, a_m) is input in the actual classification process, the class that maximizes the posterior probability is determined. That is, the goal of the NB algorithm is to find, given x, the class c with the maximum posterior probability:

(1) c* = argmax_{c∈S} P(c|x)
where S is the set of all categories and c* is the class the NB algorithm finally predicts for instance x; argmax_c(E(c)) denotes the value of c that maximizes the expression E(c). We use A_i to denote the set of all possible values of the i-th attribute of instance x, i.e. a_i ∈ A_i. By the Bayes formula P(A|B) = P(B|A)·P(A)/P(B) and the assumption that the attributes are completely conditionally independent within a class, P(c|x) can be rewritten as:

(2) P(c|x) = P(c)·∏_{i=1}^{m} P(a_i|c) / P(x)
substituting (2) into NB algorithm equation (1) while removing the same molecules, one can get:
as known from the prediction formula, before the NB algorithm is implemented, the prior probability, namely the ratio of each category in the training set, needs to be obtainedAnd conditional probability, i.e. the ratio of the individual attribute values in each classThe training process of naive Bayes is a process of calculating prior probability and conditional probability after collecting training set data. This process is simply a statistic on the training data set. And classifying the target to be classified according to the model obtained in the training process.
In order to alleviate the loss of model accuracy caused by the complete conditional-independence assumption on the attributes, existing NB algorithms assign different weights w_i to different attributes, hence the name Weighted Naive Bayes (WNB) algorithm. The final prediction formula of the algorithm is:

(4) c* = argmax_{c∈S} P(c)·∏_{i=1}^{m} P(a_i|c)^{w_i}
the NB algorithm is known as WNB algorithm at wiAll are the special cases under 1.
In order to solve the problems of high delay and high power consumption of the existing WNB algorithm due to software-based implementation, the naive Bayes classification method based on the resistive random access memory array provided by the invention comprises the following steps as shown in FIG. 2:
obtaining the prior probabilities P(c) and the conditional probabilities P(a_i|c) corresponding to the values of the attributes through the training process of the WNB algorithm, and using a monotonically decreasing function f to transform the prior probabilities P(c) and conditional probabilities P(a_i|c) into f(P(c)) and f(P(a_i|c)) respectively; wherein a_i ∈ A_i denotes the value of the i-th attribute, A_i denotes the set of all values of the i-th attribute, i ∈ {1, 2, …, m}, m denotes the total number of attributes, S denotes the set of all categories, and c ∈ S denotes a specific category;
for each class c epsilon S, a bit line is allocated in the resistive random access memory array and is a priori probability on the bit lineAnd each conditional probabilityRespectively allocating a storage unit; setting the conductance of the memory cell such that the conductance of the memory cell is related to the corresponding prior probability or conditional probability, the specific way in which the conductance is setAs shown in fig. 1, the resistance of the memory cell can be changed by applying forward and reverse pulses to the cell, i.e. by changing the length of the conductive filament; between different categories, the storage units corresponding to the prior probabilities are located on the same word line, and the storage units corresponding to the same attribute values are located on the same word line; with Size (A)i) And (3) representing the total number of all possible values of the ith attribute, for the ith attribute, allocating a storage unit to the conditional probability corresponding to each attribute value, and allocating Size (A) to the ith attribute in totali) A total number of memory cells allocated on the bit line for a particular class c
according to the values of the attributes in the instance to be classified x = (a_1, a_2, …, a_m), applying a voltage on the corresponding word lines, and simultaneously applying a voltage on the word line corresponding to the prior probability, so that the current of the memory cell corresponding to the prior probability is f(P(c)) and the current of the memory cell corresponding to each attribute value is w_i·f(P(a_i|c)); thus, after the voltages are applied, the current on the bit line corresponding to a specific class c is f(P(c)) + Σ_{i=1}^{m} w_i·f(P(a_i|c)); as shown in fig. 2, the array stores the conditional probabilities of all possible values of each attribute in its cells, and the multiple bit lines of the memory array can simultaneously complete the computation for all the categories in the set S;
collecting the current on each bit line, obtaining the minimum current among them, and determining the category corresponding to the minimum current as the final classification result;
wherein the function f is monotonically decreasing and satisfies f(x · y) = f(x) + f(y), and hence f(x^w) = w · f(x); as an alternative implementation, in this embodiment the function f is the negative logarithm function with a base greater than 1.
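The four steps above can be simulated in software. The sketch below (Python) uses a hypothetical two-category, two-attribute model — all probability values, attribute names and weights are invented for illustration — and maps transformed probabilities to cell currents following the variant in which the attribute weight is applied as the word line voltage (the Fig. 3(a) scheme described below), then picks the category with the minimum bit line current:

```python
import math

# Hypothetical trained WNB model: 2 categories, 2 attributes (invented values).
priors = {"c0": 0.6, "c1": 0.4}                      # P(c)
cond = {                                             # P(a_i | c)
    "c0": [{"low": 0.7, "high": 0.3}, {"red": 0.2, "blue": 0.8}],
    "c1": [{"low": 0.4, "high": 0.6}, {"red": 0.5, "blue": 0.5}],
}
weights = [1.0, 0.5]                                 # attribute weights w_i

f = lambda p: -math.log(p)                           # monotonically decreasing f

def classify(x):
    # One bit line per category; each cell current = word line voltage * conductance.
    currents = {}
    for c in priors:
        i_bit = 1.0 * f(priors[c])                   # unit voltage on the prior word line
        for i, a in enumerate(x):
            # Fig. 3(a) scheme: conductance = f(P(a_i|c)), word line voltage = w_i.
            i_bit += weights[i] * f(cond[c][i][a])
        currents[c] = i_bit
    return min(currents, key=currents.get)           # category with minimum bit line current

print(classify(("low", "blue")))                     # → c0
```

Here `classify`, `priors`, `cond` and `weights` are illustrative names only; in the patented engine the per-cell products and their sum are produced physically by Ohm's law and Kirchhoff's current law on the bit line.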
Therefore, the continued product in the prediction formula of the WNB algorithm can be transformed by the function f into a dot-product (weighted-sum) operation, thereby transforming the prediction formula c* = argmax_{c ∈ S} P(c) · Π_{i=1}^{m} P(a_i|c)^{w_i} (4) into c* = argmin_{c ∈ S} [P'(c) + Σ_{i=1}^{m} w_i · P'(a_i|c)] (5);
wherein argmin_c E(c) represents the value of c at which the expression E(c) is minimized;
and because the function f is monotonically decreasing, the prediction formulas (4) and (5) before and after the transformation are equivalent; that is, the classification result of the same example to be classified is the same before and after the prediction formula is transformed.
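As a sanity check of this equivalence, a short sketch (Python, with randomly generated hypothetical models) verifies that the argmax of the continued product and the argmin of the transformed weighted sum select the same category when f is a negative logarithm:

```python
import math
import random

random.seed(0)
f = lambda p: -math.log(p, 2)          # monotonically decreasing, base > 1

for _ in range(1000):
    # Random hypothetical WNB model: 3 categories, 4 attribute likelihoods each.
    prior = [random.uniform(0.01, 1.0) for _ in range(3)]
    like = [[random.uniform(0.01, 1.0) for _ in range(4)] for _ in range(3)]
    w = [random.uniform(0.1, 2.0) for _ in range(4)]

    # Formula (4): continued product, maximized.
    prod = [prior[c] * math.prod(like[c][i] ** w[i] for i in range(4))
            for c in range(3)]
    # Formula (5): weighted sum of f-values, minimized.
    sums = [f(prior[c]) + sum(w[i] * f(like[c][i]) for i in range(4))
            for c in range(3)]
    assert prod.index(max(prod)) == sums.index(min(sums))
print("formulas (4) and (5) agree on 1000 random models")
```

The model sizes and value ranges are arbitrary; the agreement follows from f being strictly decreasing and turning products and powers into sums and scalings.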
According to the naive Bayes classification method based on the resistive random access memory array, the monotonically decreasing function converts the continued product in the prediction formula of the WNB algorithm into a dot-product operation while keeping the classification result of any example to be classified unchanged; the dot-product operation is then completed on a bit line of the resistive random access memory array, so that the naive Bayes algorithm can be executed entirely inside the array.
In an alternative embodiment, as shown in Fig. 3(a), in the naive Bayes classification method based on the resistive random access memory array, after the conductances of the memory cells are set, the conductance of the memory cell corresponding to the prior probability is P'(c), and the conductance of the memory cell corresponding to each attribute value is P'(a_i|c).
When the voltages are applied, a unit voltage is applied on the word line corresponding to the prior probability, and the voltage applied on the word line corresponding to each attribute value is the corresponding attribute weight w_i;
Because the current of a memory cell is the product of the voltage applied on its word line and its conductance, this conductance-setting and voltage-application scheme ensures that the current of the memory cell corresponding to the prior probability is 1 · P'(c) = P'(c), and that the current of the memory cell corresponding to each specific attribute value is w_i · P'(a_i|c), i.e., the product of the weight corresponding to the attribute value and the transformed conditional probability.
In an alternative embodiment, as shown in Fig. 3(b), in the naive Bayes classification method based on the resistive random access memory array, after the conductances of the memory cells are set, the conductance of the memory cell corresponding to the prior probability is P'(c), and the conductance of the memory cell corresponding to each attribute value is w_i · P'(a_i|c).
When the voltages are applied, a unit voltage is applied both on the word line corresponding to the prior probability and on the word lines corresponding to the attribute values;
With the above conductance-setting and voltage-application scheme, the current of the memory cell corresponding to the prior probability is 1 · P'(c) = P'(c), and the current of the memory cell corresponding to each specific attribute value is 1 · (w_i · P'(a_i|c)) = w_i · P'(a_i|c); moreover, owing to the nonlinear resistance characteristics of ReRAM cells, applying the same unit voltage to every memory cell can yield better test accuracy than applying different voltages to different ReRAM cells.
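The two mapping schemes can be checked against each other in software: under either scheme the cell current (word line voltage times conductance) is identical, as this sketch with hypothetical probability and weight values shows:

```python
import math

f = lambda p: -math.log(p)            # the monotonically decreasing transform
p_prior, p_cond, w = 0.3, 0.45, 0.8   # hypothetical probabilities and weight

# Fig. 3(a) scheme: conductance stores f(P); the weight is applied as a voltage.
i_prior_a = 1.0 * f(p_prior)          # unit voltage on the prior word line
i_cond_a = w * f(p_cond)              # voltage w_i on the attribute word line

# Fig. 3(b) scheme: weight folded into the conductance; unit voltage everywhere.
i_prior_b = 1.0 * f(p_prior)
i_cond_b = 1.0 * (w * f(p_cond))

assert math.isclose(i_prior_a, i_prior_b) and math.isclose(i_cond_a, i_cond_b)
print(i_prior_a + i_cond_a)           # this cell pair's contribution to the bit line current
```

Mathematically the schemes are interchangeable; the patent's stated reason for preferring Fig. 3(b) is a device-level one (nonlinearity of the ReRAM cells), which this idealized arithmetic sketch does not model.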
Corresponding to the naive Bayes classification method based on the resistive random access memory array, the invention also provides a naive Bayes engine based on the resistive random access memory array, which comprises: a training module, a conductance setting module, a voltage application module and a minimum current detection module;
a training module, for obtaining the prior probability P(c) and the conditional probabilities P(a_i|c) corresponding to the attribute values through the training process of the WNB algorithm, and using a monotonically decreasing function f to transform the prior probability P(c) and the conditional probabilities P(a_i|c) into the transformed prior probability P'(c) = f(P(c)) and transformed conditional probabilities P'(a_i|c) = f(P(a_i|c)), respectively;
a conductance setting module, for allocating a bit line for each category c ∈ S in the resistive random access memory array, and on that bit line allocating one memory cell for the prior probability P'(c) and one memory cell for each conditional probability P'(a_i|c); setting the conductance of each memory cell so that the conductance is related to the corresponding prior probability or conditional probability; across the different categories, the memory cells corresponding to the prior probabilities are located on the same word line, and the memory cells corresponding to the same attribute value are located on the same word line;
a voltage application module, for applying voltages on the word lines corresponding to the attribute values of the example to be classified x = (a_1, a_2, …, a_m), and simultaneously applying a voltage on the word line corresponding to the prior probability, so that after the voltages are applied the current of the memory cell corresponding to the prior probability is P'(c) and the current of the memory cell corresponding to each attribute value is w_i · P'(a_i|c);
and a minimum current detection module, for collecting the current on each bit line, obtaining the minimum current among them, and determining the category corresponding to the minimum current as the final classification result;
wherein w_i represents the weight of the i-th attribute, a_i ∈ A_i represents the value of the i-th attribute, A_i represents the set of all values of the i-th attribute, i ∈ {1, 2, …, m}, m represents the total number of attributes, and S represents the set of all categories;
In the embodiment of the present invention, the specific implementations of the training module, the conductance setting module and the voltage application module may refer to the description of the above method embodiment and are not repeated here.
In an alternative embodiment, as shown in fig. 4, the minimum current detection module includes a voltage controller, a one-hot code unit, a plurality of operational amplifiers, and a plurality of voltage comparators;
the voltage controller is used for generating a reference voltage Vref;
the inverting input of each operational amplifier is connected to one bit line of the resistive random access memory array, and the non-inverting input is held at 0 V; the operational amplifier is used for converting the current on the corresponding bit line into a bit line voltage, thereby collecting the bit line current;
the inverting input of each voltage comparator is connected to the output of one operational amplifier, and the non-inverting input of each voltage comparator is connected to the output of the voltage controller; the voltage comparator is used for comparing the reference voltage Vref with the corresponding bit line voltage and outputting the comparison result;
the input of the one-hot code unit is connected to the outputs of all the voltage comparators; the one-hot code unit is used for detecting the comparison results of all the voltage comparators and, when these results form a one-hot code, identifying the minimum bit line voltage and thereby the minimum bit line current, and outputting the category corresponding to the minimum current as the final classification result;
the voltage controller is further used for adjusting the reference voltage Vref when the comparison results of all the voltage comparators do not form a one-hot code;
In a traditional engine that performs computation based on resistive random access memory, devices for analog-to-digital conversion are required, so the chip area is large; in the naive Bayes engine based on the resistive random access memory array, the minimum current detection module contains only basic devices, namely the operational amplifiers, the voltage comparators, the one-hot code unit and the voltage controller, and can detect the minimum among all the currents at the same time, so the chip area is small and the computation speed is high.
In an alternative embodiment, when the comparison results of all the voltage comparators do not form a one-hot code, the method by which the voltage controller adjusts the reference voltage is a stepwise increasing method, as shown in Fig. 5, which specifically includes:
(S1) initializing the reference voltage Vref to a preset minimum voltage Vlowest;
(S2) detecting whether the one-hot code is formed in the one-hot code unit; if so, judging that the classification succeeds and ending the operation; otherwise, going to step (S3);
(S3) if Vref ≤ Vhighest, increasing the reference voltage Vref by a preset increment ΔV and going to step (S2); otherwise, judging that the classification fails and ending the operation;
wherein Vhighest is a preset maximum voltage;
in the voltage comparator, when the voltage obtained by converting the current is greater than the reference voltage Vref, the voltage comparator outputs a high level; otherwise it outputs a low level. The stepwise increasing method gradually raises the reference voltage Vref within the range [Vlowest, Vhighest]; once Vref exceeds the voltage of exactly one bit line, namely the smallest one, the comparison results of all the voltage comparators form the one-hot code, and the corresponding category can finally be output through the one-hot code unit. The maximum voltage Vhighest and the minimum voltage Vlowest can be set to the voltages obtained when the conductance of a memory cell is highest and lowest, respectively, and the increment ΔV of the reference voltage Vref can be determined according to the minimum precision achievable by the resistive random access memory.
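The stepwise search can be modeled in software. In this sketch (Python, hypothetical bit line voltages), the comparator outputs follow the rule stated above — high while the bit line voltage exceeds Vref — and the one-hot condition is taken to mean that exactly one comparator has flipped, identifying the minimum:

```python
def find_min_bitline(bitline_voltages, v_lowest=0.0, v_highest=1.0, dv=0.001):
    """Stepwise search: index of the minimum bit line voltage, or None on failure."""
    v_ref = v_lowest                                   # (S1) start at the preset minimum
    while v_ref <= v_highest:                          # (S3) stay within the preset range
        # Comparator model from the text: high while bit line voltage > V_ref.
        outputs = [v > v_ref for v in bitline_voltages]
        if outputs.count(False) == 1:                  # (S2) one-hot pattern formed
            return outputs.index(False)                # the single flipped comparator
        v_ref += dv                                    # (S3) raise V_ref by the increment dV
    return None                                        # classification failed

print(find_min_bitline([0.42, 0.17, 0.66]))            # → 1
```

The voltages, bounds and step size are illustrative; in hardware the same loop is realized by the voltage controller re-issuing Vref until the one-hot code unit signals success.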
In an alternative embodiment, when the comparison results of all the voltage comparators do not form a one-hot code, the method by which the voltage controller adjusts the reference voltage is a bisection method, as shown in Fig. 5, which specifically includes:
(T1) initializing the upper bound of the value range of the reference voltage Vref to V1 = Vhighest, and initializing the lower bound of the value range of the reference voltage Vref to V2 = Vlowest;
(T2) setting the reference voltage to Vref = (V1 + V2)/2, and detecting whether the one-hot code is formed in the one-hot code unit; if so, judging that the classification succeeds and ending the operation; otherwise, going to step (T3);
(T3) if the number of bit line voltages smaller than the reference voltage Vref is 0, updating the lower bound of the value range to V2 = Vref and then going to step (T2); if the number of bit line voltages smaller than the reference voltage Vref is greater than 1, updating the upper bound of the value range to V1 = Vref and then going to step (T2);
wherein Vhighest and Vlowest are the preset maximum voltage and minimum voltage, respectively, and are set to the voltages obtained when the conductance of a memory cell is highest and lowest, respectively;
each step of the bisection method halves the variation range of the reference voltage, so the reference voltage approaches the target voltage value more quickly, which accelerates the classification process.
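A software sketch of the bisection strategy (Python, hypothetical bit line voltages): the count of bit line voltages below Vref drives the bound updates exactly as in steps (T1) to (T3), and the single voltage below Vref identifies the minimum:

```python
def find_min_bitline_bisect(bitline_voltages, v_lowest=0.0, v_highest=1.0,
                            tol=1e-9, max_iter=100):
    """Bisection search for a V_ref with exactly one bit line voltage below it."""
    v1, v2 = v_highest, v_lowest              # (T1) upper and lower bounds
    for _ in range(max_iter):
        v_ref = (v1 + v2) / 2                 # (T2) midpoint as the new reference
        below = [i for i, v in enumerate(bitline_voltages) if v < v_ref]
        if len(below) == 1:                   # one-hot formed: minimum identified
            return below[0]
        if not below:
            v2 = v_ref                        # (T3) V_ref too low: raise the lower bound
        else:
            v1 = v_ref                        # (T3) V_ref too high: lower the upper bound
        if v1 - v2 < tol:                     # bounds collapsed (e.g. tied minima)
            return None
    return None

print(find_min_bitline_bisect([0.42, 0.17, 0.66]))   # → 1
```

The tolerance and iteration cap are illustrative safeguards not spelled out in the text; they handle the case where no Vref separates exactly one bit line, such as two bit lines tied at the minimum.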
The invention also provides a classification system comprising the above naive Bayes engine based on the resistive random access memory array, the system further comprising a resistive random access memory array;
the conductance setting module is connected with the word lines and bit lines of the resistive random access memory array; the voltage application module is connected with the word lines of the resistive random access memory array; and the minimum current detection module is connected with the bit lines of the resistive random access memory array;
According to the classification system provided by the invention, the resistive random access memory array serves both as the storage device and as the core computing device, achieving the integration of storage and computation, so that the naive Bayes algorithm can be completed with low latency, low energy consumption and low chip-area overhead.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (9)
1. A naive Bayes classification method based on a resistive random access memory array is characterized by comprising the following steps:
obtaining a prior probability P(c) and the conditional probability P(a_i|c) corresponding to each attribute value through the training process of a WNB algorithm, and using a monotonically decreasing function f to transform the prior probability P(c) and the conditional probabilities P(a_i|c) into a transformed prior probability P'(c) = f(P(c)) and transformed conditional probabilities P'(a_i|c) = f(P(a_i|c)), respectively;
for each category c ∈ S, allocating a bit line in the resistive random access memory array, and on that bit line allocating one memory cell for the transformed prior probability P'(c) and one memory cell for each transformed conditional probability P'(a_i|c); setting the conductance of each memory cell so that the conductance is related to the corresponding prior probability or conditional probability; across the different categories, the memory cells corresponding to the prior probabilities are located on the same word line, and the memory cells corresponding to the same attribute value are located on the same word line;
applying voltages on the word lines corresponding to the values of the attributes of the example to be classified x = (a_1, a_2, …, a_m), and simultaneously applying a voltage on the word line corresponding to the prior probability, so that after the voltages are applied the current of the memory cell corresponding to the prior probability is P'(c) and the current of the memory cell corresponding to each attribute value is w_i · P'(a_i|c);
collecting the current on each bit line, obtaining the minimum current among them, and determining the category corresponding to the minimum current as the final classification result;
wherein w_i represents the weight of the i-th attribute, a_i ∈ A_i represents the value of the i-th attribute, A_i represents the set of all values of the i-th attribute, i ∈ {1, 2, …, m}, m represents the total number of attributes, and S represents the set of all categories;
2. The naive Bayes classification method based on the resistive random access memory array according to claim 1, wherein the function f is a negative logarithm function with a base greater than 1.
3. The naive Bayes classification method based on the resistive random access memory array according to claim 1 or 2, wherein after the conductances of the memory cells are set, the conductance of the memory cell corresponding to the prior probability is P'(c), and the conductance of the memory cell corresponding to each attribute value is P'(a_i|c);
and when the voltages are applied, a unit voltage is applied on the word line corresponding to the prior probability, and the voltage applied on the word line corresponding to each attribute value is the corresponding attribute weight w_i.
4. The naive Bayes classification method based on the resistive random access memory array according to claim 1 or 2, wherein after the conductances of the memory cells are set, the conductance of the memory cell corresponding to the prior probability is P'(c), and the conductance of the memory cell corresponding to each attribute value is w_i · P'(a_i|c);
and when the voltages are applied, a unit voltage is applied both on the word line corresponding to the prior probability and on the word lines corresponding to the attribute values.
5. A naive Bayes engine based on a resistive random access memory array, comprising: a training module, a conductance setting module, a voltage application module and a minimum current detection module;
the training module is used for obtaining the prior probability P(c) and the conditional probabilities P(a_i|c) corresponding to the attribute values through the training process of the WNB algorithm, and using a monotonically decreasing function f to transform the prior probability P(c) and the conditional probabilities P(a_i|c) into the transformed prior probability P'(c) = f(P(c)) and transformed conditional probabilities P'(a_i|c) = f(P(a_i|c)), respectively;
the conductance setting module is used for allocating a bit line for each category c ∈ S in the resistive random access memory array, and on that bit line allocating one memory cell for the prior probability P'(c) and one memory cell for each conditional probability P'(a_i|c); setting the conductance of each memory cell so that the conductance is related to the corresponding prior probability or conditional probability; across the different categories, the memory cells corresponding to the prior probabilities are located on the same word line, and the memory cells corresponding to the same attribute value are located on the same word line;
the voltage application module is used for applying voltages on the word lines corresponding to the attribute values of the example to be classified x = (a_1, a_2, …, a_m), and simultaneously applying a voltage on the word line corresponding to the prior probability, so that after the voltages are applied the current of the memory cell corresponding to the prior probability is P'(c) and the current of the memory cell corresponding to each attribute value is w_i · P'(a_i|c);
the minimum current detection module is used for collecting the current on each bit line, obtaining the minimum current among them, and determining the category corresponding to the minimum current as the final classification result;
wherein w_i represents the weight of the i-th attribute, a_i ∈ A_i represents the value of the i-th attribute, A_i represents the set of all values of the i-th attribute, i ∈ {1, 2, …, m}, m represents the total number of attributes, and S represents the set of all categories;
6. the resistive-switching-memory-array-based naive bayes engine of claim 5, wherein the minimum current detection module comprises a voltage controller, a one-hot unit, a plurality of operational amplifiers, and a plurality of voltage comparators;
the voltage controller is used for generating a reference voltage Vref;
the inverting input of each operational amplifier is connected to one bit line of the resistive random access memory array, and the non-inverting input is held at 0 V; the operational amplifier is used for converting the current on the corresponding bit line into a bit line voltage, thereby collecting the bit line current;
the inverting input of each voltage comparator is connected to the output of one operational amplifier, and the non-inverting input of each voltage comparator is connected to the output of the voltage controller; the voltage comparator is used for comparing the reference voltage Vref with the corresponding bit line voltage and outputting the comparison result;
the input of the one-hot code unit is connected to the outputs of all the voltage comparators; the one-hot code unit is used for detecting the comparison results of all the voltage comparators and, when these results form a one-hot code, identifying the minimum bit line voltage and thereby the minimum bit line current, and outputting the category corresponding to the minimum current as the final classification result;
the voltage controller is further used for adjusting the reference voltage Vref when the comparison results of all the voltage comparators do not form the one-hot code.
7. The naive Bayes engine based on the resistive random access memory array according to claim 6, wherein when the comparison results of all the voltage comparators do not form the one-hot code, the method by which the voltage controller adjusts the reference voltage comprises:
(S1) initializing the reference voltage Vref to a preset minimum voltage Vlowest;
(S2) detecting whether the one-hot code is formed in the one-hot code unit; if so, judging that the classification succeeds and ending the operation; otherwise, going to step (S3);
(S3) if Vref ≤ Vhighest, increasing the reference voltage Vref by a preset increment ΔV and going to step (S2); otherwise, judging that the classification fails and ending the operation;
wherein Vhighest is a preset maximum voltage.
8. The naive Bayes engine based on the resistive random access memory array according to claim 6, wherein when the comparison results of all the voltage comparators do not form the one-hot code, the method by which the voltage controller adjusts the reference voltage comprises:
(T1) initializing the upper bound of the value range of the reference voltage Vref to V1 = Vhighest, and initializing the lower bound of the value range of the reference voltage Vref to V2 = Vlowest;
(T2) setting the reference voltage to Vref = (V1 + V2)/2, and detecting whether the one-hot code is formed in the one-hot code unit; if so, judging that the classification succeeds and ending the operation; otherwise, going to step (T3);
(T3) if the number of bit line voltages smaller than the reference voltage Vref is 0, updating the lower bound of the value range to V2 = Vref and then going to step (T2); if the number of bit line voltages smaller than the reference voltage Vref is greater than 1, updating the upper bound of the value range to V1 = Vref and then going to step (T2);
wherein Vhighest and Vlowest are the preset maximum voltage and minimum voltage, respectively.
9. A classification system comprising the naive bayes engine based on a resistive switching memory array according to any of claims 5-8, further comprising a resistive switching memory array;
the conductance setting module is connected with a word line and a bit line of the resistive random access memory array; the voltage applying module is connected with a word line of the resistive random access memory array; the minimum current detection module is connected with a bit line of the resistive random access memory array.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911100579.8A CN110993004B (en) | 2019-11-12 | 2019-11-12 | Naive Bayes classification method, engine and system based on resistive random access memory array |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911100579.8A CN110993004B (en) | 2019-11-12 | 2019-11-12 | Naive Bayes classification method, engine and system based on resistive random access memory array |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110993004A CN110993004A (en) | 2020-04-10 |
CN110993004B true CN110993004B (en) | 2021-08-20 |
Family
ID=70083896
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911100579.8A Active CN110993004B (en) | 2019-11-12 | 2019-11-12 | Naive Bayes classification method, engine and system based on resistive random access memory array |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110993004B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112241922B (en) * | 2020-09-07 | 2024-03-05 | 国网浙江省电力有限公司经济技术研究院 | Power grid asset comprehensive value assessment method based on improved naive Bayesian classification |
CN113191402B (en) * | 2021-04-14 | 2022-05-20 | 华中科技大学 | Memristor-based naive Bayes classifier design method, system and classifier |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109840833A (en) * | 2019-02-13 | 2019-06-04 | 苏州大学 | Bayes's collaborative filtering recommending method |
CN110413612A (en) * | 2019-07-02 | 2019-11-05 | 华中科技大学 | A kind of mixing internal memory performance optimization method and system based on hybrid index |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180211177A1 (en) * | 2017-01-25 | 2018-07-26 | Pearson Education, Inc. | System and method of bayes net content graph content recommendation |
-
2019
- 2019-11-12 CN CN201911100579.8A patent/CN110993004B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109840833A (en) * | 2019-02-13 | 2019-06-04 | 苏州大学 | Bayes's collaborative filtering recommending method |
CN110413612A (en) * | 2019-07-02 | 2019-11-05 | 华中科技大学 | A kind of mixing internal memory performance optimization method and system based on hybrid index |
Also Published As
Publication number | Publication date |
---|---|
CN110993004A (en) | 2020-04-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11580411B2 (en) | Systems for introducing memristor random telegraph noise in Hopfield neural networks | |
CN109817267B (en) | Deep learning-based flash memory life prediction method and system and computer-readable access medium | |
Kan et al. | Simple reservoir computing capitalizing on the nonlinear response of materials: theory and physical implementations | |
CN109146073B (en) | Neural network training method and device | |
US11354569B2 (en) | Neural network computation circuit including semiconductor storage elements | |
CN110993004B (en) | Naive Bayes classification method, engine and system based on resistive random access memory array | |
US11544540B2 (en) | Systems and methods for neural network training and deployment for hardware accelerators | |
CN111027619B (en) | Memristor array-based K-means classifier and classification method thereof | |
US11610105B2 (en) | Systems and methods for harnessing analog noise in efficient optimization problem accelerators | |
CN112990444B (en) | Hybrid neural network training method, system, equipment and storage medium | |
Schuman et al. | Resilience and robustness of spiking neural networks for neuromorphic systems | |
CN113643175B (en) | Data processing method and electronic device | |
CN103927550A (en) | Handwritten number identifying method and system | |
CN112819036A (en) | Spherical data classification device based on memristor array and operation method thereof | |
Gangardiwala et al. | Dynamically weighted majority voting for incremental learning and comparison of three boosting based approaches | |
CN106205680A (en) | Resistance variable memory device, reading circuit unit and operational approach thereof | |
CN110210412B (en) | Hyperspectral image classification method based on deep learning and multi-example learning | |
CN116467451A (en) | Text classification method and device, storage medium and electronic equipment | |
CN115331754A (en) | Molecule classification method based on Hash algorithm | |
Polani | On the optimization of self-organizing maps by genetic algorithms | |
CN116361449A (en) | Multi-label classification method, apparatus, device and computer readable storage medium | |
CN110729010B (en) | Semiconductor circuit and method of operating the same | |
CN112597311A (en) | Terminal information classification method and system based on low-earth-orbit satellite communication | |
Kaedi et al. | Holographic memory-based Bayesian optimization algorithm (HM-BOA) in dynamic environments | |
CN115664422B (en) | Distributed successive approximation type analog-to-digital converter and operation method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||