CN109062867A - Matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes - Google Patents

Matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes

Info

Publication number
CN109062867A
CN109062867A (application CN201810756077.XA)
Authority
CN
China
Prior art keywords
attribute
decision table
matrix
increment
variation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810756077.XA
Other languages
Chinese (zh)
Inventor
景运革
王春红
王宝丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yuncheng University
Original Assignee
Yuncheng University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yuncheng University filed Critical Yuncheng University
Priority to CN201810756077.XA
Publication of CN109062867A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a matrix-based dynamic attribute reduction method for the case where objects and attributes are added to a decision table simultaneously, and relates to the field of rough set and granular computing theory and technology in data mining. First, the minimal attribute reduction RED_U of the decision table before the change is computed. After objects and attributes are added to the decision table simultaneously, the relative knowledge granularity of the minimal attribute reduction is computed and compared with the relative knowledge granularity of the changed decision table. If the two are equal, RED_U is the minimal attribute reduction after the change; otherwise the matrix method and the incremental mechanism are used to compute the outer significance of every attribute of the changed decision table that is not in RED_U, the attribute with the largest outer significance is added to the minimal attribute reduction, and the relative knowledge granularity is recomputed until it equals that of the changed decision table; redundant attributes are then deleted, which yields the minimal attribute reduction of the changed decision table. The invention solves the problem of quickly computing the minimal attribute reduction when objects and attributes are added to a decision table at the same time, and helps to improve the efficiency of knowledge discovery in dynamic data.

Description

Matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes
Technical field
The present invention relates to the field of rough set and granular computing theory and technology in data mining, and in particular to a matrix-based dynamic attribute reduction method for the case where objects and attributes are added to a decision table simultaneously.
Background technique
With the rapid development of computer networks, storage and communication technology, all industries have accumulated massive amounts of application data, and in practice these data change dynamically and continuously. Computing such dynamic data with conventional attribute reduction algorithms requires repeated recomputation, which consumes a huge amount of running time, so traditional data mining methods cannot solve the problem of knowledge discovery in dynamic data in a timely and effective manner. Incremental learning techniques can simulate human cognitive mechanisms: they make full use of the original computation results and dynamically update and revise the knowledge with respect to the newly added data, so that the updated knowledge is more up to date. This greatly reduces the time and space required for the analysis and processing of dynamic data and improves the efficiency of data mining, which to a certain extent better satisfies the practical needs of society.
Since conventional attribute reduction algorithms are mainly implemented through set intersection and union operations, their computation and representation are relatively abstract and complicated. The matrix is a very useful mathematical tool: because its representation is intuitive and its operations are simple, it has been widely applied in fields such as engineering computation and scientific research. Solving the attribute reduction problem for dynamic data sets with matrix methods therefore has important theoretical and practical value.
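As a concrete illustration of the matrix representation used throughout this description, the following sketch (a minimal, hypothetical Python/NumPy example, not part of the original patent text) builds the Boolean equivalence relation (indiscernibility) matrix of a decision table under a given attribute subset, assuming the common definition in which entry (i, j) equals 1 exactly when objects i and j take the same value on every attribute of the subset.

    import numpy as np

    def equivalence_matrix(data, attrs):
        """Boolean indiscernibility matrix over the object set.

        data  : 2-D array, one row per object, one column per attribute
        attrs : list of column indices defining the attribute subset
        Entry (i, j) is 1 iff objects i and j agree on every attribute in attrs.
        """
        sub = data[:, attrs]
        # pairwise comparison of all object rows restricted to attrs
        return (sub[:, None, :] == sub[None, :, :]).all(axis=2).astype(int)

    # toy decision table: 4 objects, condition attributes in columns 0-1, decision in column 2
    table = np.array([[1, 0, 1],
                      [1, 0, 0],
                      [0, 1, 1],
                      [0, 1, 1]])
    M_C  = equivalence_matrix(table, [0, 1])      # equivalence relation matrix under C
    M_CD = equivalence_matrix(table, [0, 1, 2])   # equivalence relation matrix under C ∪ D

Every quantity used later (increment matrices, relative knowledge granularity, attribute significance) is derived from matrices of this form.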
Summary of the invention
An embodiment of the present invention provides a matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes, which can solve the above problems of the prior art.
The present invention provides a matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes, the method comprising the following steps:
Step 1: input the decision table before the change, the incremental object set, the incremental attribute set, the relative knowledge granularity of the decision table before the change, the condition-attribute equivalence relation matrix, the minimal attribute reduction of the decision table before the change, and the relative knowledge granularity of that minimal attribute reduction;
Step 2: after the incremental object set is added to the decision table, compute the equivalence relation increment matrix of the decision table with the added objects and the equivalence relation matrix of the incremental object set;
Step 3: after the incremental attribute set is added to the decision table, compute the equivalence relation matrix of the incremental attribute set and the equivalence relation increment matrix of the decision table with the added attributes;
Step 4: after the incremental object set is added to the decision table, compute the equivalence relation matrix of the changed decision table from the equivalence relation increment matrix of the decision table with the added objects and the equivalence relation matrix of the incremental object set obtained in step 2;
Step 5: after the incremental object set and the incremental attribute set are added to the decision table, compute the equivalence relation matrix of the changed decision table from the equivalence relation increment matrix of the decision table with the added objects obtained in step 2, the equivalence relation increment matrix of the decision table with the added attributes obtained in step 3, and the equivalence relation matrix of the incremental object set;
Step 6: compute the relative knowledge granularity of the incremental object set from the relative knowledge granularity of the decision table before the change; on the basis of the relative knowledge granularity of the decision table before the change and that of the incremental object set, compute the relative knowledge granularity of the changed decision table through the matrix-based incremental mechanism; on the basis of the relative knowledge granularity of the minimal attribute reduction and that of the minimal attribute reduction on the incremental object set, compute the relative knowledge granularity of the minimal attribute reduction on the changed decision table; if the relative knowledge granularity of the changed decision table equals the relative knowledge granularity of the minimal attribute reduction on the changed decision table, jump to step 9, otherwise go to step 7;
Step 7: compute the outer significance, with respect to the minimal attribute reduction of the decision table before the change, of each attribute a of the decision table after the incremental object set and the incremental attribute set have been added; repeatedly select the attribute a_0 with the largest outer significance, add it to the minimal attribute reduction, and compute the relative knowledge granularity of the reduction with a_0 added, until it equals the relative knowledge granularity of the changed decision table;
Step 8: for each attribute a in the minimal attribute reduction, compute the relative knowledge granularity of the reduction with attribute a deleted; if it equals the relative knowledge granularity of the changed decision table, delete a from the minimal attribute reduction; the attribute set finally obtained is the minimal attribute reduction of the changed decision table;
Step 9: output the minimal attribute reduction of the changed decision table. A compact sketch of the control flow of steps 6 to 9 is given below.
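The following sketch is hypothetical Python, written here only to make the loop structure of steps 6 to 9 explicit; the granularity function gp, the candidate attribute set and the tolerance are assumptions supplied by the caller rather than elements of the patent.

    def update_reduct(red, candidates, gp_target, gp, tol=1e-12):
        """Adapt the old minimal reduction to the changed decision table (steps 6-9).

        red        : set of attributes, the minimal reduction before the change
        candidates : set of attributes of the changed table not yet in red
        gp_target  : relative knowledge granularity of the changed decision table
        gp         : callable, gp(attribute_set) -> relative knowledge granularity
                     of the changed decision table with respect to that attribute set
        """
        red = set(red)
        # step 6: if the granularities already agree, the old reduction still works;
        # step 7: otherwise greedily add the attribute with the largest outer significance
        while (candidates - red) and abs(gp(red) - gp_target) > tol:
            best = max(candidates - red, key=lambda a: gp(red) - gp(red | {a}))
            red.add(best)
        # step 8: delete attributes whose removal leaves the granularity unchanged
        for a in sorted(red):
            if abs(gp(red - {a}) - gp_target) <= tol:
                red.discard(a)
        # step 9: red is now the minimal attribute reduction of the changed decision table
        return red

In step 8 the deletion test is made against the granularity of the changed decision table itself, so only attributes that have become redundant after the change are removed.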
In the matrix-based dynamic attribute reduction method of the embodiment of the present invention, after some objects and attributes have been added to the decision table simultaneously, the equivalence relation increment matrix and the relative knowledge granularity of the changed decision table are computed through the matrix-based incremental mechanism, on the basis of the equivalence relation matrix and the relative knowledge granularity of the decision table before the change. On the basis of the minimal attribute reduction of the decision table before the change, a matrix-based dynamic attribute reduction algorithm is designed which effectively solves the problem of quickly computing the minimal reduction when the object set and the attribute set of a decision table grow dynamically. When new attributes and new objects are added to a decision table at the same time, a non-dynamic matrix-based attribute reduction algorithm has to restart the computation from scratch to obtain the minimal reduction of the changed decision table, which consumes a huge amount of time. By contrast, the matrix-based dynamic attribute reduction algorithm of the present invention uses the incremental mechanism to compute the equivalence relation matrix and the relative knowledge granularity of the changed decision table on the basis of the equivalence relation matrix, the relative knowledge granularity and the minimal reduction of the decision table before the change, and can therefore obtain the minimal reduction of the changed decision table quickly. The time the algorithm of the invention needs to compute the minimal reduction of the changed decision table is consequently less than that of the non-dynamic matrix-based attribute reduction algorithm.
Compared with existing attribute reduction techniques, the beneficial effects of the present invention are as follows:
First, unlike conventional attribute reduction algorithms, the present invention computes the equivalence relation matrix of the decision table, the relative knowledge granularity, and the inner and outer significance of attributes by the matrix method; the representation is intuitive, the operations are simple, and the method is easy to implement.
Conventional attribute reduction algorithms are mainly implemented through set intersection and union operations, so their computation and representation are relatively abstract and complicated. Many of the quantities in the matrix-based dynamic attribute reduction algorithm proposed by the present invention are obtained by matrix operations, and since many widely used software tools (such as MATLAB) handle matrix operations efficiently, the proposed algorithm can be implemented directly with such tools.
Second, unlike conventional attribute reduction algorithms, after objects and attributes have been added to the decision table, the present invention can quickly find the minimal reduction of the changed decision table by means of the incremental mechanism.
Most existing attribute reduction algorithms are designed for static data sets; when they are used to compute the reduction of a dynamic data set they cannot make effective use of the original computation results, so traditional data mining methods cannot solve the problem of knowledge discovery in dynamic data in a timely and effective manner. Because both the objects and the attributes of many real-world decision tables grow dynamically, the present invention uses the matrix-based incremental mechanism to quickly obtain the equivalence relation matrix, the relative knowledge granularity and the minimal attribute reduction of the changed decision table on the basis of those of the decision table before the change. This greatly reduces the time and space required for the analysis and processing of dynamic data, improves the efficiency of dynamic data mining and knowledge discovery, and makes the updated knowledge more up to date.
Brief description of the drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flow chart of the matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes provided by an embodiment of the present invention;
Fig. 2 is a comparison of the running times of the matrix-based dynamic attribute reduction algorithm and the non-dynamic attribute reduction algorithm for computing the minimal reduction of the decision table on six different data sets;
Fig. 3 is a comparison of the running times of the matrix-based dynamic attribute reduction algorithm and the non-dynamic attribute reduction algorithm for computing the minimal reduction of the decision table when attribute sets and object sets of different sizes are added simultaneously.
Specific embodiments
The technical solutions in the embodiments of the present invention will now be described clearly and completely with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, an embodiment of the present invention provides a matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes, the method comprising the following steps:
Step 1: input the decision table before the change, the incremental object set U_X, the incremental attribute set P, the relative knowledge granularity GP_U(D|C) of the decision table before the change, the condition-attribute equivalence relation matrix M^U_{R_C}, the minimal attribute reduction RED_U of the decision table before the change and its relative knowledge granularity GP_U(D|RED_U). The decision table before the change consists of the object set U, the condition attribute set C, the decision attribute set D and the attribute values of the objects.
Step 2: when the incremental object set U_X is added to the decision table, compute the equivalence relation increment matrix ΔM^{U,U_X}_{R_C} of the decision table with the added objects, and, on the basis of the condition-attribute equivalence relation matrix M^U_{R_C} of the decision table before the change, compute the equivalence relation matrix M^{U_X}_{R_C} of the incremental object set, where R_C denotes the equivalence relation induced on the decision table by the condition attribute set C.
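Because entry (i, j) of the indiscernibility matrix depends only on the attribute values of objects i and j, adding the incremental object set U_X requires computing only the blocks that involve the new objects; the old matrix is reused unchanged. The following hypothetical Python/NumPy sketch (function and variable names are chosen here for illustration and are not taken from the patent) assembles the matrix over U ∪ U_X from the old matrix, the |U| × |U_X| increment block and the |U_X| × |U_X| block of the incremental objects.

    import numpy as np

    def grow_with_objects(M_old, data_old, data_new, attrs):
        """Indiscernibility matrix over U ∪ U_X, reusing the matrix over U.

        M_old    : |U| x |U| matrix of the old objects under attrs (reused as-is)
        data_old : attribute values of the old objects, one row per object
        data_new : attribute values of the incremental objects U_X
        attrs    : column indices of the attribute subset (e.g. the condition attributes C)
        """
        old = data_old[:, attrs]
        new = data_new[:, attrs]
        # |U| x |U_X| increment block: old objects compared with the new objects
        delta = (old[:, None, :] == new[None, :, :]).all(axis=2).astype(int)
        # |U_X| x |U_X| equivalence relation matrix of the incremental object set
        M_inc = (new[:, None, :] == new[None, :, :]).all(axis=2).astype(int)
        # assemble the matrix of the changed object set block by block
        return np.block([[M_old, delta],
                         [delta.T, M_inc]])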
Step 3: when the incremental attribute set P is added to the decision table, compute, on the basis of the condition-attribute equivalence relation matrix M^U_{R_C} of the decision table before the change, the equivalence relation matrix M^U_{R_P} of the incremental attribute set, and from M^U_{R_C} and M^U_{R_P} compute the equivalence relation increment matrix ΔM^U_{R_{C∪P}} of the decision table with the added attributes.
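Adding attributes can only split existing equivalence classes: two objects are indiscernible under C ∪ P exactly when they are indiscernible under C and under P, so the matrix over the enlarged attribute set is the elementwise product of the two Boolean matrices. The sketch below is a minimal illustration under that assumption (again hypothetical Python/NumPy, not the patent's own formula).

    def grow_with_attributes(M_C, data, new_attrs):
        """Indiscernibility matrix of the same objects after adding the attribute set P.

        Expects NumPy arrays (see the sketch in the background section).
        M_C       : matrix of the decision table under the condition attributes C
        data      : attribute values of the same objects (including the new columns)
        new_attrs : column indices of the incremental attributes P
        """
        sub = data[:, new_attrs]
        # equivalence relation matrix of the incremental attribute set P alone
        M_P = (sub[:, None, :] == sub[None, :, :]).all(axis=2).astype(int)
        # indiscernible under C ∪ P  <=>  indiscernible under C and indiscernible under P
        return M_C * M_P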
Step 4: when the incremental object set U_X is added to the decision table, compute the equivalence relation matrix of the changed decision table from the equivalence relation increment matrix ΔM^{U,U_X}_{R_C} of the decision table with the added objects and the equivalence relation matrix M^{U_X}_{R_C} of the incremental object set obtained in step 2, where R_{C∪P} denotes the equivalence relation induced on the decision table by the condition attribute set C together with the incremental attribute set P.
Step 5: when the incremental object set U_X and the incremental attribute set P are added to the decision table, compute the equivalence relation matrix M^{U∪U_X}_{R_{C∪P}} of the changed decision table from the equivalence relation increment matrix of the decision table with the added objects obtained in step 2, the equivalence relation increment matrix of the decision table with the added attributes obtained in step 3, and the equivalence relation matrix of the incremental object set.
Step 6: from the relative knowledge granularity GP_U(D|C) of the decision table before the change, compute the relative knowledge granularity GP_{U_X}(D|C∪P) of the incremental object set; on the basis of GP_U(D|C) and GP_{U_X}(D|C∪P), compute the relative knowledge granularity GP_{U∪U_X}(D|C∪P) of the changed decision table through the matrix-based incremental mechanism. The quantities involved are: M^{U∪U_X}_{R_{C∪P}}, the equivalence relation matrix of the changed decision table; M^U_{R_C}, the condition-attribute equivalence relation matrix of the decision table before the change; ΔM^U_{R_{C∪P}}, the equivalence relation increment matrix of the decision table after the incremental attributes are added; ΔM^{U,U_X}_{R_C}, the equivalence relation increment matrix of the decision table after the incremental objects are added; ΔM^{U,U_X}_{R_{C∪P}}, the equivalence relation increment matrix of the decision table after the incremental objects and the incremental attributes are added, together with the transposes of these increment matrices; M^{U_X}_{R_{C∪P}}, the equivalence relation matrix of the incremental object set; M^U_{R_{C∪D}}, the equivalence relation matrix of the decision table under the condition and decision attributes; M^{U_X}_{R_{C∪D}}, the corresponding matrix of the incremental object set; and the equivalence relation increment matrices of the changed decision table, under the condition attributes and under the condition and decision attributes, after the incremental objects are added. GP_{U∪U_X}(D|C∪P) denotes the relative knowledge granularity of the changed decision table, GP_U(D|C) that of the decision table before the change and GP_{U_X}(D|C∪P) that of the incremental object set; |U∪U_X|, |U| and |U_X| are the numbers of objects in the changed decision table, in the decision table before the change and in the incremental object set respectively; Sum(·)_{t×n} is the sum of all elements of a matrix, |·|² is the square of the number of objects, and R_{C∪D} denotes the equivalence relation induced on the decision table by the condition attribute set C together with the decision attribute set D. The relative knowledge granularity is obtained from the arithmetic means of the condition-attribute equivalence relation matrix and of the equivalence relation matrix under the condition and decision attributes, for example GP_U(D|C) = [Sum(M^U_{R_C}) - Sum(M^U_{R_{C∪D}})] / |U|². On the basis of GP_U(D|RED_U) and GP_{U_X}(D|RED_U), compute the relative knowledge granularity GP_{U∪U_X}(D|RED_U) of the minimal attribute reduction RED_U on the changed decision table. If GP_{U∪U_X}(D|C∪P) = GP_{U∪U_X}(D|RED_U), jump to step 9; otherwise go to step 7.
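With the formula stated above, the relative knowledge granularity can be computed directly from the indiscernibility matrices. The sketch below is one possible reading of that computation in hypothetical Python/NumPy (the non-incremental form, recomputing both matrices from the data); it is an illustration, not the patent's incremental formula.

    import numpy as np

    def equivalence_matrix(data, attrs):
        sub = data[:, attrs]
        return (sub[:, None, :] == sub[None, :, :]).all(axis=2).astype(int)

    def relative_knowledge_granularity(data, cond_attrs, dec_attrs):
        """GP_U(D|C) = [Sum(M_RC) - Sum(M_RC∪D)] / |U|^2 (assumed matrix formula)."""
        n = data.shape[0]
        M_C  = equivalence_matrix(data, cond_attrs)
        M_CD = equivalence_matrix(data, cond_attrs + dec_attrs)
        return (M_C.sum() - M_CD.sum()) / float(n * n)

    # toy example: the value shrinks towards 0 as C discerns the decision D better
    table = np.array([[1, 0, 1],
                      [1, 0, 0],
                      [0, 1, 1],
                      [0, 1, 1]])
    print(relative_knowledge_granularity(table, [0, 1], [2]))   # prints 0.125 for this table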
Step 7: for each attribute a of the changed decision table (after the incremental object set and the incremental attribute set have been added) that is not in RED_U, compute the outer significance of a with respect to the minimal attribute reduction RED_U. Repeatedly select the attribute a_0 with the largest outer significance and add it to the minimal attribute reduction, i.e. RED_U ← RED_U ∪ {a_0}, and compute the relative knowledge granularity of RED_U with a_0 added, until it equals the relative knowledge granularity of the changed decision table.
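A standard granularity-based definition of the outer significance used in step 7, stated here as an assumption consistent with the surrounding description rather than as the patent's exact formula, is:

    Sig^{outer}_{U∪U_X}(a, RED_U, D) = GP_{U∪U_X}(D | RED_U) - GP_{U∪U_X}(D | RED_U ∪ {a}),   a ∈ (C ∪ P) \ RED_U

Under this definition the attribute whose addition reduces the relative knowledge granularity of the changed decision table the most is selected first.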
Step 8: for each attribute a in the minimal attribute reduction RED_U, compute the relative knowledge granularity GP_{U∪U_X}(D|RED_U \ {a}) of RED_U with attribute a deleted. If GP_{U∪U_X}(D|RED_U \ {a}) = GP_{U∪U_X}(D|C∪P), delete a from RED_U, i.e. RED_U ← RED_U \ {a}. The set RED_U finally obtained is the minimal attribute reduction of the changed decision table.
Step 9: assign RED_U to RED_{U∪U_X} and output RED_{U∪U_X}, the minimal reduction of the changed decision table.
In the matrix-based dynamic attribute reduction method of the present invention, on the basis of the condition-attribute equivalence relation matrix M^U_{R_C} of the decision table before the change, the relative knowledge granularity GP_U(D|C) of the decision table before the change, the minimal attribute reduction RED_U and its relative knowledge granularity GP_U(D|RED_U), after the incremental object set U_X and the incremental attribute set P are added to the decision table, the relative knowledge granularity GP_{U∪U_X}(D|C∪P) of the changed decision table and the relative knowledge granularity GP_{U∪U_X}(D|RED_U) of RED_U on the changed decision table are computed by the matrix incremental mechanism, and the two are compared. If GP_{U∪U_X}(D|C∪P) = GP_{U∪U_X}(D|RED_U), then RED_U is the minimal attribute reduction of the changed decision table. Otherwise, the matrix method and the incremental mechanism are used to compute the outer significance of every attribute of the changed decision table that is not in RED_U; the attribute with the largest outer significance is repeatedly added to RED_U, and the relative knowledge granularity of the enlarged RED_U is computed and compared with that of the changed decision table, until the two are equal. Finally, the redundant attributes in RED_U are deleted one by one, which yields the minimal attribute reduction RED_{U∪U_X} of the changed decision table.
To verify the effectiveness of the proposed matrix-based dynamic attribute reduction algorithm, six data sets, Dermatology, Cancer, Kr-vs-kp, Mushroom, Ticdata2000 and Letter, were downloaded from the UCI machine learning repository and used for the simulation experiments of the present invention, with the running time of computing the minimal attribute reduction and the size of the reduction obtained as evaluation indices.
Simulation experiment 1
To illustrate the effectiveness of the proposed matrix-based dynamic attribute reduction algorithm, for each of the data sets Dermatology, Cancer, Kr-vs-kp, Mushroom, Ticdata2000 and Letter, 50% of the object set, 50% of the condition attribute set and the decision attribute set were taken as the basic decision table, and the remaining object set and attribute set were taken as the incremental object set and the incremental attribute set respectively. The incremental object set and the incremental attribute set were added dynamically to the basic decision table to obtain the changed decision table, the simulation test was carried out, and the results were compared with those of the non-dynamic attribute reduction algorithm. The simulation results show that the minimal reductions of the changed decision table obtained by the two algorithms are of similar size, and on some data sets they are identical, but the matrix-based dynamic attribute reduction algorithm obtains the minimal reduction of the changed decision table faster. The comparison of the running times of the matrix-based dynamic attribute reduction algorithm and the non-dynamic attribute reduction algorithm for computing the minimal reduction of the changed decision table is shown in Fig. 2, where the X axis is the data set and the Y axis is the common logarithm of the attribute reduction running time of each algorithm (the common logarithm is used because the running times differ greatly, and it lets the figure reflect the trend of the different algorithms objectively); in the two bars for the same data set, the left bar is the result of the non-dynamic attribute reduction and the right bar is the result of the dynamic attribute reduction of the present invention. As Fig. 2 shows, compared with the non-dynamic attribute reduction algorithm, the proposed algorithm needs less running time to compute the minimal reduction of the changed decision table, and the advantage is more obvious on the larger data set Letter.
Simulation experiment 2
Is the matrix-based dynamic attribute reduction algorithm still effective as the objects and attributes of the decision table keep growing? To further verify the effectiveness of the present invention, simulation experiments were carried out in which objects and attributes were added in successively larger percentages. For each of the data sets Dermatology, Cancer, Kr-vs-kp, Mushroom, Ticdata2000 and Letter, 50% of the object set, 50% of the condition attribute set and the decision attribute set were taken as the basic decision table; from the remaining object set and attribute set, 20%, 40%, 60%, 80% and 100% were selected as the incremental object set and the incremental attribute set, gradually added to the basic decision table, and the simulation test was carried out on the resulting changed decision tables. The corresponding simulation results are shown in Fig. 3, where the X axis is the size of the added attribute set and object set and the Y axis is the common logarithm of the attribute reduction running time of each algorithm (again, the common logarithm is used because the running times differ greatly, and it lets the figure reflect the trend of the different algorithms objectively). Panel (a) compares the running times of the matrix-based dynamic attribute reduction algorithm and the non-dynamic attribute reduction algorithm for computing the minimal reduction of the decision table when attributes and objects of different percentages are added simultaneously to the Dermatology data set; panels (b), (c), (d), (e) and (f) show the same comparison for the Cancer, Kr-vs-kp, Mushroom, Ticdata2000 and Letter data sets respectively. As Fig. 3 shows, as the data sets grow, the time the proposed algorithm needs to compute the minimal reduction of the changed decision table is less than that of the non-dynamic attribute reduction algorithm, which to a certain extent shows that the minimal reductions obtained by the proposed algorithm on dynamic data sets are reasonable.
Although preferred embodiments of the present invention have been described, additional changes and modifications may be made to these embodiments once a person skilled in the art learns of the basic inventive concept. The appended claims are therefore intended to be interpreted as covering the preferred embodiments and all changes and modifications that fall within the scope of the present invention.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If such modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.

Claims (2)

1. A matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes, characterized in that the method comprises the following steps:
Step 1: input the decision table before the change, the incremental object set, the incremental attribute set, the relative knowledge granularity of the decision table before the change, the condition-attribute equivalence relation matrix, the minimal attribute reduction of the decision table before the change, and the relative knowledge granularity of that minimal attribute reduction;
Step 2: after the incremental object set is added to the decision table, compute the equivalence relation increment matrix of the decision table with the added objects and the equivalence relation matrix of the incremental object set;
Step 3: after the incremental attribute set is added to the decision table, compute the equivalence relation matrix of the incremental attribute set and the equivalence relation increment matrix of the decision table with the added attributes;
Step 4: after the incremental object set is added to the decision table, compute the equivalence relation matrix of the changed decision table from the equivalence relation increment matrix of the decision table with the added objects and the equivalence relation matrix of the incremental object set obtained in step 2;
Step 5: after the incremental object set and the incremental attribute set are added to the decision table, compute the equivalence relation matrix of the changed decision table from the equivalence relation increment matrix of the decision table with the added objects obtained in step 2, the equivalence relation increment matrix of the decision table with the added attributes obtained in step 3, and the equivalence relation matrix of the incremental object set;
Step 6: compute the relative knowledge granularity of the incremental object set from the relative knowledge granularity of the decision table before the change; on the basis of the relative knowledge granularity of the decision table before the change and that of the incremental object set, compute the relative knowledge granularity of the changed decision table through the matrix-based incremental mechanism; on the basis of the relative knowledge granularity of the minimal attribute reduction and that of the minimal attribute reduction on the incremental object set, compute the relative knowledge granularity of the minimal attribute reduction on the changed decision table; if the relative knowledge granularity of the changed decision table equals the relative knowledge granularity of the minimal attribute reduction on the changed decision table, jump to step 9, otherwise go to step 7;
Step 7: compute the outer significance, with respect to the minimal attribute reduction of the decision table before the change, of each attribute a of the decision table after the incremental object set and the incremental attribute set have been added; repeatedly select the attribute a_0 with the largest outer significance, add it to the minimal attribute reduction, and compute the relative knowledge granularity of the reduction with a_0 added, until it equals the relative knowledge granularity of the changed decision table;
Step 8: for each attribute a in the minimal attribute reduction, compute the relative knowledge granularity of the reduction with attribute a deleted; if it equals the relative knowledge granularity of the changed decision table, delete a from the minimal attribute reduction; the attribute set finally obtained is the minimal attribute reduction of the changed decision table;
Step 9: output the minimal attribute reduction of the changed decision table.
2. The matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes according to claim 1, characterized in that the equivalence relation matrix of the changed decision table computed in step 5 is M^{U∪U_X}_{R_{C∪P}} and the relative knowledge granularity of the changed decision table computed in step 6 is GP_{U∪U_X}(D|C∪P), where M^{U∪U_X}_{R_{C∪P}} is the equivalence relation matrix of the changed decision table, M^U_{R_C} is the condition-attribute equivalence relation matrix of the decision table before the change, ΔM^U_{R_{C∪P}} is the equivalence relation increment matrix of the decision table after the incremental attributes are added, ΔM^{U,U_X}_{R_C} is the equivalence relation increment matrix of the decision table after the incremental objects are added, ΔM^{U,U_X}_{R_{C∪P}} is the equivalence relation increment matrix of the decision table after the incremental objects and the incremental attributes are added, the transposes of these increment matrices are used where required, M^{U_X}_{R_{C∪P}} is the equivalence relation matrix of the incremental object set, M^U_{R_{C∪D}} is the equivalence relation matrix of the decision table under the condition and decision attributes, M^{U_X}_{R_{C∪D}} is the corresponding matrix of the incremental object set, the equivalence relation increment matrices of the changed decision table under the condition attributes and under the condition and decision attributes after the incremental objects are added are likewise used, GP_{U∪U_X}(D|C∪P) is the relative knowledge granularity of the changed decision table, GP_U(D|C) is the relative knowledge granularity of the decision table before the change, GP_{U_X}(D|C∪P) is the relative knowledge granularity of the incremental object set, |U∪U_X| is the number of objects in the changed decision table, |U| is the number of objects in the decision table before the change, |U_X| is the number of objects in the incremental object set, Sum(·)_{t×n} is the sum of all elements of a matrix, |·|² is the square of the number of objects, R_C denotes the equivalence relation induced on the decision table by the condition attribute set C, and R_{C∪D} denotes the equivalence relation induced on the decision table by the condition attribute set C together with the decision attribute set D.
CN201810756077.XA 2018-07-11 2018-07-11 Matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes Pending CN109062867A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810756077.XA CN109062867A (en) 2018-07-11 2018-07-11 Matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810756077.XA CN109062867A (en) 2018-07-11 2018-07-11 Matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes

Publications (1)

Publication Number Publication Date
CN109062867A 2018-12-21

Family

ID=64815834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810756077.XA Pending CN109062867A (en) Matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes

Country Status (1)

Country Link
CN (1) CN109062867A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111124516A (en) * 2019-12-22 2020-05-08 北京浪潮数据技术有限公司 Server parameter reduction method and device and computer readable storage medium
CN111124516B (en) * 2019-12-22 2021-12-03 北京浪潮数据技术有限公司 Server parameter reduction method and device and computer readable storage medium
CN113012775A (en) * 2021-03-30 2021-06-22 南通大学 Incremental attribute reduction Spark method for classifying red spot electronic medical record pathological changes
CN113012775B (en) * 2021-03-30 2021-10-08 南通大学 Incremental attribute reduction Spark method for classifying red spot electronic medical record pathological changes
CN114675818A (en) * 2022-03-29 2022-06-28 江苏科技大学 Method for realizing measurement visualization tool based on rough set theory
CN114675818B (en) * 2022-03-29 2024-04-19 江苏科技大学 Method for realizing measurement visualization tool based on rough set theory
CN114638550A (en) * 2022-05-12 2022-06-17 国网江西省电力有限公司电力科学研究院 Index screening method and system for energy storage power station configuration scheme
CN114638550B (en) * 2022-05-12 2022-10-11 国网江西省电力有限公司电力科学研究院 Index screening method and system for energy storage power station configuration scheme

Similar Documents

Publication Publication Date Title
CN109062867A (en) Matrix-based dynamic attribute reduction method for the simultaneous addition of objects and attributes
CN108764273A (en) A kind of method, apparatus of data processing, terminal device and storage medium
CN104573106A (en) Intelligent urban construction examining and approving method based on case-based reasoning technology
Shao et al. The construction of attribute (object)-oriented multi-granularity concept lattices
CN102262681A (en) Method for identifying key blog sets in blog information spreading
CN104834479A (en) Method and system for automatically optimizing configuration of storage system facing cloud platform
WO2022089652A1 (en) Method and system for processing data tables and automatically training machine learning model
CN107408114A (en) Based on transactions access pattern-recognition connection relation
Nin et al. Speed up gradual rule mining from stream data! A B-Tree and OWA-based approach
CN105139282A (en) Power grid index data processing method, device and calculation device
CN113392580A (en) Combined optimization solving method and system based on mixed quantum algorithm and solver framework
Özpeynirci et al. An interactive algorithm for multiple criteria constrained sorting problem
Iskakova et al. Dynamical study of a novel 4D hyperchaotic system: An integer and fractional order analysis
Li et al. The parametric modified limited penetrable visibility graph for constructing complex networks from time series
Agarwal et al. I/O-efficient batched union-find and its applications to terrain analysis
Zhang et al. Revisiting bound estimation of pattern measures: A generic framework
Wang et al. A weighted symmetric graph embedding approach for link prediction in undirected graphs
CN108153585A (en) A kind of method and apparatus of the operational efficiency based on locality expression function optimization MapReduce frames
Abouali et al. MATLAB Hydrological Index Tool (MHIT): A high performance library to calculate 171 ecologically relevant hydrological indices
CN116502234A (en) Vulnerability value dynamic evaluation method and device based on decision tree
CN104731639A (en) Confidence level re-check method for safety indexes of simulation models
CN104462139A (en) User behavior clustering method and system
Wang et al. Norm approximation of mamdani fuzzy system to a class of integrable functions
CN104135510A (en) Distributed computing environment performance prediction method and system based on mode matching
Durán-Rosal et al. Simultaneous optimisation of clustering quality and approximation error for time series segmentation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination