Intelligent Information Management, 2011, 3, 75-86. doi:10.4236/iim.2011.33010. Published Online May 2011 (http://www.SciRP.org/journal/iim). Copyright © 2011 SciRes. IIM

On Granularity in Information Systems Based on Binary Relation

Weihua Xu (1), Shihu Liu (1), Xiaoyan Zhang (1), Wenxiu Zhang (2)
1 School of Mathematics and Statistics, Chongqing University of Technology, Chongqing, China
2 School of Science, Xi'an Jiaotong University, Xi'an, China
Email: chxuwh@gmail.com, liush02@126.com
Received January 21, 2011; revised March 15, 2011; accepted March 20, 2011

Abstract

In this paper, some important issues of granularity are discussed, mainly in information systems (ISs) based on a binary relation. Firstly, a vector representation method for knowledge granules in an information system based on a binary relation is proposed, to eliminate the limitations of the set representation method. Secondly, operators among knowledge granules are introduced and some of their important properties are studied carefully. Thirdly, a distance between two knowledge granules is established, and a granular space is constructed based on it. Fourthly, an axiomatic definition of knowledge granularity is investigated, under which several existing knowledge granularities can be seen to be special cases. In addition, as an application of the knowledge granular space, an example is employed to validate some results of our work.

Keywords: Binary Relation, Granular Space, Information System, Rough Set

1. Introduction

Rough set theory, proposed by Pawlak in the early 1980s [16], is an extension of classical set theory and can be regarded as a soft computing tool for dealing with uncertain or imprecise information. It is well known that this theory is built on a classification mechanism, in which the classification can be viewed as an equivalence relation, and the knowledge granules induced by the equivalence relation can be viewed as a partition of the universe.
For this reason, it has been applied widely and successfully in feature selection [22], uncertainty reasoning [6], granular computing [9,14,29-33], data analysis [15,17,18], data mining [24-27], etc. The equivalence relation, as an important and primitive concept in Pawlak's original rough set theory, still has many limitations. In order to eliminate the limitations resulting from the equivalence relation and to broaden its fields of application, some meaningful work has been done in the past [3,4,10,21,28]. A knowledge granule, which is viewed as a partition of the universe, plays an important role in investigating an information system. In ref. [33], L. A. Zadeh argued that nearly every field is permeated by granules, and Hobbs discussed some properties of knowledge granules in ref. [5]. In addition, L. A. Zadeh [34] pointed out that granulation is one of three basic concepts that underlie human cognition, the other two being organization and causation. Informally, granulation involves decomposition of a whole into parts, organization involves integration of parts into a whole, and causation involves association of causes with effects. Hence, how to characterize the process of granulation has become a crucial problem. In other words, the validity of the distinguishing ability used to create the knowledge granules should be examined, because the knowledge granules in an information system are finite. Shannon [20], Beaubouef [1], Qian [19], Liang [11,12] and others used several useful methods to evaluate the uncertainty of information, and L. A. Zadeh applied the notion of granularity to this task, which gives a more visual and easily understandable description of a partition of the universe. Moreover, the relationships between several measures on knowledge in an information system were discussed in ref. [13]. These measures include the granulation measure, information entropy, rough entropy, and knowledge granulation.
In particular, closely associated with granularity, Xu [23] carefully discussed the properties of each of the granularities mentioned above. It is known that these measures have become effective mechanisms for evaluating uncertainty in rough set theory.

In this paper, our main contribution is to study the relations among knowledge granules, and the distinguishing ability of the general binary relation used to create the knowledge granules, by means of a vector representation method in an information system. Firstly, four operators among knowledge granules expressed as vectors are proposed and a granular distance is defined. Then, an axiomatic definition of knowledge granularity is proposed by introducing a new binary relation.

The rest of this paper is organized as follows. Some preliminary concepts required in our work are briefly recalled in Section 2. In Section 3, four operators among knowledge granules are introduced and their important properties are obtained. The focus of Section 4 is the distance between any two knowledge granules, and a granular space is constructed based on this distance. The definition of knowledge granularity is proposed in Section 5 and some of its important properties are discussed. The validity of some of the results obtained is examined through an example in Section 6.

2. Preliminaries

In this section, we begin with some necessary concepts required in the sequel of this paper. Detailed descriptions of the theory can be found in refs. [11,12,23].

Definition 2.1 ([37]) An information system is a quadruple I = (U, A, V, f), where
- U = {u_1, u_2, ..., u_n} is a nonempty finite set of objects, called the universe;
- A = {a_1, a_2, ..., a_m} is a nonempty finite set of attributes;
- V = ∪_{a_l ∈ A} V_{a_l}, where V_{a_l} is the domain of attribute a_l;
- f : U × A → V is called an information function, with f(x, a_l) ∈ V_{a_l} for every a_l ∈ A and x ∈ U.

An information system with decision is a special case of an information system (U, A, V, f), in which the attribute set A = C ∪ D is the union of conditional attributes C and decision attributes D, with C ∩ D = ∅.

Any subset R of U × U is called a relation on U. For any (x, y) ∈ U × U, if (x, y) ∈ R, we say x has relation R with y, and denote this by xRy.
For an information system (U, A, V, f) and B ⊆ A, denote

R_B = {(x, y) ∈ U × U : f(x, a) * f(y, a), for all a ∈ B},

where "*" stands for "≤", "≥" or "=". Then R_B is a general binary relation with respect to B on U, and the system is a general information system. Obviously, R_B is an equivalence relation, and the system is a classical information system, when "*" is "=".

Remark 1 Unless otherwise specified, the information systems appearing in the subsequent sections are general information systems.

Let [u_i]_B = {u_j ∈ U : (u_i, u_j) ∈ R_B}, and U/R_B = {[u_i]_B : u_i ∈ U}. U/R_B is called the knowledge granule of U with respect to the attribute set B.

Definition 2.2 ([35,38]) Let (U, A, V, f) be an information system and B_1, B_2 ⊆ A.
1) If [u_i]_{B_1} = [u_i]_{B_2} for every u_i ∈ U, we say that U/R_{B_1} is equal to U/R_{B_2}, denoted by U/R_{B_1} = U/R_{B_2}.
2) If [u_i]_{B_1} ⊆ [u_i]_{B_2} for every u_i ∈ U, we say that U/R_{B_1} is finer than U/R_{B_2}, denoted by U/R_{B_1} ≤ U/R_{B_2}.
3) If U/R_{B_1} ≤ U/R_{B_2} and [u_i]_{B_1} ≠ [u_i]_{B_2} for some u_i ∈ U, we say that U/R_{B_1} is properly finer than U/R_{B_2}, denoted by U/R_{B_1} < U/R_{B_2}.

Definition 2.3 ([35,38]) Let (U, A, V, f) be an information system, X ⊆ U and B ⊆ A. Given

the lower approximate set R_B(X) = {u ∈ U : [u]_B ⊆ X},
the upper approximate set R̄_B(X) = {u ∈ U : [u]_B ∩ X ≠ ∅},

then R_B(X) and R̄_B(X) are called the lower and upper approximate sets of X, respectively, with respect to B.

Definition 2.4 ([2]) Let E be a nonempty set and "≤" a binary relation on E. If "≤" satisfies
1) x ≤ x for every x ∈ E; (Reflexivity)
2) if x ≤ y and y ≤ x, then x = y; (Antisymmetry)
3) if x ≤ y and y ≤ z, then x ≤ z; (Transitivity)
then the binary relation "≤" is called a partial order, and the nonempty set E is a partially ordered set, or poset, denoted by (E, ≤).

Definition 2.5 ([36]) Let (L, ≤) be a poset. If there exist two operators ∨, ∧ : L × L → L such that
1) a ∨ b = b ∨ a, a ∧ b = b ∧ a;
2) (a ∨ b) ∨ c = a ∨ (b ∨ c), (a ∧ b) ∧ c = a ∧ (b ∧ c);
3) a ∨ (a ∧ b) = a, a ∧ (a ∨ b) = a;
then L is called a lattice. And if
4) a ∨ (b ∧ c) = (a ∨ b) ∧ (a ∨ c), a ∧ (b ∨ c) = (a ∧ b) ∨ (a ∧ c);
then L is called an assignment (distributive) lattice. Furthermore, if
5) for any a ∈ L there exists a^c ∈ L such that (a^c)^c = a, and a ≤ b implies b^c ≤ a^c;
then L is called a complemented lattice.

Definition 2.6 ([7,8]) Let X be a nonempty finite set. If for any x, y ∈ X there exists a real number d(x, y) such that
1) d(x, y) ≥ 0, and d(x, y) = 0 iff x = y; (Nonnegativity)
2) d(x, y) = d(y, x); (Symmetry)
3) d(x, y) ≤ d(x, z) + d(z, y) for any z ∈ X; (Triangle inequality)
then X is a distance space, denoted by (X, d), and d(x, y) represents the distance between x and y.

Example 2.1 Consider the ordered information system in Table 1. From the table we have a dominance relation R_A with respect to the attribute set A = {a_1, a_2, a_3, a_4, a_5},

R_A = {(u_i, u_j) : f(u_j, a_l) ≥ f(u_i, a_l), for all a_l ∈ A},

in which case we have

[u_1]_A = {u_1, u_4}, [u_2]_A = {u_2}, [u_3]_A = {u_3}, [u_4]_A = {u_4}, [u_5]_A = {u_5}.

If we take B = {a_1, a_2, a_3}, we have

[u_1]_B = {u_1, u_3, u_4}, [u_2]_B = {u_2, u_3, u_5}, [u_3]_B = {u_3}, [u_4]_B = {u_4}, [u_5]_B = {u_5}.

In addition, if we let C = {a_3, a_4, a_5}, we have

[u_1]_C = {u_1, u_4}, [u_2]_C = {u_1, u_2, u_4}, [u_3]_C = {u_3}, [u_4]_C = {u_1, u_4}, [u_5]_C = {u_3, u_5}.

It is obvious that U/R_A ≤ U/R_B and U/R_A ≤ U/R_C, which means that the knowledge granule U/R_A is finer than the knowledge granules U/R_B and U/R_C.

Table 1. An ordered information system.

U    a1  a2  a3  a4  a5
u1   1   0   2   1   1
u2   2   1   0   1   0
u3   2   1   2   0   2
u4   1   2   2   1   1
u5   2   2   0   0   2

3. Operators among Knowledge Granules

From Section 2, we can see that the set representation method of knowledge granules does not reflect all the information included in the knowledge granules, because two different objects may have the same neighborhood. However, the neighborhood of each object plays an important role in the knowledge granule U/R_B. In order to eliminate these shortcomings of the set representation method, in this section we use a vector representation method to express the knowledge granules.

Definition 3.1 Let (U, A, V, f) be an information system based on a binary relation and B ⊆ A. The vector K_B = ([u_1]_B, [u_2]_B, ..., [u_n]_B) is called the vector representation of the knowledge granule U/R_B. Obviously, the dimension of a knowledge granule is the cardinality of the universe U, and each component is just the neighborhood of the corresponding object of U.
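As a concrete illustration, the neighborhoods of Example 2.1 and the vector representation of Definition 3.1 can be computed mechanically from Table 1. The following Python sketch is ours, not part of the original paper; all names (`TABLE`, `neighborhood`, `granule_vector`) are illustrative.

```python
# Sketch of Example 2.1 (Table 1): neighborhoods under the dominance
# relation R_B = {(u_i, u_j) : f(u_j, a) >= f(u_i, a) for all a in B},
# and the vector representation K_B of Definition 3.1.

TABLE = {  # rows of Table 1: object -> attribute values (a1..a5)
    "u1": (1, 0, 2, 1, 1),
    "u2": (2, 1, 0, 1, 0),
    "u3": (2, 1, 2, 0, 2),
    "u4": (1, 2, 2, 1, 1),
    "u5": (2, 2, 0, 0, 2),
}

def neighborhood(ui, attrs, table=TABLE):
    """[u_i]_B: the objects whose values dominate u_i on every attribute in attrs."""
    return frozenset(
        uj for uj, row in table.items()
        if all(row[a] >= table[ui][a] for a in attrs)
    )

def granule_vector(attrs, table=TABLE):
    """K_B = ([u_1]_B, ..., [u_n]_B), one neighborhood per object of U."""
    return tuple(neighborhood(ui, attrs, table) for ui in sorted(table))

A = (0, 1, 2, 3, 4)   # attribute index sets: A = {a1,...,a5}
B = (0, 1, 2)         # B = {a1, a2, a3}
C = (2, 3, 4)         # C = {a3, a4, a5}

K_A, K_B, K_C = granule_vector(A), granule_vector(B), granule_vector(C)
```

Running the sketch reproduces the neighborhoods listed in Example 2.1, e.g. K_A = ({u1,u4}, {u2}, {u3}, {u4}, {u5}), and confirms componentwise that K_A is finer than K_B and K_C.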
Generally, many knowledge granules may exist with respect to a given information system. For simplicity, the notation "GS", called the "granular cluster", is used to denote the set of all knowledge granules of the information system (U, A, V, f), i.e.,

GS = {K_{B_i} : B_i ∈ 2^A, i = 1, 2, ..., 2^{|A|}}.

Moreover, a knowledge granule is denoted by K_∅ if and only if the corresponding general binary relation, denoted by R_∅, is the empty relation; in this case K_∅ = (∅, ∅, ..., ∅). Correspondingly, a knowledge granule is denoted by K_δ if and only if the corresponding general binary relation, denoted by R_δ, is the full relation U × U; in this case K_δ = (U, U, ..., U). In addition, the knowledge granule K_Δ = ({u_1}, {u_2}, ..., {u_n}) is the one whose corresponding general binary relation R_Δ is the identity relation.

Definition 3.2 Let (U, A, V, f) be an information system based on a binary relation and K_{B_1}, K_{B_2} ∈ GS.
1) If [u_i]_{B_1} = [u_i]_{B_2} for every u_i ∈ U, then we say that K_{B_1} is equal to K_{B_2}, denoted by K_{B_1} = K_{B_2}.
2) If [u_i]_{B_1} ⊆ [u_i]_{B_2} for every u_i ∈ U, then we say that K_{B_1} is finer than K_{B_2}, denoted by K_{B_1} ≤ K_{B_2} or K_{B_2} ≥ K_{B_1}.
3) If K_{B_1} ≤ K_{B_2} and [u_j]_{B_1} ≠ [u_j]_{B_2} for some u_j ∈ U, then we say that K_{B_1} is properly finer than K_{B_2}, denoted by K_{B_1} < K_{B_2} or K_{B_2} > K_{B_1}.
4) If [u_i]_{B_1} ⊂ [u_i]_{B_2} for some u_i ∈ U but [u_j]_{B_2} ⊂ [u_j]_{B_1} for other u_j ∈ U, then we say that the relation between K_{B_1} and K_{B_2} is vague, denoted by K_{B_1} ∥ K_{B_2}.

From the above, there exist four relations (equal, finer, properly finer and vague) among knowledge granules in the granular cluster GS.
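The four relations of Definition 3.2 can be decided componentwise. The following Python sketch (ours; the function name `compare` and the hard-coded granules, taken from Examples 2.1 and 3.1, are illustrative) classifies a pair of knowledge granules, including the two extreme granules K_∅ and K_δ of the granular cluster.

```python
# Sketch of Definition 3.2: comparing two knowledge granules, each given
# as a tuple of neighborhoods (frozensets), componentwise.

def compare(K1, K2):
    """Return 'equal', 'properly finer', 'coarser', or 'vague' for K1 vs K2."""
    le = all(x <= y for x, y in zip(K1, K2))   # every [u_i]_1 is a subset of [u_i]_2
    ge = all(x >= y for x, y in zip(K1, K2))
    if le and ge:
        return "equal"
    if le:
        return "properly finer"   # finer, strictly so in some component
    if ge:
        return "coarser"
    return "vague"                # Definition 3.2 (4): neither direction holds

U = frozenset({"u1", "u2", "u3", "u4", "u5"})

# Granules of Examples 2.1 / 3.1, plus the extreme granules of GS.
K_A = tuple(map(frozenset, [{"u1","u4"}, {"u2"}, {"u3"}, {"u4"}, {"u5"}]))
K_B = tuple(map(frozenset, [{"u1","u3","u4"}, {"u2","u3","u5"}, {"u3"}, {"u4"}, {"u5"}]))
K_C = tuple(map(frozenset, [{"u1","u4"}, {"u1","u2","u4"}, {"u3"}, {"u1","u4"}, {"u3","u5"}]))
K_empty = (frozenset(),) * 5
K_delta = (U,) * 5
```

For instance, K_A is properly finer than K_B, while K_B and K_C are incomparable (vague), which is exactly the case the relation "⪯" of Section 5 is designed to handle.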
Theorem 3.1 (GS, ≤) is a poset.
Proof.
1) (Reflexivity) For any K_B ∈ GS, K_B = ([u_1]_B, ..., [u_n]_B). Since [u_i]_B ⊆ [u_i]_B for every u_i ∈ U, we have K_B ≤ K_B.
2) (Antisymmetry) Let K_B, K_C ∈ GS. If K_B ≤ K_C and K_C ≤ K_B, then [u_i]_B ⊆ [u_i]_C and [u_i]_C ⊆ [u_i]_B for every u_i ∈ U. By Definition 3.2, [u_i]_B = [u_i]_C, that is, K_B = K_C.
3) (Transitivity) Let K_{B_1}, K_{B_2}, K_{B_3} ∈ GS. If K_{B_1} ≤ K_{B_2} and K_{B_2} ≤ K_{B_3}, then [u_i]_{B_1} ⊆ [u_i]_{B_2} and [u_i]_{B_2} ⊆ [u_i]_{B_3} for every u_i ∈ U. By Definition 3.2, we have [u_i]_{B_1} ⊆ [u_i]_{B_3}, i.e., K_{B_1} ≤ K_{B_3}. □

For a given information system, the knowledge granules can be induced by some relations, and they can also be obtained from known knowledge granules by operations. Hence operators, as basic mathematical concepts, have to be considered. Operators in information systems can be divided into two types: operators among neighborhoods of objects, and operators among knowledge granules. The former (∪, ∩, −, ~) are based on classical sets, while the latter are performed through knowledge resolving or knowledge composing in essence. Therefore, operators among knowledge granules should be proposed.

Definition 3.3 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS. Four operators, denoted by ∪, ∩, − and c, are defined as follows:
(1) K_B ∪ K_C = ([u_1]_B ∪ [u_1]_C, [u_2]_B ∪ [u_2]_C, ..., [u_n]_B ∪ [u_n]_C);
(2) K_B ∩ K_C = ([u_1]_B ∩ [u_1]_C, [u_2]_B ∩ [u_2]_C, ..., [u_n]_B ∩ [u_n]_C);
(3) K_B − K_C = ([u_1]_B − [u_1]_C, [u_2]_B − [u_2]_C, ..., [u_n]_B − [u_n]_C);
(4) K_B^c = (~[u_1]_B, ~[u_2]_B, ..., ~[u_n]_B), where ~[u_i]_B = U − [u_i]_B.

One knowledge granule can be generated from two or more different knowledge granules in the granular cluster of an information system. For example, if there exists another knowledge granule K_B ∈ GS such that K_B = K_{B_1} ∪ K_{B_2} for K_{B_1}, K_{B_2} ∈ GS, then we say that the knowledge granule K_B is generated from K_{B_1} and K_{B_2}; similarly for the instance K_{B_1} ∩ K_{B_2}.

Proposition 3.1 Let (U, A, V, f) be an information system based on a binary relation and K_{B_1}, K_{B_2}, K_{B_3} ∈ GS. Then the operators ∪, ∩ and c satisfy the following properties.
(1) K_{B_1} ∪ K_{B_1} = K_{B_1}, K_{B_1} ∩ K_{B_1} = K_{B_1}. (Law of idempotency)
(2) K_{B_1} ∪ K_{B_2} = K_{B_2} ∪ K_{B_1}, K_{B_1} ∩ K_{B_2} = K_{B_2} ∩ K_{B_1}. (Law of commutation)
(3) (K_{B_1} ∪ K_{B_2}) ∪ K_{B_3} = K_{B_1} ∪ (K_{B_2} ∪ K_{B_3}), (K_{B_1} ∩ K_{B_2}) ∩ K_{B_3} = K_{B_1} ∩ (K_{B_2} ∩ K_{B_3}). (Law of association)
(4) K_{B_1} ∪ (K_{B_1} ∩ K_{B_2}) = K_{B_1}, K_{B_1} ∩ (K_{B_1} ∪ K_{B_2}) = K_{B_1}. (Law of assimilation)
(5) K_{B_1} ∪ (K_{B_2} ∩ K_{B_3}) = (K_{B_1} ∪ K_{B_2}) ∩ (K_{B_1} ∪ K_{B_3}), K_{B_1} ∩ (K_{B_2} ∪ K_{B_3}) = (K_{B_1} ∩ K_{B_2}) ∪ (K_{B_1} ∩ K_{B_3}). (Law of distribution)
(6) (K_{B_1}^c)^c = K_{B_1}. (Law of double negation)
(7) (K_{B_1} ∪ K_{B_2})^c = K_{B_1}^c ∩ K_{B_2}^c, (K_{B_1} ∩ K_{B_2})^c = K_{B_1}^c ∪ K_{B_2}^c. (De Morgan's laws)
(8) K_{B_1} ∩ K_{B_1}^c = K_∅, K_{B_1} ∪ K_{B_1}^c = K_δ. (Law of zero or unity)

By Definition 3.2, one finds that the relation between two knowledge granules cannot always be characterized as finer or coarser; sometimes the relation may be vague in information systems. Therefore, we make a formal convention for the relations among knowledge granules in order to give a clearer explanation.

Definition 3.4 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_{B_1}, K_{B_2} ∈ GS. K_B = finer(K_{B_1}, K_{B_2}) means that either K_B ≤ K_{B_1} or K_B ≤ K_{B_2}. And K_B = coarser(K_{B_1}, K_{B_2}) means that either K_B ≥ K_{B_1} or K_B ≥ K_{B_2}.

From the above, we can obtain the following results.

Proposition 3.2 Let (U, A, V, f) be an information system based on a binary relation and K_{B_1}, K_{B_2} ∈ GS.
(1) K_{B_1} ∩ K_{B_2} = finer(K_{B_1}, K_{B_2}).
(2) K_{B_1} ∪ K_{B_2} = coarser(K_{B_1}, K_{B_2}).

Proposition 3.3 Let (U, A, V, f) be an information system based on a binary relation and K_{B_1}, K_{B_2} ∈ GS.
(1) K_{B_1} ∩ K_∅ = K_∅, and K_{B_1} ∩ K_δ = K_{B_1}.
(2) K_{B_1} ∪ K_∅ = K_{B_1}, and K_{B_1} ∪ K_δ = K_δ.
(3) If K_{B_1} ≤ K_{B_2}, then K_{B_2}^c ≤ K_{B_1}^c.
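The four operators of Definition 3.3 act componentwise on the neighborhoods, so the laws of Proposition 3.1 reduce to the corresponding laws for classical sets. The following Python sketch (ours; the helper names `g_union`, `g_intersection`, `g_difference`, `g_complement` are illustrative) implements the operators on the granules of Example 3.1 and checks De Morgan's laws and the law of zero or unity.

```python
# Sketch of Definition 3.3: componentwise operators on knowledge granules
# represented as tuples of neighborhoods (frozensets).

U = frozenset({"u1", "u2", "u3", "u4", "u5"})

def g_union(K1, K2):        # (1) componentwise union
    return tuple(x | y for x, y in zip(K1, K2))

def g_intersection(K1, K2): # (2) componentwise intersection
    return tuple(x & y for x, y in zip(K1, K2))

def g_difference(K1, K2):   # (3) componentwise set difference
    return tuple(x - y for x, y in zip(K1, K2))

def g_complement(K):        # (4) ~[u_i]_B = U - [u_i]_B
    return tuple(U - x for x in K)

# Granules of Example 3.1.
K_A = tuple(map(frozenset, [{"u1","u4"}, {"u2"}, {"u3"}, {"u4"}, {"u5"}]))
K_B = tuple(map(frozenset, [{"u1","u3","u4"}, {"u2","u3","u5"}, {"u3"}, {"u4"}, {"u5"}]))
K_C = tuple(map(frozenset, [{"u1","u4"}, {"u1","u2","u4"}, {"u3"}, {"u1","u4"}, {"u3","u5"}]))
```

In particular, the sketch confirms the computation of Example 3.1 that K_B ∩ K_C equals K_A.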
Theorem 3.2 Let (U, A, V, f) be an information system based on a binary relation and K_{B_1}, K_{B_2} ∈ GS. Then we have:
(1) K_{B_1} ∩ K_{B_2} = K_{B_1} if and only if K_{B_1} ≤ K_{B_2};
(2) K_{B_1} ∪ K_{B_2} = K_{B_1} if and only if K_{B_2} ≤ K_{B_1}.
Proof. It can be proved by Definitions 3.2 and 3.3. □

Theorem 3.3 Let (U, A, V, f) be an information system based on a binary relation. Then (GS, ∪, ∩) is an assignment lattice.
Proof. By Theorem 3.1, (GS, ≤) is a poset. Commutativity, associativity and distributivity, i.e., items (1), (2) and (4) of Definition 2.5, follow directly from (2), (3) and (5) of Proposition 3.1. In addition, for any K_B, K_C ∈ GS and any u_i ∈ U, the absorption laws hold componentwise: [u_i]_B ∪ ([u_i]_B ∩ [u_i]_C) = [u_i]_B and [u_i]_B ∩ ([u_i]_B ∪ [u_i]_C) = [u_i]_B. Thus (GS, ∪, ∩) is an assignment lattice. □

Theorem 3.4 Let (U, A, V, f) be an information system based on a binary relation. Then (GS, ∪, ∩, c) is a complemented lattice.
Proof. By Theorem 3.3, (GS, ∪, ∩) is an assignment lattice. From (6) of Proposition 3.1, one gets (K_B^c)^c = K_B. Moreover, by (4) of Definition 3.3, if K_B ≤ K_C then U − [u_i]_C ⊆ U − [u_i]_B for every u_i ∈ U, i.e., K_C^c ≤ K_B^c. The theorem is proved. □

Example 3.1 (continued from Example 2.1) From the vector representation method we have

K_A = ({u_1, u_4}, {u_2}, {u_3}, {u_4}, {u_5}),
K_B = ({u_1, u_3, u_4}, {u_2, u_3, u_5}, {u_3}, {u_4}, {u_5}),
K_C = ({u_1, u_4}, {u_1, u_2, u_4}, {u_3}, {u_1, u_4}, {u_3, u_5}).

Thus, by calculating, we obtain

K_B ∩ K_C = ({u_1, u_4}, {u_2}, {u_3}, {u_4}, {u_5}),
K_B ∪ K_C = ({u_1, u_3, u_4}, U, {u_3}, {u_1, u_4}, {u_3, u_5}).

So the following is obvious:

K_B ∩ K_C = finer(K_B, K_C),
K_B ∪ K_C = coarser(K_B, K_C).

Furthermore, by the definition of "c" we have

K_A^c = ({u_2, u_3, u_5}, {u_1, u_3, u_4, u_5}, {u_1, u_2, u_4, u_5}, {u_1, u_2, u_3, u_5}, {u_1, u_2, u_3, u_4}),
K_B^c = ({u_2, u_5}, {u_1, u_4}, {u_1, u_2, u_4, u_5}, {u_1, u_2, u_3, u_5}, {u_1, u_2, u_3, u_4}).

Obviously, K_B^c ≤ K_A^c.

4. Granular Space

Yao [30] proposed the concept of set closeness between two classical sets to measure their degree of sameness. Following this idea, the distance between two different knowledge granules is investigated in this section.
Moreover, we construct a granular space with this distance to characterize the relationship among knowledge granules by the vector representation method.

Definition 4.1 Let (U, A, V, f) be an information system based on a binary relation, and let h be a map from GS to [0, 1]^n. For any K_B ∈ GS, we define

h(K_B) = (h_B(u_1), h_B(u_2), ..., h_B(u_n)), where h_B(u_i) = |[u_i]_B| / |U|.

The vector h(K_B) is called the granular vector of the knowledge granule K_B.

Definition 4.2 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS. Four operators among granular vectors, denoted by ∪, ∩, c and \, can be defined as the granular vectors of the corresponding operated knowledge granules:
(1) h(K_B) ∪ h(K_C) = h(K_B ∪ K_C);
(2) h(K_B) ∩ h(K_C) = h(K_B ∩ K_C);
(3) h(K_B)^c = h(K_B^c);
(4) h(K_B) \ h(K_C) = h(K_B − K_C).

Next, a granular distance is proposed to measure the relationship between any two knowledge granules.

Definition 4.3 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS. The granular distance between K_B and K_C, denoted by d_p(K_B, K_C), is defined as

d_p(K_B, K_C) = ( (1/|U|) Σ_{u_i ∈ U} |h_B(u_i) − h_C(u_i)|^p )^{1/p}
and p > 0. So d_p is a (normalized) Minkowski distance [8]. In particular, if p = 1, then d_1 is the Hamming distance

d_1(K_B, K_C) = (1/|U|) Σ_{u_i ∈ U} |h_B(u_i) − h_C(u_i)|,

and if p = 2, then d_2 is the Euclidean distance

d_2(K_B, K_C) = ( (1/|U|) Σ_{u_i ∈ U} (h_B(u_i) − h_C(u_i))^2 )^{1/2}.

The granular distance has the following important properties.

Theorem 4.1 (Nonnegativity) Let (U, A, V, f) be an information system based on a binary relation. Then d_p(K_B, K_C) ≥ 0 holds for any K_B, K_C ∈ GS.
Proof. Let K_B, K_C ∈ GS. We have |h_B(u_i) − h_C(u_i)| ≥ 0 for every u_i ∈ U. Then

d_p(K_B, K_C) = ( (1/|U|) Σ_{u_i ∈ U} |h_B(u_i) − h_C(u_i)|^p )^{1/p} ≥ 0.

The theorem is proved. □

Theorem 4.2 (Symmetry) Let (U, A, V, f) be an information system based on a binary relation. Then d_p(K_B, K_C) = d_p(K_C, K_B) holds for any K_B, K_C ∈ GS.
Proof. Let K_B, K_C ∈ GS. Since |h_B(u_i) − h_C(u_i)| = |h_C(u_i) − h_B(u_i)| for every u_i ∈ U, we have

d_p(K_B, K_C) = ( (1/|U|) Σ_{u_i ∈ U} |h_B(u_i) − h_C(u_i)|^p )^{1/p} = ( (1/|U|) Σ_{u_i ∈ U} |h_C(u_i) − h_B(u_i)|^p )^{1/p} = d_p(K_C, K_B).

The theorem is proved. □

Theorem 4.3 (Monotonicity) Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS.
(1) If K_B ≤ K_C, then d_p(K_∅, K_B) ≤ d_p(K_∅, K_C).
(2) If K_B ≤ K_C, then d_p(K_δ, K_B) ≥ d_p(K_δ, K_C).
Proof. (1) Since K_B ≤ K_C, we have [u_i]_B ⊆ [u_i]_C, hence

0 ≤ h_B(u_i) ≤ h_C(u_i), i = 1, 2, ..., |U|.

Since h_∅(u_i) = 0 for every u_i ∈ U, we obtain

d_p(K_∅, K_B) = ( (1/|U|) Σ_{u_i ∈ U} h_B(u_i)^p )^{1/p} ≤ ( (1/|U|) Σ_{u_i ∈ U} h_C(u_i)^p )^{1/p} = d_p(K_∅, K_C).

(2) It can be proved in the same way as (1), using h_δ(u_i) = 1. □

Theorem 4.4 (Invariability) Let (U, A, V, f) be an information system based on a binary relation. Then d_p(K_B^c, K_C^c) = d_p(K_B, K_C) holds for any K_B, K_C ∈ GS.
Proof. By Definitions 3.3, 4.1 and 4.2, we have

h(K_B^c) = (1 − h_B(u_1), 1 − h_B(u_2), ..., 1 − h_B(u_n)),
h(K_C^c) = (1 − h_C(u_1), 1 − h_C(u_2), ..., 1 − h_C(u_n)).

Therefore,

d_p(K_B^c, K_C^c) = ( (1/|U|) Σ_{u_i ∈ U} |(1 − h_B(u_i)) − (1 − h_C(u_i))|^p )^{1/p} = ( (1/|U|) Σ_{u_i ∈ U} |h_B(u_i) − h_C(u_i)|^p )^{1/p} = d_p(K_B, K_C).

The theorem is proved. □

Theorem 4.5 (Boundedness) Let (U, A, V, f) be an information system based on a binary relation. Then 0 ≤ d_p(K_B, K_C) ≤ 1 holds for any K_B, K_C ∈ GS.
Proof. For any u_i ∈ U, we have 0 ≤ h_B(u_i) ≤ 1 and 0 ≤ h_C(u_i) ≤ 1. That is, 0 ≤ |h_B(u_i) − h_C(u_i)| ≤ 1. Therefore,

0 ≤ ( (1/|U|) Σ_{u_i ∈ U} |h_B(u_i) − h_C(u_i)|^p )^{1/p} ≤ 1.
The theorem is proved. □

In particular, one can obtain the following properties.

Proposition 4.1 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS. Then d_p(K_B, K_C) = 0 if and only if h_B(u_i) = h_C(u_i) for every u_i ∈ U, i.e., h(K_B) = h(K_C).

Proposition 4.2 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS. Then d_p(K_B, K_C) = 1 if and only if, for some u_i ∈ U, h_B(u_i) = 1 and h_C(u_i) = 0, while for the other u_j ∈ U, h_B(u_j) = 0 and h_C(u_j) = 1.

Theorem 4.6 (Triangle inequality) Let (U, A, V, f) be an information system based on a binary relation and K_{B_1}, K_{B_2}, K_{B_3} ∈ GS. Then we have
d_p(K_{B_1}, K_{B_3}) ≤ d_p(K_{B_1}, K_{B_2}) + d_p(K_{B_2}, K_{B_3}),
d_p(K_{B_1}, K_{B_2}) ≤ d_p(K_{B_1}, K_{B_3}) + d_p(K_{B_2}, K_{B_3}),
d_p(K_{B_2}, K_{B_3}) ≤ d_p(K_{B_1}, K_{B_2}) + d_p(K_{B_1}, K_{B_3}).

Proof. By the Minkowski inequality ([8], Chap. 6, §1, Th. 4), we have

( Σ_{u_i ∈ U} |h_{B_1}(u_i) − h_{B_3}(u_i)|^p )^{1/p} ≤ ( Σ_{u_i ∈ U} |h_{B_1}(u_i) − h_{B_2}(u_i)|^p )^{1/p} + ( Σ_{u_i ∈ U} |h_{B_2}(u_i) − h_{B_3}(u_i)|^p )^{1/p}.

Hence d_p(K_{B_1}, K_{B_3}) ≤ d_p(K_{B_1}, K_{B_2}) + d_p(K_{B_2}, K_{B_3}). Similarly, we have the other two inequalities. The theorem is proved. □

Specially, when p = 1 and p = 2, one can obtain the following properties.

Corollary 4.1 Let (U, A, V, f) be an information system based on a binary relation and B_1, B_2, B_3 ⊆ A. If K_{B_1} ≤ K_{B_2} ≤ K_{B_3}, then

d_1(K_{B_1}, K_{B_3}) = d_1(K_{B_1}, K_{B_2}) + d_1(K_{B_2}, K_{B_3}).

Corollary 4.2 Let (U, A, V, f) be an information system based on a binary relation. Then we have

d_1(K_∅, K_B) + d_1(K_B, K_δ) = 1.

Corollary 4.3 Let (U, A, V, f) be an information system based on a binary relation. We can obtain the following properties:
(1) d_1(K_Δ, K_δ) = 1 − 1/|U|;
(2) d_1(K_∅, K_δ) = 1;
(3) d_1(K_∅, K_Δ) = 1/|U|.

Corollary 4.4 Let (U, A, V, f) be an information system based on a binary relation, and K_{B_1}, K_{B_2}, K_{B_3} ∈ GS. If there exist two nonnegative real numbers c_1 and c_2 which satisfy
(1) both of them are not equal to zero at the same time;
(2) c_1 (h_{B_2}(u_i) − h_{B_1}(u_i)) = c_2 (h_{B_3}(u_i) − h_{B_2}(u_i)) for any u_i ∈ U;
then

d_2(K_{B_1}, K_{B_3}) = d_2(K_{B_1}, K_{B_2}) + d_2(K_{B_2}, K_{B_3}).

Theorem 4.7 Let (U, A, V, f) be an information system based on a binary relation. Then (GS, d_p) is a distance space.
Proof. (1) The properties of nonnegativity and symmetry were proved in Theorems 4.1 and 4.2. (2) The triangle inequality was proved in Theorem 4.6. The theorem is proved. □

The distance space (GS, d_p) is called a generalized granular space, because every element of GS is a knowledge granule and d_p is the distance between two knowledge granules.

Example 4.1 Let us consider the information system in Example 2.1. By computing, we have

d_1(K_A, K_B) = 3/25, d_1(K_B, K_C) = 3/25, d_1(K_A, K_C) = 4/25,
d_2(K_A, K_B) = 1/5, d_2(K_B, K_C) = √15/25, d_2(K_A, K_C) = √30/25.

When p = 1, 2, the following is obvious.
d_p(K_A, K_B) ≤ d_p(K_A, K_C) + d_p(K_C, K_B),
d_p(K_A, K_C) ≤ d_p(K_A, K_B) + d_p(K_B, K_C),
d_p(K_B, K_C) ≤ d_p(K_A, K_B) + d_p(K_A, K_C).

Furthermore, we can obtain

d_2(K_B^c, K_C^c) = d_2(K_B, K_C) = √15/25.

If we take D = {a_2, a_3, a_4, a_5}, then we can calculate

h(K_D) = (2/5, 2/5, 1/5, 1/5, 1/5),

and the following holds:

d_1(K_A, K_C) = d_1(K_A, K_D) + d_1(K_D, K_C).

5. Knowledge Granularity

Intuitively, knowledge granules can represent the distinguishing ability of the general binary relation based on a set of attributes in an information system. To some extent, the stronger its distinguishing ability is, the smaller the cardinality of every object's neighborhood. However, it is difficult to qualitatively depict the distinguishing ability of some binary relations when the relation among knowledge granules is vague. The reason is that the partial order "≤" only considers the inclusion relation of the neighborhoods with the same index in the knowledge granules. In order to eliminate these limitations and to discover the essence of distinguishing ability, a new binary
relation between knowledge granules, denoted by "⪯", is first introduced in this section. In brief, K_C^* = ([u_{i_1}]_C, [u_{i_2}]_C, ..., [u_{i_n}]_C) is applied to denote one of the new sequences (rearrangements) of K_C = ([u_1]_C, [u_2]_C, ..., [u_n]_C), where n is the cardinality of U.

Definition 5.1 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS. The new binary relation "⪯" between knowledge granules is defined as follows.
(1) If there is a rearrangement K_C^* of K_C such that |[u_j]_B| ≤ |[u_{i_j}]_C| for every j = 1, 2, ..., n, then we say that K_B is finer than K_C, denoted by K_B ⪯ K_C.
(2) If there is a rearrangement K_C^* of K_C such that |[u_j]_B| ≤ |[u_{i_j}]_C| for every j = 1, 2, ..., n and |[u_l]_B| < |[u_{i_l}]_C| for some l, then we say that K_B is properly finer than K_C, denoted by K_B ≺ K_C.
(3) If there is a rearrangement K_C^* of K_C such that |[u_j]_B| = |[u_{i_j}]_C| for every j = 1, 2, ..., n, then we say that K_B is roughly equal to K_C, denoted by K_B ≈ K_C.
(4) If |[u_j]_B| ≠ |[u_{i_j}]_C| for every j = 1, 2, ..., n, then we say that K_B is identically unequal to K_C, denoted by K_B ≉ K_C.

Theorem 5.1 Let (U, A, V, f) be an information system based on a binary relation. Then (GS, ⪯) is a poset.
Proof. The proof is similar to that of Theorem 3.1. □

From the above, one finds that the binary relation "⪯" compares two knowledge granules through the cardinalities of their neighborhoods. In a broad sense, it overcomes the limitations of the partial order "≤". From the description of "⪯", we have the following properties.

Corollary 5.1 The partial order "≤" is a special instance of the relation "⪯".

Corollary 5.2 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS. We have
(1) K_B ∩ K_C = finer(K_B, K_C);
(2) K_B ∪ K_C = coarser(K_B, K_C).

Definition 5.2 Let (U, A, V, f) be an information system based on a binary relation. For any K_B ∈ GS, if there exists a real number G_r(K_B) satisfying
(1) G_r(K_B) ≥ 0; (Nonnegativity)
(2) for any K_C ∈ GS, G_r(K_B) = G_r(K_C) if K_B ≈ K_C; (Invariability)
(3) for any K_C, K_D ∈ GS, G_r(K_C) ≤ G_r(K_D) if K_C ⪯ K_D; (Monotonicity)
then G_r(K_B) is called a knowledge granularity of K_B with respect to B in the information system.

Theorem 5.2 Let (U, A, V, f) be an information system based on a binary relation and K_B, K_C ∈ GS. Then K_B ≈ K_C if and only if G_r(K_B) = G_r(K_C).
Proof.
It can be proved directly by Definition 5.2. □

Corollary 5.3 Let (U, A, V, f) be an information system based on a binary relation, and K_B, K_C ∈ GS. If K_B ⪯ K_C, then G_r(K_B) ≤ G_r(K_C).

Remark 2 From G_r(K_B) = G_r(K_C), one cannot always obtain K_B = K_C.

Example 5.1 Consider an information system (U, A, V, f) with U = {u_1, u_2, u_3, u_4, u_5}. Take U/R_B = {{u_1, u_2}, {u_3}, {u_4, u_5}} and U/R_C = {{u_1}, {u_2, u_3}, {u_4, u_5}}. Obviously G_r(K_B) = G_r(K_C), but K_B is not equal to K_C. In fact, we have K_B ≈ K_C.

Theorem 5.3 (Boundedness) Let (U, A, V, f) be an information system based on a binary relation. Then G_r(K_∅) ≤ G_r(K_B) ≤ G_r(K_δ) holds for any K_B ∈ GS.
Proof. From Definition 3.2 and Corollary 5.1, we have K_∅ ⪯ K_B ⪯ K_δ. Hence G_r(K_∅) ≤ G_r(K_B) ≤ G_r(K_δ) holds by (3) of Definition 5.2. The theorem is proved. □

From the above, one finds that any knowledge granule has its upper bound and lower bound in an information system (U, A, V, f). In particular, if K_∅, K_δ ∈ GS, then the following properties hold.

Theorem 5.4 (Extremum) Let (U, A, V, f) be an information system based on a binary relation and K_B ∈ GS. The following results are true.
(1) G_r(K_B) = max_{K_C ∈ GS} G_r(K_C) if and only if K_B ≈ K_δ.
(2) G_r(K_B) = min_{K_C ∈ GS} G_r(K_C) if and only if K_B ≈ K_∅.
Proof. (1) Obviously, for any K_C ∈ GS, G_r(K_C) ≤ G_r(K_δ). If K_B ≈ K_δ, then G_r(K_B) = G_r(K_δ) = max_{K_C ∈ GS} G_r(K_C). Conversely, if G_r(K_B) = max_{K_C ∈ GS} G_r(K_C), then G_r(K_C) ≤ G_r(K_B) for every K_C ∈ GS; in particular, G_r(K_δ) ≤ G_r(K_B), so G_r(K_B) = G_r(K_δ) by Definition 5.2. From Definition 5.1, K_B ≈ K_δ holds.
(2) The proof is similar to (1). The theorem is proved. □

Theorem 5.5 (Knowledge composing) Let (U, A, V, f) be an information system based on a binary relation and K_{B_i} ∈ GS, i = 1, 2, .... If a new knowledge granule, denoted by K_B, can be composed from the K_{B_i}, i.e., K_B = ∪_i K_{B_i}, then we have G_r(K_{B_i}) ≤ G_r(K_B) for every i.
Proof. Since K_B = ∪_i K_{B_i}, we have K_{B_i} ≤ K_B. So we obtain G_r(K_{B_i}) ≤ G_r(K_B) by Definition 5.2. The theorem is proved. □

Theorem 5.6 (Knowledge resolving) Let (U, A, V, f) be an information system based on a binary relation and K_B ∈ GS. If the knowledge granule K_B can be resolved into two or more new knowledge granules, denoted by K_{B_i}, i = 1, 2, ..., that is to say, K_B = ∩_i K_{B_i}, then we have G_r(K_B) ≤ G_r(K_{B_i}) for every i.
Proof. The proof is similar to that of Theorem 5.5. □

Definition 5.3 ([24]) Let (U, A, V, f) be an information system based on a binary relation and K_B ∈ GS. The knowledge granulation of the knowledge granule K_B, denoted by G_M(K_B), can be defined as

G_M(K_B) = (1/|U|^2) Σ_{u_i ∈ U} |[u_i]_B|.

From the above, we can see that G_M still satisfies all the properties from Theorem 5.2 to Theorem 5.6.

Corollary 5.4 Let (U, A, V, f) be an information system based on a binary relation and B ⊆ A. Then G_M in Definition 5.3 is a knowledge granularity under Definition 5.2.

Definition 5.4 ([24]) Let (U, A, V, f) be an information system based on a binary relation and K_B ∈ GS. The rough entropy of the knowledge granule K_B, denoted by E_r(K_B), can be defined as

E_r(K_B) = − Σ_{u_i ∈ U} (1/|U|) log_2 (1/|[u_i]_B|).

By the definition of rough entropy, we can see that E_r still satisfies all the properties from Theorem 5.2 to Theorem 5.6.

Corollary 5.5 Let (U, A, V, f) be an information system based on a binary relation and K_B ∈ GS. Then E_r in Definition 5.4 is a knowledge granularity under Definition 5.2.

Example 5.2 (continued from Example 2.1) By computing, we obtain the knowledge granularities after applying the operators ∪ and ∩ to the different knowledge granules in the system of Example 2.1, which are expressed in Tables 2 and 3, respectively.

Table 2. Operation on ∪.

       K_A    K_B    K_C
K_A    0.24   0.36   0.40
K_B    0.36   0.36   0.52
K_C    0.40   0.52   0.40

Obviously, the following holds:

G_M(K_B ∩ K_C) = G_M(K_A) ≤ G_M(K_B) ≤ G_M(K_B ∪ K_C).

In addition, the construction of the knowledge granularity can be illustrated by Figure 1.

Remark 3 In this example we only consider the operations ∪ and ∩, respectively.
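The granularity values in Tables 2 and 3 can be reproduced directly from Definitions 5.3 and 5.4. The following Python sketch is ours; the function names `G_M` and `E_r` mirror the paper's notation but the implementation details are illustrative.

```python
import math

# Sketch of Definition 5.3 (knowledge granulation G_M) and Definition 5.4
# (rough entropy E_r) for a knowledge granule given as a tuple of
# neighborhoods; the numbers reproduce entries of Tables 2 and 3.

def G_M(K, n):
    """G_M(K_B) = (1/|U|^2) * sum_i |[u_i]_B|."""
    return sum(len(x) for x in K) / n ** 2

def E_r(K, n):
    """E_r(K_B) = -(1/|U|) * sum_i log2(1/|[u_i]_B|)."""
    return -sum(math.log2(1 / len(x)) for x in K) / n

# Granules of Example 2.1.
K_A = tuple(map(frozenset, [{"u1","u4"}, {"u2"}, {"u3"}, {"u4"}, {"u5"}]))
K_B = tuple(map(frozenset, [{"u1","u3","u4"}, {"u2","u3","u5"}, {"u3"}, {"u4"}, {"u5"}]))
K_C = tuple(map(frozenset, [{"u1","u4"}, {"u1","u2","u4"}, {"u3"}, {"u1","u4"}, {"u3","u5"}]))

union = tuple(x | y for x, y in zip(K_B, K_C))  # K_B ∪ K_C
inter = tuple(x & y for x, y in zip(K_B, K_C))  # K_B ∩ K_C
```

For instance, G_M(K_A) = 6/25 = 0.24 and G_M(K_B ∪ K_C) = 13/25 = 0.52, matching Table 2, while G_M(K_B ∩ K_C) = G_M(K_A) = 0.24, matching Table 3; the rough entropy E_r grows in the same direction as G_M on these granules.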
Other operations, such as "c" and "−", can be considered similarly.

Table 3. Operation on ∩.

       K_A    K_B    K_C
K_A    0.24   0.24   0.24
K_B    0.24   0.36   0.24
K_C    0.24   0.24   0.40

Figure 1. Construction of knowledge granularity in Example 5.2.

6. Case Study

In order to illustrate the vector representation method of knowledge granules and the knowledge granularity, the information system about CTR [38] (Car Test Results), see Table 4, is introduced in this section. The set of attributes {a_1, a_2, ..., a_9} of the system is shown as follows.

Table 4. CTR (Car Test Results).

U/A  a1  a2  a3  a4  a5  a6  a7  a8  a9
u1   c   6   y   E   m   h   h   a   m
u2   c   6   n   E   m   m   h   ma  m
u3   c   6   n   E   m   h   h   ma  m
u4   c   4   y   E   m   h   h   ma  l
u5   c   6   n   E   m   m   m   ma  m
u6   c   6   n   B   m   m   m   a   he
u7   c   6   n   E   m   m   h   ma  he
u8   s   4   n   B   sm  h   lo  ma  l
a1 — size, overall length; a2 — number of cylinders; a3 — presence of a turbocharger; a4 — type of fuel system; a5 — engine displacement; a6 — compression; a7 — power; a8 — type of transmission; a9 — weight.

The values of the attributes have the following meanings: c — compact, s — subcompact, sm — small, n — no, y — yes, E — EFI, B — 2BBL, m — medium, ma — manual, a — auto, h — high, he — heavy, l — light, lo — low.

Let B = {a_1, a_2, a_3, a_4}; this yields an equivalence relation

R_B = {(u_i, u_j) : f(u_i, a_l) = f(u_j, a_l), for all a_l ∈ B}.

Thus we have

U/R_B = {{u_1}, {u_2, u_3, u_5, u_7}, {u_4}, {u_6}, {u_8}}

and

h(K_B) = (1/8, 4/8, 4/8, 1/8, 4/8, 1/8, 4/8, 1/8).

If we denote C = {a_5, a_6, a_7, a_8, a_9}, then we have

U/R_C = {{u_1}, {u_2}, {u_3}, {u_4}, {u_5}, {u_6}, {u_7}, {u_8}}

and

h(K_C) = (1/8, 1/8, 1/8, 1/8, 1/8, 1/8, 1/8, 1/8).

Moreover, if we denote D = {a_3, a_4, a_5, a_6}, then we have

U/R_D = {{u_1, u_4}, {u_2, u_5, u_7}, {u_3}, {u_6}, {u_8}}

and

h(K_D) = (2/8, 3/8, 1/8, 2/8, 3/8, 1/8, 3/8, 1/8).

By computing, we can obtain the following properties.
(1) K_C ≤ K_B, K_C ≤ K_D, and the relation between K_B and K_D is vague.
(2) K_B ∩ K_C = K_C, K_B ∪ K_C = K_B.
(3) K_B ∩ K_D = finer(K_B, K_D).
(4) K_B ∪ K_D = coarser(K_B, K_D).

And we have

d_1(K_B, K_C) = 0.1875, d_1(K_B, K_D) = 0.1250, d_1(K_C, K_D) = 0.1250,
d_2(K_B, K_C) ≈ 0.2652, d_2(K_B, K_D) ≈ 0.1654, d_2(K_C, K_D) ≈ 0.1654,

and

h(K_B^c) = (7/8, 4/8, 4/8, 7/8, 4/8, 7/8, 4/8, 7/8),
h(K_D^c) = (6/8, 5/8, 7/8, 6/8, 5/8, 7/8, 5/8, 7/8).

Obviously, the following hold.
(5) d_1(K_B^c, K_D^c) = 0.1250 = d_1(K_B, K_D).
(6) d_i(K_B, K_C) ≤ d_i(K_B, K_D) + d_i(K_C, K_D), d_i(K_B, K_D) ≤ d_i(K_B, K_C) + d_i(K_D, K_C), and d_i(K_C, K_D) ≤ d_i(K_B, K_C) + d_i(K_B, K_D), where i = 1, 2.

Moreover, by computing we can obtain Table 5 about the knowledge granularity of the system in Table 4. In what follows, Figure 2 is given to illustrate the construction of the knowledge granules mentioned in the system about CTR.

7. Conclusions

Rough set theory is a powerful soft computing tool for dealing with uncertain and imprecise information. How to represent knowledge granules in information systems based on binary relation is one of the important research

Table 5. Knowledge granularity of the system in Table 4.
K*         K_B     K_C     K_D     K_B ∩ K_D   K_B ∪ K_D
G_M(K*)    0.3125  0.1250  0.2500  0.2188      0.3438

Figure 2. Construction of knowledge granularity in Example CTR.
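The values of the CTR case study can be checked end to end from Table 4. The following Python sketch is ours; the dictionary `CTR` transcribes Table 4, and the helper names (`h_vector`, `d_p`, `G_M`) are illustrative. Note that G_M can be computed from the granular vector, since (1/|U|^2) Σ |[u_i]_B| = (1/|U|) Σ h_B(u_i).

```python
# Sketch reproducing the CTR case study: equivalence classes, granular
# vectors h_B(u_i) = |[u_i]_B| / |U|, the Hamming/Euclidean granular
# distances of Definition 4.3, and the granularities of Table 5.

CTR = {  # Table 4 rows: object -> (a1, ..., a9)
    "u1": ("c", 6, "y", "E", "m", "h", "h", "a", "m"),
    "u2": ("c", 6, "n", "E", "m", "m", "h", "ma", "m"),
    "u3": ("c", 6, "n", "E", "m", "h", "h", "ma", "m"),
    "u4": ("c", 4, "y", "E", "m", "h", "h", "ma", "l"),
    "u5": ("c", 6, "n", "E", "m", "m", "m", "ma", "m"),
    "u6": ("c", 6, "n", "B", "m", "m", "m", "a", "he"),
    "u7": ("c", 6, "n", "E", "m", "m", "h", "ma", "he"),
    "u8": ("s", 4, "n", "B", "sm", "h", "lo", "ma", "l"),
}
N = len(CTR)

def h_vector(attrs):
    """Granular vector under the equivalence relation R_B on attrs."""
    def cls_size(ui):
        return sum(all(CTR[uj][a] == CTR[ui][a] for a in attrs) for uj in CTR)
    return tuple(cls_size(ui) / N for ui in sorted(CTR))

def d_p(h1, h2, p):
    """Granular distance of Definition 4.3 (normalized Minkowski form)."""
    return (sum(abs(a - b) ** p for a, b in zip(h1, h2)) / N) ** (1 / p)

def G_M(h):
    """Knowledge granulation via the granular vector: (1/|U|) * sum_i h(u_i)."""
    return sum(h) / N

hB = h_vector((0, 1, 2, 3))     # B = {a1, ..., a4}
hC = h_vector((4, 5, 6, 7, 8))  # C = {a5, ..., a9}
hD = h_vector((2, 3, 4, 5))     # D = {a3, ..., a6}
```

Running the sketch reproduces d_1(K_B, K_C) = 0.1875, d_1(K_B, K_D) = 0.1250, d_2(K_B, K_C) ≈ 0.2652, and the Table 5 values G_M(K_B) = 0.3125, G_M(K_C) = 0.1250, G_M(K_D) = 0.2500.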
tasks. In this paper, the vector representation method is proposed to eliminate the limitations of the set representation method, in which case the granular space is constructed by defining the distance between any two knowledge granules. In addition, knowledge granularity is investigated and some of its important properties are discussed carefully. As an application of the knowledge granular space, an example is applied to illustrate the validity of some results obtained in our work.

8. References

[1] T. Beaubouef, F. E. Petry and G. Arora, "Information-Theoretic Measures of Uncertainty for Rough Sets and Rough Relational Databases," Information Sciences, Vol. 109, 1998, pp. 185-195. doi:10.1016/S0020-0255(98)00019-X
[2] T. S. Blyth, "Lattices and Ordered Algebraic Structures," Springer-Verlag, London, 2005.
[3] Z. Bonikowski, E. Bryniarski and U. Wybraniec, "Extensions and Intentions in the Rough Set Theory," Information Sciences, Vol. 107, 1998, pp. 149-167. doi:10.1016/S0020-0255(97)10046-9
[4] S. Greco, B. Matarazzo and R. Slowinski, "Rough Approximation of a Preference Relation by Dominance Relation," European Journal of Operational Research, Vol. 117, 1999, pp. 63-68. doi:10.1016/S0377-2217(98)00127-1
[5] J. R. Hobbs, "Granularity," Proceedings of the Ninth International Joint Conference on Artificial Intelligence, Los Angeles, California, 1985, pp. 432-435.
[6] Q. Hu and D. Yu, "Entropies of Fuzzy Indiscernibility Relation and Its Operations," International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, Vol. 12, No. 5, 2004, pp. 575-589. doi:10.1142/S0218488504003089
[7] Z. J. Jiang and S. L. Sun, "Functional Analysis," Higher Education Press, Beijing, 2005.
[8] Z. J. Jiang and Z. Q. Wu, "Theory of Real Variable Functions," People's Education Press, Beijing, 1973.
[9] G. J. Klir, "Basic Issues of Computing with Granular Probabilities," Proceedings of 1998 IEEE International Conference on Fuzzy Systems, Alaska, USA, 1998, pp. 101-105.
[10] M. Kryszkiewicz, "Rough Set Approach to Incomplete Information Systems," Information Sciences, Vol. 112, 1998, pp. 39-49. doi:10.1016/S0020-0255(98)10019-1
[11] J. Y. Liang, K. S. Chin, C. Y. Dang and C. M. Yam, "A New Method for Measuring Uncertainty and Fuzziness in Rough Set Theory," International Journal of General Systems, Vol. 31, No. 4, 2002, pp. 331-342. doi:10.1080/0308107021000013635
[12] J. Y. Liang and Y. H. Qian, "Axiomatic Approach of Knowledge Granulation in Information System," In: A. Sattar and B. H. Kang (Eds.), Lecture Notes in Computer Science, Springer, Berlin, 2006, pp. 1074-1078.
[13] J. Y. Liang, Z. Z. Shi and D. Y. Li, "Information Entropy, Rough Entropy and Knowledge Granulation in Incomplete Information Systems," International Journal of General Systems, Vol. 35, No. 6, 2006, pp. 641-654. doi:10.1080/03081070600687668
[14] T. Y. Lin, "From Rough Sets and Neighborhood Systems to Information Granulation and Computing with Words," European Congress on Intelligent Techniques and Soft Computing, 1997, pp. 1602-1606.
[15] T. Y. Lin, "Introduction to Special Issues on Data Mining and Granular Computing," International Journal of Approximate Reasoning, Vol. 40, 2005, pp. 1-2. doi:10.1016/j.ijar.2004.11.010
[16] Z. Pawlak, "Rough Sets," International Journal of Computer and Information Sciences, Vol. 11, No. 5, 1982, pp. 341-356.
[17] Z. Pawlak, "Rough Sets: Theoretical Aspects of Reasoning about Data," Kluwer Academic Publishers, Boston, 1991.
[18] Z. Pawlak, "Rough Sets Approach to Multi-Attribute Decision Analysis," European Journal of Operational Research, Vol. 72, 1994, pp. 443-459. doi:10.1016/0377-2217(94)90415-4
[19] Y. H. Qian and J. Y. Liang, "Combination Entropy and Combination Granulation in Rough Set Theory," International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, Vol. 16, No. 2, 2008, pp. 179-193. doi:10.1142/S0218488508005121
[20] C. E. Shannon, "The Mathematical Theory of Communication," The Bell System Technical Journal, Vol. 27, No. 3-4, 1948, pp. 373-423.
[21] A. Skowron and J. Stepaniuk, "Tolerance Approximation Spaces," Fundamenta Informaticae, Vol. 27, 1996, pp. 245-253.
[22] R. W. Swiniarski and A. Skowron, "Rough Set Methods in Feature Selection and Recognition," Pattern Recognition Letters, Vol. 24, No. 6, 2003, pp. 833-849.
[23] W. H. Xu, X. Y. Zhang and W. X. Zhang, "Knowledge Granulation, Knowledge Entropy and Knowledge Uncertainty Measure in Information Systems," Applied Soft Computing, Vol. 9, 2009, pp. 1244-1251. doi:10.1016/j.asoc.2009.03.007
[24] W. H. Xu, J. M. Zhong, X. Y. Zhang and W. X. Zhang, "Attribute Reduction in Ordered Information Systems Based on Evidence Theory," Knowledge and Information Systems, Vol. 25, 2010, pp. 169-184. doi:10.1007/s10115-009-0248-5
[25] W. H. Xu and W. X. Zhang, "Knowledge Reduction and Matrix Computation in Inconsistent Ordered Information Systems," International Journal of Business Intelligence and Data Mining, Vol. 3, 2008, pp. 409-425. doi:10.1504/IJBIDM.2008.022737
[26] W. H. Xu, M. W. Shao and W. X. Zhang, "Knowledge Reduction Based on Evidence Reasoning Theory in Ordered Information Systems," Lecture Notes in Artificial Intelligence, 2006, pp. 535-547.
[27] W. H. Xu and W. X. Zhang, "Methods for Knowledge Reduction in Inconsistent Ordered Information Systems," Journal of Applied Mathematics and Computing, Vol. 26, No. 1-2, 2008, pp. 313-323. doi:10.1007/s12190-007-0014-3
[28] Y. Y. Yao, "Relational Interpretations of Neighborhood Operators and Rough Set Approximation Operators," Information Sciences, Vol. 101, 1998, pp. 239-259.
[29] Y. Y. Yao, "Granular Computing: Basic Issues and Possible Solutions," Proceedings of the Fifth International Conference on Computing and Information, Kuwait, 2000, pp. 186-189.
[30] Y. Y. Yao, "Information Granulation and Rough Set Approximation," International Journal of Intelligent Systems, Vol. 16, No. 1, 2001, pp. 87-104. doi:10.1002/1098-111X(200101)16:1<87::AID-INT7>3.0.CO;2-S
[31] Y. Y. Yao, "A Partition Model of Granular Computing," LNCS Transactions on Rough Sets I, 2004, pp. 232-253. doi:10.1007/978-3-540-27794-1_11
[32] Y. Y. Yao, "Three Perspectives of Granular Computing," Proceedings, International Forum on Theory of GrC from Rough Set Perspective, Journal of Nanchang Institute of Technology, Vol. 25, No. 2, 2006, pp. 16-21.
[33] L. A. Zadeh, "Fuzzy Sets and Information Granularity," Advances in Fuzzy Set Theory and Applications, North-Holland Publishing, Amsterdam, 1979.
[34] L. A. Zadeh, "Towards a Theory of Fuzzy Information Granulation and Its Centrality in Human Reasoning and Fuzzy Logic," Fuzzy Sets and Systems, Vol. 90, 1997, pp. 111-127. doi:10.1016/S0165-0114(97)00077-8
[35] W. X. Zhang and Y. Leung, "Information System and Knowledge Discovery," Science Press, Beijing, 2003.
[36] W. X. Zhang, J. S. Mi and W. Z. Wu, "Knowledge Reductions in Inconsistent Information Systems," International Journal of Intelligent Systems, Vol. 18, 2003, pp. 989-1000. doi:10.1002/int.10128
[37] W. X. Zhang and W. Z. Wu, "An Introduction and a Survey for the Studies of Rough Set Theory," Fuzzy Systems and Mathematics, Vol. 14, No. 4, 2000, pp. 1-12.
[38] W. X. Zhang, Y. Y. Yao and Y. Leung, "Rough Sets and Concept Lattices," Xi'an Jiaotong University Press, Xi'an, 2006.
