International Journal of Computational Intelligence Systems

Volume 10, Issue 1, 2017, Pages 540 - 554

Incremental approximation computation in incomplete ordered decision systems

Authors
Guanglei Gou1, 2, ggl@cqut.edu.cn, Guoyin Wang3, *, wanggy@cqupt.edu.cn
*Corresponding author.
Received 24 October 2016, Accepted 17 December 2016, Available Online 1 January 2017.
DOI
10.2991/ijcis.2017.10.1.37
Keywords
Incomplete Ordered Decision Systems; Confidential dominance relation; Approximations; Incremental updating
Abstract

Approximation computation is a critical step in rough set theory for knowledge discovery and other related tasks. In practical applications, an information system often evolves over time through the variation of attributes or objects, so computing approximations efficiently is vital in data mining. The dominance-based rough set approach can handle information with preference-ordered attribute domains, but it cannot handle missing data. The Confidential Dominance-based Rough Set Approach (CDRSA) is introduced to process Incomplete Ordered Decision Systems (IODSs). This paper focuses on incrementally updating approximations in an IODS under a dynamic environment. With the CDRSA, the principles of incrementally updating approximations are discussed for the variation of attribute sets and for the union of subsets of objects, and the corresponding incremental algorithms are developed. Comparative experiments on UCI data sets show that the proposed incremental approaches improve the performance of updating approximations effectively by significantly shortening the computational time.

Copyright
© 2017, the Authors. Published by Atlantis Press.
Open Access
This is an open access article under the CC BY-NC license (http://creativecommons.org/licences/by-nc/4.0/).

1. Introduction

Rough set theory was proposed by Pawlak1 to deal with inconsistency problems, and it is useful in fields such as knowledge discovery2,3,4, decision analysis5,6, data mining7,8, etc. The Classic Rough Set (CRS) model is based on an equivalence relation and deals with discrete attribute values. However, CRS cannot discover uncertainty from attributes with preference-ordered domains, which is important for multi-criteria decision analysis, e.g., risk evaluation and pollution rating. In order to solve this problem, Greco, Matarazzo and Słowinski proposed the Dominance-based Rough Set Approach (DRSA), which is based on a dominance relation instead of an equivalence relation9,10. DRSA has attracted much attention in recent years and has been applied to multi-criteria classification11, attribute reduction12,13, customer behavior prediction14, bankruptcy risk prediction15, water quality evaluation16, etc.

DRSA was introduced to handle crisp ordered information systems; however, real-life applications face missing data due to various causes. Greco et al. extended the dominance relation, requiring that the referent object has no missing data, to deal with missing data in rough set analysis of multi-attribute and multi-criteria decision problems17,18. Błaszczynski et al. discussed different ways of handling missing values in sorting problems with monotonicity constraints in DRSA19. He et al. investigated an extended dominance relation to discover knowledge from approximations20. Shao and Zhang applied an extended DRSA to reasoning, rule extraction and knowledge reduction in incomplete ordered information systems21. In order to avoid comparing two objects that have no common known attribute value, Hu and Liu discussed a limited extended dominance relation22 and a generalized extended dominance relation23; the generalized extended dominance relation contains the limited extended one as a special case obtained by choosing the parameter. Chen et al. studied the k-degree extended dominance characteristic relation (k-degree EDCR), which considers two cases of missing data, "do not know" and "do not care"24. Du et al. proposed the characteristic-based dominance relation to deal with incomplete ordered information25. Yang et al. proposed a similarity dominance relation and studied knowledge reduction in incomplete ordered information systems26. Luo et al. combined the limited extended dominance relation with the similarity dominance relation and proposed a limited extended dominance relation that considers maximum and minimum values in the dominating or dominated relation when lost values are compared27; it is too strict to be practical because maximum and minimum attribute values are hard to know. Yang et al. introduced a valued dominance relation that considers the probability that one object dominates another given the unknown values28. In order to avoid semantic contradictions on the ordered relation, Gou et al. proposed the confidential dominance relation to extend DRSA to deal with missing data in Incomplete Ordered Decision Systems (IODSs)29.

An information system commonly evolves with time; typically, the set of attributes, the set of objects and the attribute values may change dynamically. The approximations may alter with the variation of the information system. Traditional methods of updating approximations are inefficient because they recompute everything from scratch. Incremental updating is feasible and effective for processing dynamic information since previously computed data structures and knowledge are reused when updating approximations. For the variation of attributes, Chan proposed an incremental learning method for maintaining approximations in CRS when one attribute is added or deleted30. Li et al. proposed an incremental approach for updating approximations under the characteristic-relation-based rough set when some attributes are added or removed in an incomplete information system31. Zhang et al. presented matrix approaches for dynamic attribute variation in set-valued information systems32. Li et al. studied the principles of updating dominating and dominated sets and proposed incremental approaches for updating approximations33. Zhang et al. investigated incremental updating of rough approximations in interval-valued information systems34. Liu et al. presented approaches for incrementally updating approximations in probabilistic rough sets under the variation of attributes35. Yang et al. designed fast algorithms for updating multigranulation rough approximations when granular structures increase36. For the variation of objects, Shan et al. introduced an incremental modification of classification rules37. Zheng and Wang developed a rough set and rule tree based incremental knowledge acquisition algorithm, which updates rules by incrementally adding to and pruning the rule tree38. Zhang et al. designed parallel algorithms based on MapReduce for calculating approximations of rough sets39,40. Luo et al. proposed a matrix approach to decision-theoretic rough sets for evolving data41 and presented efficient updating of probabilistic approximations with incremental objects42. Li et al. developed incremental algorithms for updating approximations of composite rough sets when some objects are added or removed43. Li et al. studied the dynamic maintenance of approximations in DRSA when one object is added or removed44; furthermore, parallel algorithms were developed to speed up the computation of approximations in DRSA45. Błaszczynski and Słowinski proposed an incremental algorithm for rule induction based on Apriori under the variable consistency DRSA46. Wang et al. presented efficient updating of rough approximations with multi-dimensional variation of ordered data47. For the variation of attribute values, Zeng et al. developed dynamic updating of fuzzy rough approximations for hybrid data under the variation of attribute values48. Chen et al. proposed an incremental method for updating approximations in the variable precision rough set model while objects and attribute values change dynamically49. Chen et al. also discussed the principles of maintaining approximations and designed incremental updating algorithms for the case where attribute values are coarsened or refined in an IODS24. However, incremental approaches for updating approximations in an IODS under the variation of attributes or objects have not been considered so far. In real-life applications, such as the monitoring domain, attributes and objects often change dynamically rather than attribute values.
Thus, this paper focuses on incremental methods for updating approximations of the confidential dominance-based rough set under the variation of attributes or objects in an IODS.

The paper is organized as follows. In Section 2, basic concepts of IODSs and the dominance relation are reviewed, and the confidential dominance-based rough set approach is introduced. In Section 3, we discuss the principles of incrementally updating approximations and propose algorithms for the case where some attributes are added to or deleted from the attribute set. In Section 4, an incremental approach for updating approximations is given for the case where subsets of objects are merged. In Section 5, experiments are conducted to evaluate the performance of the proposed approaches. The paper ends with conclusions and future work in Section 6.

2. Confidential Dominance-based Rough Set Approach

In this section, some basic concepts of decision systems, including the IODS and the confidential dominance relation, are introduced.

Definition 1 24

A decision system is a 4-tuple IODS = (U, A, V, f). U is a finite non-empty set of objects, called the universe. A is a non-empty finite set of attributes, A = C ∪ {d}, where C and d denote the set of condition attributes and the decision attribute, respectively. V is the domain of the attributes. The domains of C and d are ranked according to an increasing or decreasing preference; such attributes are called criteria. f : U × A → V is an information function, where f(x_i, a) = v_a (1 ≤ i ≤ |U|) denotes the value of object x_i on attribute a ∈ A, and | · | denotes the cardinality of a set. If all attribute values are known, the decision system is a complete ordered decision system. If any value is missing, it is an incomplete ordered decision system (IODS). All missing values are denoted by "*".
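For illustration, the following Python snippet (an assumption of this presentation, not part of the paper's implementation) shows one simple way to represent an IODS in code: the information function f is a nested dictionary and every missing value "*" is stored as None. The sample rows are x1, x2 and x3 of Table 1 below.

```python
# Illustrative IODS representation (hypothetical structure, not the paper's code).
U = ["x1", "x2", "x3"]        # universe of objects (first three objects of Table 1)
C = ["a1", "a2", "a3"]        # condition criteria, increasing preference
d = "d"                       # decision attribute

# information function f(x, a); None encodes the missing value "*"
f = {
    "x1": {"a1": 4, "a2": 4,    "a3": 3, "d": 4},
    "x2": {"a1": 3, "a2": 2,    "a3": 3, "d": 3},
    "x3": {"a1": 4, "a2": None, "a3": 2, "d": 3},
}
```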

Definition 2 9,10

Let x, y ∈ U, P ⊆ C. If ∀a ∈ P, f(y, a) ⪰ f(x, a), then we write $y D_P x$. The relation $D_P$ is called a dominance relation. Here, f(y, a) ⪰ f(x, a) means that y is at least as good as (outranks) x with respect to criterion a; it is defined by f(y, a) ≥ f(x, a) for an increasing preference ordering and by f(y, a) ≤ f(x, a) for a decreasing one.

Definition 3

Let IODS = (U, A, V, f), P ⊆ C, and $B_P(x) = \{b \in P \mid f(x, b) \neq *\}$. A confidential dominance relation (CDR) is defined as follows.

$$CDR(P) = \{(x, y) \in U^2 \mid B_P(x) \subseteq B_P(y) \wedge \forall q \in P\, ((f(x,q) = * \wedge f(y,q) = *) \vee (f(x,q) = * \wedge f(y,q) \neq *) \vee f(y,q) \succeq f(x,q))\}$$

$y D_P^{CDR} x$ denotes that y dominates x under the CDR, which means that the object y is more complete than the referent x, and y is weakly preferred over x with respect to each criterion q for which the evaluation of the referent x is known.

Definition 4

For P ⊆ C and x ∈ U, $D_P^{CDR+}(x) = \{y \in U \mid y D_P^{CDR} x\}$ is the P confidential dominating set of x, and $D_P^{CDR-}(x) = \{y \in U \mid x D_P^{CDR} y\}$ is the P confidential dominated set of x.
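A minimal Python sketch of the confidential dominance relation and of the confidential dominating/dominated sets follows; it assumes the dictionary representation shown after Definition 1, an increasing preference on every criterion, and illustrative helper names (`known`, `cdr`, `dominating`, `dominated`) that are not taken from the paper.

```python
def known(f, x, P):
    """B_P(x): criteria of P on which object x has a known value."""
    return {a for a in P if f[x][a] is not None}

def cdr(f, P, x, y):
    """True iff (x, y) is in CDR(P), i.e. y confidentially dominates x on P
    (increasing preference assumed for every criterion in P)."""
    if not known(f, x, P) <= known(f, y, P):   # y must be at least as complete as x
        return False
    for q in P:
        vx, vy = f[x][q], f[y][q]
        if vx is None:                         # referent value unknown: no constraint
            continue
        if vy is None or vy < vx:              # y must be known and at least as good
            return False
    return True

def dominating(f, U, P, x):
    """D_P^{CDR+}(x): objects that confidentially dominate x."""
    return {y for y in U if cdr(f, P, x, y)}

def dominated(f, U, P, x):
    """D_P^{CDR-}(x): objects confidentially dominated by x."""
    return {y for y in U if cdr(f, P, y, x)}
```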

The decision attribute d partitions U into a finite number of classes. Let Cl = {Cl_t | t ∈ {1,2,…,n}}, where $Cl_n \succ Cl_{n-1} \succ \dots \succ Cl_1$. The upward and downward unions of classes are defined respectively as $Cl_t^{\geq} = \bigcup_{s \geq t} Cl_s$ and $Cl_t^{\leq} = \bigcup_{s \leq t} Cl_s$, where t, s ∈ {1,2,…,n}. $x \in Cl_t^{\geq}$ means that x belongs at least to class Cl_t, while $x \in Cl_t^{\leq}$ means that x belongs at most to class Cl_t.

Definition 5

Let IODS = (U, A, V, f), P ⊆ C, x ∈ U. The P lower and upper approximations of $Cl_t^{\geq}$ and $Cl_t^{\leq}$ under the CDR are defined respectively as follows.

$$\underline{P}(Cl_t^{\geq}) = \{x \in U \mid D_P^{CDR+}(x) \subseteq Cl_t^{\geq}\}, \quad \overline{P}(Cl_t^{\geq}) = \bigcup_{x \in Cl_t^{\geq}} D_P^{CDR+}(x) = \{x \in U \mid D_P^{CDR-}(x) \cap Cl_t^{\geq} \neq \emptyset\}$$
$$\underline{P}(Cl_t^{\leq}) = \{x \in U \mid D_P^{CDR-}(x) \subseteq Cl_t^{\leq}\}, \quad \overline{P}(Cl_t^{\leq}) = \bigcup_{x \in Cl_t^{\leq}} D_P^{CDR-}(x) = \{x \in U \mid D_P^{CDR+}(x) \cap Cl_t^{\leq} \neq \emptyset\}$$

The boundary regions are defined as follows.

$$Bn_P(Cl_t^{\geq}) = \overline{P}(Cl_t^{\geq}) - \underline{P}(Cl_t^{\geq}), \quad Bn_P(Cl_t^{\leq}) = \overline{P}(Cl_t^{\leq}) - \underline{P}(Cl_t^{\leq})$$

The objects in the boundary region are inconsistent: an object x confidentially dominates another object y, yet x is assigned to a worse class than y.

Definition 6

For any x ∈ U in an IODS = (U, A, V, f), δ(x) = ⟨l_P(x), u_P(x)⟩ is called the P generalized decision of the object x, where $l_P(x) = \min\{t \mid D_P^{CDR+}(x) \cap Cl_t \neq \emptyset\}$ and $u_P(x) = \max\{t \mid D_P^{CDR-}(x) \cap Cl_t \neq \emptyset\}$.

Property 1

Let IODS = (U, A, V, f) and P ⊆ C. For any x ∈ U, the following items hold.

  1. if $l_P(x) \geq t$, then $x \in \underline{P}(Cl_t^{\geq})$;

  2. if $u_P(x) \leq t$, then $x \in \underline{P}(Cl_t^{\leq})$;

  3. if $l_P(x) \leq t$, then $x \in \overline{P}(Cl_t^{\leq})$;

  4. if $u_P(x) \geq t$, then $x \in \overline{P}(Cl_t^{\geq})$.

Proof.

  1. U is partitioned by d into the decision classes Cl = {Cl_t | t ∈ {1,2,…,n}}. Since $l_P(x) = \min\{t \in \{1,2,\dots,n\} \mid D_P^{CDR+}(x) \cap Cl_t \neq \emptyset\}$, if $l_P(x) \geq t$ then $f(y, d) \geq t$ for every $y \in D_P^{CDR+}(x)$. Hence $D_P^{CDR+}(x) \subseteq Cl_t^{\geq}$, i.e., $x \in \underline{P}(Cl_t^{\geq})$.

  2. (2)-(4) are similar to (1).

According to Property 1, the approximations of $Cl_t^{\geq}$ and $Cl_t^{\leq}$ under the CDR can be characterized as follows.

$$\underline{P}(Cl_t^{\geq}) = \{x \in U \mid l_P(x) \geq t\}, \quad \overline{P}(Cl_t^{\geq}) = \{x \in U \mid u_P(x) \geq t\}$$
$$\underline{P}(Cl_t^{\leq}) = \{x \in U \mid u_P(x) \leq t\}, \quad \overline{P}(Cl_t^{\leq}) = \{x \in U \mid l_P(x) \leq t\}$$

Example 1

Let us consider an incomplete ordered decision system IODS = (U, A, V, f) presented in Table 1, where U = {x1, x2, ⋯, x10}, A = C ∪ {d}, C = {a1, a2, a3}, and V_d = {1, 2, 3, 4}.

a1 a2 a3 d
x1 4 4 3 4
x2 3 2 3 3
x3 4 * 2 3
x4 2 2 2 2
x5 2 1 2 2
x6 3 1 2 1
x7 * 2 2 2
x8 4 1 2 2
x9 3 * 2 3
x10 4 3 3 3
Table 1.

An IODS

From Table 1, the upward unions of decision classes are calculated as follows.

$$Cl_4^{\geq} = \{x_1\},\ Cl_3^{\geq} = \{x_1, x_2, x_3, x_9, x_{10}\},\ Cl_2^{\geq} = \{x_1, x_2, x_3, x_4, x_5, x_7, x_8, x_9, x_{10}\},\ Cl_1^{\geq} = U.$$

The C confidential dominating sets are calculated as follows.

$$D_C^{CDR+}(x_1) = \{x_1\},\ D_C^{CDR+}(x_2) = \{x_1, x_2, x_{10}\},\ D_C^{CDR+}(x_3) = \{x_1, x_3, x_8, x_{10}\},\ D_C^{CDR+}(x_4) = \{x_1, x_2, x_4, x_{10}\},$$
$$D_C^{CDR+}(x_5) = \{x_1, x_2, x_4, x_5, x_6, x_8, x_{10}\},\ D_C^{CDR+}(x_6) = \{x_1, x_2, x_6, x_8, x_{10}\},\ D_C^{CDR+}(x_7) = \{x_1, x_2, x_4, x_7, x_{10}\},$$
$$D_C^{CDR+}(x_8) = \{x_1, x_8, x_{10}\},\ D_C^{CDR+}(x_9) = \{x_1, x_2, x_3, x_6, x_8, x_9, x_{10}\},\ D_C^{CDR+}(x_{10}) = \{x_1, x_{10}\}.$$

Thus, the approximations of the upward unions $Cl_t^{\geq}$ with respect to the C confidential dominating sets are computed as follows.

$$\underline{C}(Cl_4^{\geq}) = \{x_1\},\ \underline{C}(Cl_3^{\geq}) = \{x_1, x_2, x_{10}\},\ \underline{C}(Cl_2^{\geq}) = \{x_1, x_2, x_3, x_4, x_7, x_8, x_{10}\},\ \underline{C}(Cl_1^{\geq}) = U.$$
$$\overline{C}(Cl_4^{\geq}) = \{x_1\},\ \overline{C}(Cl_3^{\geq}) = \{x_1, x_2, x_3, x_6, x_8, x_9, x_{10}\},\ \overline{C}(Cl_2^{\geq}) = U,\ \overline{C}(Cl_1^{\geq}) = U.$$

Analogously, the approximations of the downward unions $Cl_t^{\leq}$ with respect to the C confidential dominated sets can be calculated. For simplicity and clarity, the following examples discuss the CDRSA only with the increasing preference order.

3. Incremental updating approximations under the variation of attribute sets

Computing approximations is a critical step in applying the CDRSA to knowledge discovery. The traditional approach to computing approximations is time-consuming when attributes change because it recomputes everything from scratch. In this section, approaches to incrementally updating approximations under the variation of attributes are discussed.

3.1. Incremental updating approximations when new attributes added

Lemma 1

Let IODS = (U, A, V, f), P ⊆ C and Q ⊆ C − P. For any x ∈ U, we have

  1. $D_{P \cup Q}^{CDR+}(x) = D_P^{CDR+}(x) \cap D_Q^{CDR+}(x)$;

  2. $D_{P \cup Q}^{CDR-}(x) = D_P^{CDR-}(x) \cap D_Q^{CDR-}(x)$.

Proof.

  1. Assume $y \in D_P^{CDR+}(x) \cap D_Q^{CDR+}(x)$, namely $y \in D_P^{CDR+}(x)$ and $y \in D_Q^{CDR+}(x)$, which means (x, y) ∈ CDR(P) and (x, y) ∈ CDR(Q). It follows that (x, y) ∈ CDR(P ∪ Q), i.e., $y \in D_{P \cup Q}^{CDR+}(x)$. Conversely, (x, y) ∈ CDR(P ∪ Q) implies (x, y) ∈ CDR(P) and (x, y) ∈ CDR(Q), so the reverse inclusion also holds. Hence $D_{P \cup Q}^{CDR+}(x) = D_P^{CDR+}(x) \cap D_Q^{CDR+}(x)$.

  2. It is similar to (1).

By Lemma 1, the confidential dominating/dominated sets of x can be easily obtained when the attribute set Q is added to P.

Theorem 1

Let IODS = (U, A, V, f), P ⊆ C and Q ⊆ C − P. For each $Cl_t^{\geq}$, we have

  1. $\underline{P \cup Q}(Cl_t^{\geq}) = \underline{P}(Cl_t^{\geq}) \cup T(Cl_t^{\geq})$, where $T(Cl_t^{\geq}) = \{x \in Bn_P(Cl_t^{\geq}) \mid l_{P \cup Q}(x) \geq t\}$;

  2. $\overline{P \cup Q}(Cl_t^{\geq}) = \overline{P}(Cl_t^{\geq}) - K(Cl_t^{\geq})$, where $K(Cl_t^{\geq}) = \{x \in \overline{P}(Cl_t^{\geq}) \mid u_{P \cup Q}(x) < t\}$.

Proof.

  1. $\underline{P \cup Q}(Cl_t^{\geq}) = \{x \in U \mid D_{P \cup Q}^{CDR+}(x) \subseteq Cl_t^{\geq}\}$. Since $D_{P \cup Q}^{CDR+}(x) = D_P^{CDR+}(x) \cap D_Q^{CDR+}(x)$, we have $\underline{P \cup Q}(Cl_t^{\geq}) = \{x \in U \mid D_P^{CDR+}(x) \subseteq Cl_t^{\geq}\} \cup \{x \in U - \underline{P}(Cl_t^{\geq}) \mid D_{P \cup Q}^{CDR+}(x) \subseteq Cl_t^{\geq}\}$. Moreover, $U - \underline{P}(Cl_t^{\geq}) = Bn_P(Cl_t^{\geq}) \cup (U - \overline{P}(Cl_t^{\geq}))$, and no object outside $\overline{P}(Cl_t^{\geq})$ can satisfy the condition, so $\{x \in U - \underline{P}(Cl_t^{\geq}) \mid D_{P \cup Q}^{CDR+}(x) \subseteq Cl_t^{\geq}\} = \{x \in Bn_P(Cl_t^{\geq}) \mid l_{P \cup Q}(x) \geq t\}$. Let $T(Cl_t^{\geq}) = \{x \in Bn_P(Cl_t^{\geq}) \mid l_{P \cup Q}(x) \geq t\}$. Then $\underline{P \cup Q}(Cl_t^{\geq}) = \underline{P}(Cl_t^{\geq}) \cup T(Cl_t^{\geq})$ holds.

  2. Since $\overline{P \cup Q}(Cl_t^{\geq}) = \bigcup_{x \in Cl_t^{\geq}} D_{P \cup Q}^{CDR+}(x) = \bigcup_{x \in Cl_t^{\geq}} (D_P^{CDR+}(x) \cap D_Q^{CDR+}(x))$ and $D_P^{CDR+}(x) \cap D_Q^{CDR+}(x) = D_P^{CDR+}(x) - (U - D_{P \cup Q}^{CDR+}(x))$, according to Property 1(4) we have $\overline{P \cup Q}(Cl_t^{\geq}) = \overline{P}(Cl_t^{\geq}) - \{x \in \overline{P}(Cl_t^{\geq}) \mid u_{P \cup Q}(x) < t\}$. Let $K(Cl_t^{\geq}) = \{x \in \overline{P}(Cl_t^{\geq}) \mid u_{P \cup Q}(x) < t\}$. Then $\overline{P \cup Q}(Cl_t^{\geq}) = \overline{P}(Cl_t^{\geq}) - K(Cl_t^{\geq})$ holds.

Theorem 2

Let IODS = (U, A, V, f), P ⊆ C and Q ⊆ C − P. For each $Cl_t^{\leq}$, we have

  1. $\underline{P \cup Q}(Cl_t^{\leq}) = \underline{P}(Cl_t^{\leq}) \cup Y(Cl_t^{\leq})$, where $Y(Cl_t^{\leq}) = \{x \in Bn_P(Cl_t^{\leq}) \mid u_{P \cup Q}(x) \leq t\}$;

  2. $\overline{P \cup Q}(Cl_t^{\leq}) = \overline{P}(Cl_t^{\leq}) - Z(Cl_t^{\leq})$, where $Z(Cl_t^{\leq}) = \{x \in \overline{P}(Cl_t^{\leq}) \mid l_{P \cup Q}(x) > t\}$.

Proof.

It is similar to proof of Theorem 1.

Based on Theorems 1 and 2, an algorithm for updating the approximations of the CDRSA is proposed for the case where new attributes are added to the IODS (see Algorithm 1, IUA).

Example 2

(Continued from Example 1)

Let P = {a1, a3}, Q = {a2}, and C = P ∪ Q.

$$D_P^{CDR+}(x_1) = D_P^{CDR+}(x_{10}) = \{x_1, x_{10}\},\ D_P^{CDR+}(x_2) = \{x_1, x_2, x_{10}\},\ D_P^{CDR+}(x_3) = D_P^{CDR+}(x_8) = \{x_1, x_3, x_8, x_{10}\},$$
$$D_P^{CDR+}(x_4) = D_P^{CDR+}(x_5) = U - \{x_7\},\ D_P^{CDR+}(x_6) = D_P^{CDR+}(x_9) = \{x_1, x_2, x_3, x_6, x_8, x_9, x_{10}\},\ D_P^{CDR+}(x_7) = U.$$
$$D_Q^{CDR+}(x_1) = \{x_1\},\ D_Q^{CDR+}(x_2) = D_Q^{CDR+}(x_4) = D_Q^{CDR+}(x_7) = \{x_1, x_2, x_4, x_7, x_{10}\},$$
$$D_Q^{CDR+}(x_3) = D_Q^{CDR+}(x_9) = U,\ D_Q^{CDR+}(x_5) = D_Q^{CDR+}(x_6) = D_Q^{CDR+}(x_8) = U - \{x_3, x_9\}.$$

The approximations of $Cl_t^{\geq}$ with respect to the P confidential dominating sets are computed as follows.

$$\underline{P}(Cl_4^{\geq}) = \emptyset,\ \underline{P}(Cl_3^{\geq}) = \{x_1, x_2, x_{10}\},\ \underline{P}(Cl_2^{\geq}) = \{x_1, x_2, x_3, x_8, x_{10}\},\ \underline{P}(Cl_1^{\geq}) = U.$$
$$\overline{P}(Cl_4^{\geq}) = \{x_1, x_{10}\},\ \overline{P}(Cl_3^{\geq}) = \{x_1, x_2, x_3, x_6, x_8, x_9, x_{10}\},\ \overline{P}(Cl_2^{\geq}) = U,\ \overline{P}(Cl_1^{\geq}) = U.$$

According to Lemma 1, the C = P ∪ Q confidential dominating sets are easily calculated. Take x1 and x4 as examples.

$$D_C^{CDR+}(x_1) = D_{P \cup Q}^{CDR+}(x_1) = D_P^{CDR+}(x_1) \cap D_Q^{CDR+}(x_1) = \{x_1\}.$$
$$D_{P \cup Q}^{CDR+}(x_4) = D_P^{CDR+}(x_4) \cap D_Q^{CDR+}(x_4) = \{x_1, x_2, x_4, x_{10}\}.$$

Based on Theorem 1, the approximations of $Cl_t^{\geq}$ with respect to the P ∪ Q confidential dominating sets are computed as follows.

$$T(Cl_4^{\geq}) = \{x \in Bn_P(Cl_4^{\geq}) \mid l_{P \cup Q}(x) \geq 4\} = \{x_1\},\ T(Cl_3^{\geq}) = \emptyset,\ T(Cl_2^{\geq}) = \{x_4, x_7\},\ T(Cl_1^{\geq}) = \emptyset.$$
$$K(Cl_4^{\geq}) = \{x \in \overline{P}(Cl_4^{\geq}) \mid u_{P \cup Q}(x) < 4\} = \{x_{10}\},\ K(Cl_3^{\geq}) = K(Cl_2^{\geq}) = K(Cl_1^{\geq}) = \emptyset.$$
$$\underline{P \cup Q}(Cl_4^{\geq}) = \underline{P}(Cl_4^{\geq}) \cup T(Cl_4^{\geq}) = \{x_1\},\ \underline{P \cup Q}(Cl_3^{\geq}) = \{x_1, x_2, x_{10}\},\ \underline{P \cup Q}(Cl_2^{\geq}) = \{x_1, x_2, x_3, x_4, x_7, x_8, x_{10}\},\ \underline{P \cup Q}(Cl_1^{\geq}) = U.$$
$$\overline{P \cup Q}(Cl_4^{\geq}) = \overline{P}(Cl_4^{\geq}) - K(Cl_4^{\geq}) = \{x_1\},\ \overline{P \cup Q}(Cl_3^{\geq}) = \{x_1, x_2, x_3, x_6, x_8, x_9, x_{10}\},\ \overline{P \cup Q}(Cl_2^{\geq}) = \overline{P \cup Q}(Cl_1^{\geq}) = U.$$

3.2. Incremental updating approximations when attributes deleted

Lemma 2

Let IODS = (U, A, V, f), P ⊆ C and Q ⊂ P. For any x ∈ U, we have

  1. $D_{P - Q}^{CDR+}(x) = D_P^{CDR+}(x) \cup A_Q^{+}(x)$, where $A_Q^{+}(x) = \{y \in U - D_P^{CDR+}(x) \mid y \in D_{P - Q}^{CDR+}(x)\}$;

  2. $D_{P - Q}^{CDR-}(x) = D_P^{CDR-}(x) \cup A_Q^{-}(x)$, where $A_Q^{-}(x) = \{y \in U - D_P^{CDR-}(x) \mid y \in D_{P - Q}^{CDR-}(x)\}$.

Proof.

  1. Proof by contradiction. Assume there exists $y \in D_{P-Q}^{CDR+}(x)$ with $y \in U - D_P^{CDR+}(x)$ and $y \notin A_Q^{+}(x)$. Since $A_Q^{+}(x) = (U - D_P^{CDR+}(x)) \cap D_{P-Q}^{CDR+}(x)$, $y \notin A_Q^{+}(x)$ implies $y \in D_P^{CDR+}(x)$ or $y \notin D_{P-Q}^{CDR+}(x)$; either case contradicts the assumption. Hence $D_{P-Q}^{CDR+}(x) \subseteq D_P^{CDR+}(x) \cup A_Q^{+}(x)$. The reverse inclusion follows from $D_P^{CDR+}(x) \subseteq D_{P-Q}^{CDR+}(x)$ and $A_Q^{+}(x) \subseteq D_{P-Q}^{CDR+}(x)$, so $D_{P-Q}^{CDR+}(x) = D_P^{CDR+}(x) \cup A_Q^{+}(x)$ holds.

  2. It is similar to (1).

Theorem 3

Let IODS = (U, A, V, f), P ⊆ C and Q ⊂ P. For each $Cl_t^{\geq}$, we have

  1. $\underline{P - Q}(Cl_t^{\geq}) = \underline{P}(Cl_t^{\geq}) - W(Cl_t^{\geq})$, where $W(Cl_t^{\geq}) = \{x \in \underline{P}(Cl_t^{\geq}) \mid \min\{f(y, d) \mid y \in A_Q^{+}(x)\} < t\}$;

  2. $\overline{P - Q}(Cl_t^{\geq}) = \overline{P}(Cl_t^{\geq}) \cup R(Cl_t^{\geq})$, where $R(Cl_t^{\geq}) = \bigcup\{A_Q^{+}(x) \mid x \in \overline{P}(Cl_t^{\geq})\}$.

Proof.

  1. $\underline{P - Q}(Cl_t^{\geq}) = \{x \in U \mid D_{P - Q}^{CDR+}(x) \subseteq Cl_t^{\geq}\} = \underline{P}(Cl_t^{\geq}) - \{x \in \underline{P}(Cl_t^{\geq}) \mid D_{P - Q}^{CDR+}(x) \not\subseteq Cl_t^{\geq}\}$. Since $D_{P - Q}^{CDR+}(x) = D_P^{CDR+}(x) \cup A_Q^{+}(x)$, for $x \in \underline{P}(Cl_t^{\geq})$ the condition fails exactly when some $y \in A_Q^{+}(x)$ has $f(y, d) < t$. Let $W(Cl_t^{\geq}) = \{x \in \underline{P}(Cl_t^{\geq}) \mid \min\{f(y, d) \mid y \in A_Q^{+}(x)\} < t\}$. Then $\underline{P - Q}(Cl_t^{\geq}) = \underline{P}(Cl_t^{\geq}) - W(Cl_t^{\geq})$ holds.

  2. $\overline{P - Q}(Cl_t^{\geq}) = \bigcup_{x \in Cl_t^{\geq}} D_{P - Q}^{CDR+}(x) = \overline{P}(Cl_t^{\geq}) \cup \bigcup\{A_Q^{+}(x) \mid x \in \overline{P}(Cl_t^{\geq})\}$. Let $R(Cl_t^{\geq}) = \bigcup\{A_Q^{+}(x) \mid x \in \overline{P}(Cl_t^{\geq})\}$. Then $\overline{P - Q}(Cl_t^{\geq}) = \overline{P}(Cl_t^{\geq}) \cup R(Cl_t^{\geq})$ holds.

Require: P confidential dominating/dominated sets; $\underline{P}(Cl_t^{\geq})$, $\overline{P}(Cl_t^{\geq})$, $\underline{P}(Cl_t^{\leq})$, $\overline{P}(Cl_t^{\leq})$
Ensure: $\underline{P \cup Q}(Cl_t^{\geq})$, $\overline{P \cup Q}(Cl_t^{\geq})$, $\underline{P \cup Q}(Cl_t^{\leq})$, $\overline{P \cup Q}(Cl_t^{\leq})$
1: Compute the Q confidential dominating/dominated sets
2: for i = 1 : |U| do
3:   $D_{P \cup Q}^{CDR+}(x_i) = D_P^{CDR+}(x_i) \cap D_Q^{CDR+}(x_i)$
4:   $D_{P \cup Q}^{CDR-}(x_i) = D_P^{CDR-}(x_i) \cap D_Q^{CDR-}(x_i)$
5: end for
6: for t = 1 : n do
7:   Compute $T(Cl_t^{\geq})$ and $K(Cl_t^{\geq})$
8:   Compute $Y(Cl_t^{\leq})$ and $Z(Cl_t^{\leq})$
9:   $\underline{P \cup Q}(Cl_t^{\geq}) = \underline{P}(Cl_t^{\geq}) \cup T(Cl_t^{\geq})$ and $\underline{P \cup Q}(Cl_t^{\leq}) = \underline{P}(Cl_t^{\leq}) \cup Y(Cl_t^{\leq})$
10:  $\overline{P \cup Q}(Cl_t^{\geq}) = \overline{P}(Cl_t^{\geq}) - K(Cl_t^{\geq})$ and $\overline{P \cup Q}(Cl_t^{\leq}) = \overline{P}(Cl_t^{\leq}) - Z(Cl_t^{\leq})$
11: end for

Algorithm 1. (IUA) Incremental updating of approximations when new attributes are added
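For illustration, the following Python sketch shows one possible class-level realization of the IUA step for an upward union, following Lemma 1 and Theorem 1 (the function name, argument layout and dictionary-of-sets inputs are assumptions of this sketch, not the paper's Matlab implementation); the downward unions are handled symmetrically with the sets Y and Z of Theorem 2.

```python
def iua(f, U, d, dom_plus_P, dom_minus_P, dom_plus_Q, dom_minus_Q,
        lower_P, upper_P, t):
    """Update the approximations of Cl_t>= when attribute set Q is added to P.
    dom_*_P / dom_*_Q map every object to its confidential dominating (+) or
    dominated (-) set; lower_P / upper_P are the approximations known for P."""
    dom_plus = {x: dom_plus_P[x] & dom_plus_Q[x] for x in U}      # Lemma 1(1)
    dom_minus = {x: dom_minus_P[x] & dom_minus_Q[x] for x in U}   # Lemma 1(2)
    boundary = upper_P - lower_P                                  # Bn_P(Cl_t>=)
    # T(Cl_t>=): boundary objects whose updated generalized decision reaches t
    T = {x for x in boundary if min(f[y][d] for y in dom_plus[x]) >= t}
    # K(Cl_t>=): objects that drop out of the old upper approximation
    K = {x for x in upper_P if max(f[y][d] for y in dom_minus[x]) < t}
    return lower_P | T, upper_P - K                               # Theorem 1
```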

Theorem 4

Let IODS = (U, A, V, f), P ⊆ C and Q ⊂ P. For each $Cl_t^{\leq}$, we have

  1. $\underline{P - Q}(Cl_t^{\leq}) = \underline{P}(Cl_t^{\leq}) - S(Cl_t^{\leq})$, where $S(Cl_t^{\leq}) = \{x \in \underline{P}(Cl_t^{\leq}) \mid \max\{f(y, d) \mid y \in A_Q^{-}(x)\} > t\}$;

  2. $\overline{P - Q}(Cl_t^{\leq}) = \overline{P}(Cl_t^{\leq}) \cup V(Cl_t^{\leq})$, where $V(Cl_t^{\leq}) = \bigcup\{A_Q^{-}(x) \mid x \in \overline{P}(Cl_t^{\leq})\}$.

Proof.

It is similar to proof of Theorem 3.

Based on Theorems 3 and 4, an algorithm is developed for updating the approximations of the CDRSA when some attributes are deleted from the IODS (see Algorithm 2, IUD).

Example 3

(Continued from Example 1)

Let Q = {a2} (as in Example 2) and P = C − Q = {a1, a3}.

According to Lemma 2, $A_Q^{+}(x)$ is easily calculated for every x as follows.

$$A_Q^{+}(x_1) = \{x_{10}\},\ A_Q^{+}(x_2) = A_Q^{+}(x_3) = A_Q^{+}(x_9) = A_Q^{+}(x_{10}) = \emptyset,\ A_Q^{+}(x_4) = \{x_3, x_5, x_6, x_8, x_9, x_{10}\},$$
$$A_Q^{+}(x_5) = A_Q^{+}(x_6) = \{x_3, x_9\},\ A_Q^{+}(x_7) = \{x_3, x_5, x_6, x_8, x_9\},\ A_Q^{+}(x_8) = \{x_3\}.$$

The approximations of $Cl_t^{\geq}$ with respect to the P = C − Q confidential dominating sets are computed as follows.

$$W(Cl_4^{\geq}) = \{x \in \underline{C}(Cl_4^{\geq}) \mid \min\{f(y, d) \mid y \in A_Q^{+}(x)\} < 4\} = \{x_1\},\ W(Cl_3^{\geq}) = \emptyset,\ W(Cl_2^{\geq}) = \{x_4, x_7\},\ W(Cl_1^{\geq}) = \emptyset.$$
$$R(Cl_4^{\geq}) = \bigcup\{A_Q^{+}(x) \mid x \in \overline{C}(Cl_4^{\geq})\} = \{x_{10}\},\ R(Cl_3^{\geq}) = \{x_3, x_9, x_{10}\},\ R(Cl_2^{\geq}) = R(Cl_1^{\geq}) = \{x_3, x_5, x_6, x_8, x_9, x_{10}\}.$$
$$\underline{C - Q}(Cl_4^{\geq}) = \underline{C}(Cl_4^{\geq}) - W(Cl_4^{\geq}) = \emptyset,\ \underline{C - Q}(Cl_3^{\geq}) = \{x_1, x_2, x_{10}\},\ \underline{C - Q}(Cl_2^{\geq}) = \{x_1, x_2, x_3, x_8, x_{10}\},\ \underline{C - Q}(Cl_1^{\geq}) = U.$$
$$\overline{C - Q}(Cl_4^{\geq}) = \overline{C}(Cl_4^{\geq}) \cup R(Cl_4^{\geq}) = \{x_1, x_{10}\},\ \overline{C - Q}(Cl_3^{\geq}) = \{x_1, x_2, x_3, x_6, x_8, x_9, x_{10}\},\ \overline{C - Q}(Cl_2^{\geq}) = \overline{C - Q}(Cl_1^{\geq}) = U.$$

4. Incremental updating approximations when subsets of objects are merged

In this section, incrementally updating the approximations of the CDRSA is discussed for the case where subsets of objects are merged into the IODS.

Assume the universe of the IODS = (U, A, V, f) is composed of m subsets of objects, i.e., $U = \bigcup_{i=1}^{m} U_i$, i = 1, 2, ⋯, m.

Definition 7

Let $U_k \subseteq U$, k = 1, 2, ⋯, m. $Cl_t^{\geq k}$ and $Cl_t^{\leq k}$ in the subset $U_k$ are defined as follows.

$$Cl_t^{\geq k} = \{x \in U_k \mid f(x, d) \geq t\}, \quad Cl_t^{\leq k} = \{x \in U_k \mid f(x, d) \leq t\}$$

Require: P confidential dominating/dominated sets; $\underline{P}(Cl_t^{\geq})$, $\overline{P}(Cl_t^{\geq})$, $\underline{P}(Cl_t^{\leq})$, $\overline{P}(Cl_t^{\leq})$
Ensure: $\underline{P - Q}(Cl_t^{\geq})$, $\overline{P - Q}(Cl_t^{\geq})$, $\underline{P - Q}(Cl_t^{\leq})$, $\overline{P - Q}(Cl_t^{\leq})$
1: for i = 1 : |U| do
2:   Compute $A_Q^{+}(x_i)$ and $A_Q^{-}(x_i)$
3:   $D_{P - Q}^{CDR+}(x_i) = D_P^{CDR+}(x_i) \cup A_Q^{+}(x_i)$
4:   $D_{P - Q}^{CDR-}(x_i) = D_P^{CDR-}(x_i) \cup A_Q^{-}(x_i)$
5: end for
6: for t = 1 : n do
7:   Compute $W(Cl_t^{\geq})$, $R(Cl_t^{\geq})$, $S(Cl_t^{\leq})$ and $V(Cl_t^{\leq})$
8:   $\underline{P - Q}(Cl_t^{\geq}) = \underline{P}(Cl_t^{\geq}) - W(Cl_t^{\geq})$ and $\underline{P - Q}(Cl_t^{\leq}) = \underline{P}(Cl_t^{\leq}) - S(Cl_t^{\leq})$
9:   $\overline{P - Q}(Cl_t^{\geq}) = \overline{P}(Cl_t^{\geq}) \cup R(Cl_t^{\geq})$ and $\overline{P - Q}(Cl_t^{\leq}) = \overline{P}(Cl_t^{\leq}) \cup V(Cl_t^{\leq})$
10: end for

Algorithm 2. (IUD) Incremental updating of approximations when some attributes are deleted
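Analogously, a hedged Python sketch of the IUD step for an upward union (following Lemma 2 and Theorem 3; the names and input layout are assumptions of this sketch, not the paper's implementation) could look as follows.

```python
def iud(f, U, d, dom_plus_P, dom_plus_PQ, lower_P, upper_P, t):
    """Update the approximations of Cl_t>= when attribute set Q is removed from P.
    dom_plus_P[x] holds D_P^{CDR+}(x); dom_plus_PQ[x] holds D_{P-Q}^{CDR+}(x),
    obtained for the reduced criteria set."""
    # A_Q^+(x): objects that start to dominate x once Q is dropped (Lemma 2)
    A = {x: dom_plus_PQ[x] - dom_plus_P[x] for x in U}
    # W(Cl_t>=): lower-approximation objects whose enlarged dominating set
    # now contains an object from a class below t
    W = {x for x in lower_P if any(f[y][d] < t for y in A[x])}
    # R(Cl_t>=): newly dominating objects pulled into the upper approximation
    R = set().union(*(A[x] for x in upper_P))
    return lower_P - W, upper_P | R                     # Theorem 3
```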

Definition 8

Let $U_i \subseteq U$, i = 1, 2, ⋯, m. The P confidential dominating/dominated sets of an object x related to $U_i$ are defined as follows.

$$D_P^{CDR+}(x)_i = \{y \in U_i \mid y D_P^{CDR} x\}, \quad D_P^{CDR-}(x)_i = \{y \in U_i \mid x D_P^{CDR} y\}$$

Definition 9

Let $U_i, U_k \subseteq U$, i, k = 1, 2, ⋯, m. The lower and upper approximations of $Cl_t^{\geq k}$ related to $U_i$ are defined as follows.

$$\underline{P}(Cl_t^{\geq k})_i = \{x \in U_i \mid D_P^{CDR+}(x)_k \subseteq Cl_t^{\geq k}\}, \quad \overline{P}(Cl_t^{\geq k})_i = \bigcup_{x \in Cl_t^{\geq k}} D_P^{CDR+}(x)_i = \{y \in U_i \mid D_P^{CDR-}(y)_k \cap Cl_t^{\geq k} \neq \emptyset\}$$

Similarly, the lower and upper approximations of $Cl_t^{\leq k}$ related to $U_i$ are defined as follows.

$$\underline{P}(Cl_t^{\leq k})_i = \{x \in U_i \mid D_P^{CDR-}(x)_k \subseteq Cl_t^{\leq k}\}, \quad \overline{P}(Cl_t^{\leq k})_i = \bigcup_{x \in Cl_t^{\leq k}} D_P^{CDR-}(x)_i = \{y \in U_i \mid D_P^{CDR+}(y)_k \cap Cl_t^{\leq k} \neq \emptyset\}$$

Theorem 5

Let i = 1, 2, ⋯, m and t = 1, 2, ⋯, n. We have

  1. $Cl_t^{\geq} = \bigcup_{i=1}^{m} Cl_t^{\geq i}$;

  2. $Cl_t^{\leq} = \bigcup_{i=1}^{m} Cl_t^{\leq i}$.

Proof.

  1. Since $Cl_t^{\geq} = \{x \in U \mid f(x, d) \geq t\} = \{x \in \bigcup_{i=1}^{m} U_i \mid f(x, d) \geq t\} = \bigcup_{i=1}^{m} \{x \in U_i \mid f(x, d) \geq t\} = \bigcup_{i=1}^{m} Cl_t^{\geq i}$, item (1) holds.

  2. It is similar to the proof of (1).

Theorem 6

Let x ∈ U and i = 1, 2, ⋯, m. We have

  1. $D_P^{CDR+}(x) = \bigcup_{i=1}^{m} D_P^{CDR+}(x)_i$;

  2. $D_P^{CDR-}(x) = \bigcup_{i=1}^{m} D_P^{CDR-}(x)_i$.

Proof.

  1. Since $D_P^{CDR+}(x) = \{y \in U \mid y D_P^{CDR} x\} = \{y \in \bigcup_{i=1}^{m} U_i \mid y D_P^{CDR} x\} = \bigcup_{i=1}^{m} \{y \in U_i \mid y D_P^{CDR} x\} = \bigcup_{i=1}^{m} D_P^{CDR+}(x)_i$, item (1) holds.

  2. It is similar to the proof of (1).

Theorem 7

Given t = 1, 2, ⋯, n and i = 1, 2, ⋯, m, the following items hold.

  1. $\underline{P}(Cl_t^{\geq}) = \bigcup_{i=1}^{m} \left( \bigcap_{k=1}^{m} \underline{P}(Cl_t^{\geq k})_i \right)$;

  2. $\overline{P}(Cl_t^{\geq}) = \bigcup_{i=1}^{m} \bigcup_{k=1}^{m} \overline{P}(Cl_t^{\geq k})_i$.

Proof.

  1. Since $\underline{P}(Cl_t^{\geq}) = \{x \in U \mid D_P^{CDR+}(x) \subseteq Cl_t^{\geq}\} = \{x \in \bigcup_{i=1}^{m} U_i \mid \bigcup_{k=1}^{m} D_P^{CDR+}(x)_k \subseteq \bigcup_{k=1}^{m} Cl_t^{\geq k}\} = \bigcup_{i=1}^{m} \{x \in U_i \mid D_P^{CDR+}(x)_1 \subseteq Cl_t^{\geq 1} \wedge \dots \wedge D_P^{CDR+}(x)_m \subseteq Cl_t^{\geq m}\} = \bigcup_{i=1}^{m} \left( \bigcap_{k=1}^{m} \{x \in U_i \mid D_P^{CDR+}(x)_k \subseteq Cl_t^{\geq k}\} \right) = \bigcup_{i=1}^{m} \left( \bigcap_{k=1}^{m} \underline{P}(Cl_t^{\geq k})_i \right)$, item (1) holds.

  2. Since $\overline{P}(Cl_t^{\geq}) = \bigcup_{x \in Cl_t^{\geq}} D_P^{CDR+}(x) = \bigcup_{k=1}^{m} \bigcup_{x \in Cl_t^{\geq k}} \bigcup_{i=1}^{m} D_P^{CDR+}(x)_i = \bigcup_{i=1}^{m} \bigcup_{k=1}^{m} \left( \bigcup_{x \in Cl_t^{\geq k}} D_P^{CDR+}(x)_i \right) = \bigcup_{i=1}^{m} \bigcup_{k=1}^{m} \overline{P}(Cl_t^{\geq k})_i$, item (2) holds.

Theorem 8

Given t = 1, 2, ⋯, n and i = 1, 2, ⋯, m, the following items hold.

  1. $\underline{P}(Cl_t^{\leq}) = \bigcup_{i=1}^{m} \left( \bigcap_{k=1}^{m} \underline{P}(Cl_t^{\leq k})_i \right)$;

  2. $\overline{P}(Cl_t^{\leq}) = \bigcup_{i=1}^{m} \bigcup_{k=1}^{m} \overline{P}(Cl_t^{\leq k})_i$.

Proof.

It is similar to proof of Theorem 7.

Based on Theorems 5-8, an algorithm is designed for updating the approximations of the CDRSA when subsets of objects are merged into the IODS (see Algorithm 3, IUM).

Example 4

(Continued from Example 1)

Let U = U_1 ∪ U_2, where U_1 = {x1, x2, ⋯, x7} and U_2 = {x8, x9, x10}.

According to Definition 7, with k = 1, 2, the unions $Cl_t^{\geq k}$ in the subsets $U_k$ are calculated as follows.

$$Cl_4^{\geq 1} = \{x_1\},\ Cl_3^{\geq 1} = \{x_1, x_2, x_3\},\ Cl_2^{\geq 1} = \{x_1, x_2, x_3, x_4, x_5, x_7\},\ Cl_1^{\geq 1} = U_1.$$
$$Cl_3^{\geq 2} = \{x_9, x_{10}\},\ Cl_2^{\geq 2} = \{x_8, x_9, x_{10}\},\ Cl_1^{\geq 2} = U_2.$$

The C confidential dominating sets of each object x related to $U_i$ (i = 1, 2) are calculated as follows.

$$D_C^{CDR+}(x_1)_1 = \{x_1\},\ D_C^{CDR+}(x_2)_1 = \{x_1, x_2\},\ D_C^{CDR+}(x_3)_1 = \{x_1, x_3\},\ D_C^{CDR+}(x_4)_1 = \{x_1, x_2, x_4\},$$
$$D_C^{CDR+}(x_5)_1 = \{x_1, x_2, x_4, x_5, x_6\},\ D_C^{CDR+}(x_6)_1 = \{x_1, x_2, x_6\},\ D_C^{CDR+}(x_7)_1 = \{x_1, x_2, x_4, x_7\}.$$
$$D_C^{CDR+}(x_1)_2 = \emptyset,\ D_C^{CDR+}(x_2)_2 = \{x_{10}\},\ D_C^{CDR+}(x_3)_2 = \{x_8, x_{10}\},\ D_C^{CDR+}(x_4)_2 = \{x_{10}\},$$
$$D_C^{CDR+}(x_5)_2 = \{x_8, x_{10}\},\ D_C^{CDR+}(x_6)_2 = \{x_8, x_{10}\},\ D_C^{CDR+}(x_7)_2 = \{x_{10}\}.$$
$$D_C^{CDR+}(x_8)_1 = \{x_1\},\ D_C^{CDR+}(x_9)_1 = \{x_1, x_2, x_3, x_6\},\ D_C^{CDR+}(x_{10})_1 = \{x_1\}.$$
$$D_C^{CDR+}(x_8)_2 = \{x_8, x_{10}\},\ D_C^{CDR+}(x_9)_2 = \{x_8, x_9, x_{10}\},\ D_C^{CDR+}(x_{10})_2 = \{x_{10}\}.$$

The lower approximations of $Cl_t^{\geq k}$ related to $U_i$, where i, k = 1, 2, are computed as follows.

$$\underline{C}(Cl_4^{\geq 1})_1 = \{x_1\},\ \underline{C}(Cl_4^{\geq 2})_1 = \{x_1\},\ \underline{C}(Cl_3^{\geq 1})_1 = \{x_1, x_2, x_3\},\ \underline{C}(Cl_3^{\geq 2})_1 = \{x_1, x_2, x_4, x_7\},$$
$$\underline{C}(Cl_2^{\geq 1})_1 = \{x_1, x_2, x_3, x_4, x_7\},\ \underline{C}(Cl_2^{\geq 2})_1 = U_1.$$
$$\underline{C}(Cl_4^{\geq 1})_2 = \emptyset,\ \underline{C}(Cl_4^{\geq 2})_2 = \{x_8, x_{10}\},\ \underline{C}(Cl_3^{\geq 1})_2 = \{x_8, x_{10}\},\ \underline{C}(Cl_3^{\geq 2})_2 = \{x_{10}\},$$
$$\underline{C}(Cl_2^{\geq 1})_2 = \{x_8, x_{10}\},\ \underline{C}(Cl_2^{\geq 2})_2 = U_2.$$

The upper approximations of $Cl_t^{\geq k}$ related to $U_i$, where i, k = 1, 2, are computed as follows.

$$\overline{C}(Cl_4^{\geq 1})_1 = \{x_1\},\ \overline{C}(Cl_3^{\geq 1})_1 = \{x_1, x_2, x_3\},\ \overline{C}(Cl_2^{\geq 1})_1 = \overline{C}(Cl_1^{\geq 1})_1 = U_1.$$
$$\overline{C}(Cl_4^{\geq 1})_2 = \emptyset,\ \overline{C}(Cl_3^{\geq 1})_2 = \overline{C}(Cl_2^{\geq 1})_2 = \overline{C}(Cl_1^{\geq 1})_2 = \{x_1, x_2, x_3, x_6\}.$$
$$\overline{C}(Cl_3^{\geq 2})_1 = \overline{C}(Cl_2^{\geq 2})_1 = \{x_8, x_9, x_{10}\}.$$
$$\overline{C}(Cl_3^{\geq 2})_2 = \overline{C}(Cl_2^{\geq 2})_2 = \{x_8, x_9, x_{10}\}.$$

The lower approximations of $Cl_t^{\geq}$ after $U_1$ and $U_2$ are merged are updated as follows.

$$\underline{C}(Cl_4^{\geq}) = \bigcup_{i=1}^{2} \left( \bigcap_{k=1}^{2} \underline{C}(Cl_4^{\geq k})_i \right) = \{x_1\},\ \underline{C}(Cl_3^{\geq}) = \{x_1, x_2, x_{10}\},\ \underline{C}(Cl_2^{\geq}) = \{x_1, x_2, x_3, x_4, x_7, x_8, x_{10}\},\ \underline{C}(Cl_1^{\geq}) = U_1 \cup U_2.$$

The upper approximations of $Cl_t^{\geq}$ after $U_1$ and $U_2$ are merged are updated as follows.

$$\overline{C}(Cl_4^{\geq}) = \bigcup_{i=1}^{2} \bigcup_{k=1}^{2} \overline{C}(Cl_4^{\geq k})_i = \{x_1\},\ \overline{C}(Cl_3^{\geq}) = \{x_1, x_2, x_3, x_6, x_8, x_9, x_{10}\},\ \overline{C}(Cl_2^{\geq}) = U,\ \overline{C}(Cl_1^{\geq}) = U.$$

5. Experiments

Several experiments are conducted to evaluate the performance of the incremental approaches. Six data sets, described in Table 2, are selected from the UCI Machine Learning Repository50 as benchmarks for the performance test. The incremental and non-incremental algorithms are implemented in Matlab R2014a on a MacBook Pro (2015) running OS X El Capitan with a 2.7 GHz Intel Core i5 CPU and 16 GB of memory.

There are two classes of experiments: experiments on updating the approximations of the CDRSA under the variation of attributes, and experiments on updating approximations when two subsets of objects are combined.

ID Data set |U| |C| class
1 heart-disease(cleveland) 303 13 5
2 ionosphere 351 34 2
3 arrhythmia 452 279 12
4 winequality(white) 4898 11 7
5 statlog (Landsat Satellite) 6435 36 6
6 mushroom 7910 22 7
Table 2.

Data sets description

Require: approximations of $Cl_t^{\geq k}$ and $Cl_t^{\leq k}$ related to every $U_i$
Ensure: $\underline{P}(Cl_t^{\geq})$, $\overline{P}(Cl_t^{\geq})$, $\underline{P}(Cl_t^{\leq})$, $\overline{P}(Cl_t^{\leq})$
1: for each $x \in U_i$ do
2:   Compute $D_P^{CDR+}(x)_i$ and $D_P^{CDR-}(x)_i$
3: end for
4: for i = 1 : m do
5:   for k = 1 : m do
6:     Compute $\underline{P}(Cl_t^{\geq k})_i$ and $\overline{P}(Cl_t^{\geq k})_i$
7:     Compute $\underline{P}(Cl_t^{\leq k})_i$ and $\overline{P}(Cl_t^{\leq k})_i$
8:   end for
9: end for
10: $\underline{P}(Cl_t^{\geq}) = \bigcup_{i=1}^{m} (\bigcap_{k=1}^{m} \underline{P}(Cl_t^{\geq k})_i)$ and $\underline{P}(Cl_t^{\leq}) = \bigcup_{i=1}^{m} (\bigcap_{k=1}^{m} \underline{P}(Cl_t^{\leq k})_i)$
11: $\overline{P}(Cl_t^{\geq}) = \bigcup_{i=1}^{m} \bigcup_{k=1}^{m} \overline{P}(Cl_t^{\geq k})_i$ and $\overline{P}(Cl_t^{\leq}) = \bigcup_{i=1}^{m} \bigcup_{k=1}^{m} \overline{P}(Cl_t^{\leq k})_i$

Algorithm 3. (IUM) Incremental updating of approximations when subsets of objects are merged
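The merge step of Theorems 5-7 (upward union only) can be sketched in Python as follows; `subsets` is the list [U_1, ..., U_m] and `dom_plus_block[i][x]` is assumed to hold the restricted dominating set D_P^{CDR+}(x)_i for every object x of the merged universe (the names are illustrative, not the paper's implementation).

```python
def merge_upward_approximations(f, subsets, d, t, dom_plus_block):
    """Combine per-subset information into the approximations of Cl_t>= over
    the merged universe (Definition 7 and Theorem 7)."""
    m = len(subsets)
    # Cl_t^{>=k}: the part of the upward union falling into U_k (Definition 7)
    cl_block = [{x for x in subsets[k] if f[x][d] >= t} for k in range(m)]
    lower, upper = set(), set()
    for i in range(m):
        # Theorem 7(1): x in U_i belongs to the lower approximation iff every
        # restricted dominating set stays inside the corresponding class block
        lower |= {x for x in subsets[i]
                  if all(dom_plus_block[k][x] <= cl_block[k] for k in range(m))}
        # Theorem 7(2): union of the restricted dominating sets over Cl_t^{>=k}
        for k in range(m):
            for x in cl_block[k]:
                upper |= dom_plus_block[i][x]
    return lower, upper
```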

5.1. The variation of attributes

For this case, incremental updating of approximations is compared with non-incremental computation of approximations to evaluate the performance of updating the approximations of the CDRSA under the variation of the attribute set. Two groups of experiments are carried out on each data set, in which some attributes are deleted and then added back correspondingly.

In the first group of experiments, some attributes are randomly selected from the condition attribute set C to constitute a deleted attribute set Q. The size of Q ranges from 10% to 50% of |C|, with a step of 10%. The remaining attributes form P = C − Q. The computational time of the incremental approach IUD is recorded and compared with that of the non-incremental method of computing approximations with respect to P when the attribute set Q is deleted from C.

In the second group of experiments, the deleted attribute set Q is added back into P, so that C = P ∪ Q. Similarly, the computational time of the incremental approach IUA is compared with that of the non-incremental method when the attribute set Q is added into P.
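The timing comparison described here could be prototyped as in the sketch below, which reuses the earlier illustrative helpers (`dominating`, `dominated`, `approximations`, `iua`); the paper's actual experiments were run in Matlab, so this harness is only an assumption-laden illustration of the protocol.

```python
import random
import time

def compare_once(f, U, C, d, n_classes, ratio):
    """Time non-incremental recomputation on P u Q against the incremental IUA
    update, given that the P-based results are already available."""
    Q = random.sample(C, max(1, int(ratio * len(C))))     # attributes added back
    P = [a for a in C if a not in Q]

    # results assumed to be available before Q arrives (computed outside the timers)
    dom_plus_P = {x: dominating(f, U, P, x) for x in U}
    dom_minus_P = {x: dominated(f, U, P, x) for x in U}
    approx_P = {t: approximations(f, U, P, t, d) for t in range(1, n_classes + 1)}

    start = time.perf_counter()                           # non-incremental: from scratch
    for t in range(1, n_classes + 1):
        approximations(f, U, P + Q, t, d)
    t_non_incr = time.perf_counter() - start

    start = time.perf_counter()                           # incremental: IUA
    dom_plus_Q = {x: dominating(f, U, Q, x) for x in U}
    dom_minus_Q = {x: dominated(f, U, Q, x) for x in U}
    for t in range(1, n_classes + 1):
        lower_P, upper_P = approx_P[t][0], approx_P[t][1]
        iua(f, U, d, dom_plus_P, dom_minus_P, dom_plus_Q, dom_minus_Q,
            lower_P, upper_P, t)
    t_incr = time.perf_counter() - start
    return t_non_incr, t_incr
```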

The experimental results on the six data sets are shown in Figure 1. In each sub-figure, the x-coordinate is the percentage of the size of Q in C, from 10% to 50%, and the y-coordinate is the computational time. The curve 'non-incr-add' denotes the non-incremental method of computing approximations with respect to C, while 'non-incr-del' denotes the non-incremental computation of approximations with respect to P.

Fig. 1.

A comparison of incremental and non-incremental of computing approximations under the variation of attribute set

Figure 1 shows that the incremental approaches IUA and IUD are always faster than the non-incremental approaches 'non-incr-add' and 'non-incr-del', respectively. Furthermore, IUA is always more efficient than IUD because the calculation of the confidential dominating/dominated sets in IUA is faster.

The experimental results on the data sets arrhythmia and statlog (Landsat Satellite), which have many condition attributes, shown in Figure 1(c) and Figure 1(e), indicate that the computational time of both IUD and 'non-incr-del' decreases as the cardinality of the deleted attribute set increases.

In addition, the results on winequality (white), statlog (Landsat Satellite) and mushroom, which contain more objects, show that the computational time of IUA decreases as the cardinality of the added attribute set increases. Intuitively, the time should go up with the cardinality of the added attribute set. However, the boundary regions with respect to the remaining attributes may shrink after more attributes have been deleted; since IUA only re-examines boundary objects, its computational time may therefore decrease.

5.2. Two subsets are merged

This section focuses on updating approximations when two subsets of objects are merged. Analogously, the computational time is recorded to compare the incremental approach IUM with the non-incremental method.

In each experiment, the universe of a data set is divided into two subsets: one subset contains a percentage of the universe ranging from 50% to 90%, and the other subset contains the remaining objects.

The experimental results on the six data sets are shown in Figure 2. In each sub-figure, the x-coordinate is the percentage of the first subset in the universe, and the y-coordinate is the computational time of updating approximations when the other subset is merged into it.

Fig. 2.

A comparison of incremental and non-incremental of updating approximations when two subsets of objects are merged

Clearly, Figure 2(a)-(f) shows that IUM is more efficient than the non-incremental approach when two subsets are combined into the universe, and that the computational time of IUM decreases as the percentage of the first subset in the universe increases.

6. Conclusion and future work

An information system commonly evolves in a dynamic data environment, so it is important to accelerate the calculation of approximations in the CDRSA in such an environment. In this paper, we focused on incremental approaches under the variation of attributes or objects in an IODS. The properties of the confidential dominating/dominated sets under the variation of attributes or objects were discussed. Furthermore, incremental approaches for updating approximations were proposed for the cases where attributes or objects change dynamically. The experimental results show that the incremental approaches for updating the approximations of the CDRSA are feasible and effectively reduce the computational time.

Our future work will focus on speeding up these approaches on the distributed computing environment for massive data sets.

Acknowledgements

This work is supported by the National Natural Science Foundation of China (Nos. 61272060 and 61572091), the National Key Research and Development Program of China (No. 2016YFB1000905), the Scientific and Technological Research Program of Chongqing Municipal Education Commission (No. KJ1600933), and the Scientific Research Foundation of the Sichuan Provincial Education Department (No. 16ZA0329).

References

1.Z Pawlak, Rough sets, International Journal of Computer and Information Sciences, Vol. 11, No. 5, 1982, pp. 341-356. http://dx.doi.org/10.1007/bf01001956
2.J Li, Y Ren, C Mei, et al., A comparative study of multigranulation rough sets and concept lattices via rule acquisition, Knowledge-Based Systems, Vol. 85, 2015, pp. 1-23.
3.CH Weng and CK Huang, Knowledge discovery of customer purchasing intentions by plausible-frequent itemsets from uncertain data, Applied Intelligence, Vol. 43, No. 3, 2015, pp. 598-613. http://dx.doi.org/10.1007/s10489-015-0669-7
4.TR Li, D Ruan, YJ Shen, et al., A new weighting approach based on rough set theory and granular computing for road safety indicator analysis, Computational Intelligence, Vol. 32, No. 4, 2016, pp. 517-534. http://dx.doi.org/10.1111/coin.12061
5.R Bello and JL Verdegay, Knowledge Engineering for Rough Sets Based Decision-Making Models, International Journal of Intelligent Systems, Vol. 29, No. 9, 2014, pp. 823-835. http://dx.doi.org/10.1002/int.21665
6.JT Yao and N Azam, Web-Based Medical Decision Support Systems for Three-Way Medical Decision Making With Game-Theoretic Rough Sets, IEEE Transactions on Fuzzy Systems, Vol. 23, No. 1, 2015, pp. 3-15. http://dx.doi.org/10.1109/tfuzz.2014.2360548
7.TR Li, C Luo, HM Chen, et al. (editors), The Principles and Methodologies of Big Data Mining—-From the Perspectives of Granular Computing and Rough Sets, Science Press, Beijing, 2016. (In Chinese)
8.JB Zhang, Y Zhu, Y Pan, et al., Efficient Parallel Boolean Matrix Based Algorithms for Computing Composite Rough Set Approximations, Information Sciences, Vol. 329, 2016, pp. 287-302. http://dx.doi.org/10.1016/j.ins.2015.09.022
9.S Greco, B Matarazzo, and R Słowinski, Rough approximation by dominance relations, International Journal of Intelligent Systems, Vol. 17, No. 2, 2002, pp. 153-171. http://dx.doi.org/10.1002/int.10014
10.S Greco, B Matarazzo, and R Słowinski, Rough sets methodology for sorting problems in presence of multiple attributes and criteria, European Journal of Operational Research, Vol. 138, No. 2, 2002, pp. 247-259. http://dx.doi.org/10.1016/s0377-2217(01)00244-2
11.J Błaszczynski, S Greco, and R Słowinski, Multi-criteria classification: A new scheme for application of dominance-based decision rules, European Journal of Operational Research, Vol. 181, No. 3, 2007, pp. 1030-1044. http://dx.doi.org/10.1016/j.ejor.2006.03.004
12.T Lee, S Yeung, and C Tsang, Rough sets and ordinal reducts, Soft Computing, Vol. 10, No. 1, 2006, pp. 27-33. http://dx.doi.org/10.1007/s00500-005-0460-5
13.HM Chen, TR Li, Y Cai, et al., Parallel attribute reduction in dominance-based neighborhood rough set, Information Sciences, Vol. 373, 2016, pp. 351-368. http://dx.doi.org/10.1016/j.ins.2016.09.012
14.JJH Liou and GH Tzeng, A Dominance-based Rough Set Approach to customer behavior in the airline market, Information Sciences, Vol. 180, No. 11, 2010, pp. 2230-2238. http://dx.doi.org/10.1016/j.ins.2010.01.025
15.S Greco, B Matarazzo, and R Słowinski, A New Rough Set Approach to Evaluation of Bankruptcy Risk, Operational Tools in the Management of Financial Risks, Springer, US, 1998, pp. 121-136. http://dx.doi.org/10.1007/978-1-4615-5495-0_8
16.J Karami, A Alimohammadi, and T Seifouri, Water quality analysis using a variable consistency dominance-based rough set approach, Computers Environment and Urban Systems, Vol. 43, No. 1, 2014, pp. 25-33. http://dx.doi.org/10.1016/j.compenvurbsys.2013.09.005
17.S Greco, B Matarazzo, and R Słowinski, Handling Missing Values in Rough Set Analysis of Multi-attribute and Multi-criteria Decision Problems, N Zhong, A Skowron, and S Ohsuga (editors), RSFDGrC 1999. LNCS (LNAI), Springer, Heidelberg, Vol. 1711, pp. 146-157. http://dx.doi.org/10.1007/978-3-540-48061-7_19
18.S Greco, B Matarazzo, and R Słowinski, Dealing with Missing Data in Rough Set Analysis of Multi-Attribute and Multi-Criteria Decision Problems, Decision Making: Recent Developments and Worldwide Applications, Springer, US, 2000. http://dx.doi.org/10.1007/978-1-4757-4919-9_20
19.J Błaszczynski, R Słowinski, and M Szelag, Induction of Ordinal Classification Rules from Incomplete Data, Rough Sets and Current Trends in Computing, Springer, Berlin Heidelberg, 2012, pp. 56-65. http://dx.doi.org/10.1007/978-3-642-32115-3_6
20.YQ He and SS Hu, Rough analysis method of multi-attribute decision making with incomplete information, Journal of Systems Engineering, Vol. 19, No. 2, 2004, pp. 44-57. (in Chinese)
21.MW Shao and WX Zhang, Dominance relation and rules in an incomplete ordered information system, International Journal of Intelligent Systems, Vol. 20, No. 1, 2005, pp. 13-27. http://dx.doi.org/10.1002/int.20051
22.ML Hu and SF Liu, An Analysis Method of Rough Multi-attribute Decision Making Based on Limited Extended Dominance Relation, Systems Engineering, Vol. 24, No. 4, 2006, pp. 106-110. (in Chinese)
23.ML Hu and SF Liu, Rough analysis method of multi-attribute decision making based on generalized extended dominance relation, Control and Decision, Vol. 22, No. 12, 2007, pp. 1347-1351. (in Chinese)
24.HM Chen, TR Li, and D Ruan, Maintenance of approximations in incomplete ordered decision systems while attribute values coarsening or refining, Knowledge-Based Systems, Vol. 31, No. 7, 2012, pp. 140-161. http://dx.doi.org/10.1016/j.knosys.2012.03.001
25.WS Du and BQ Hu, Dominance-based rough set approach to incomplete ordered information systems, Information Sciences, Vol. 346–347, 2016, pp. 106-129. http://dx.doi.org/10.1016/j.ins.2016.01.098
26.XB Yang, JY Yang, C Wu, et al., Dominance-based rough set approach and knowledge reductions in incomplete ordered information system, Information Sciences, Vol. 178, 2008, pp. 1219-1234. http://dx.doi.org/10.1016/j.ins.2007.09.019
27.GZ Luo, XJ Yang, and DQ Zhou, Rough Analysis Model of Multi-attribute Decision Making Based on Limited Dominance Relation, Chinese Journal of Management Science, Vol. 18, No. 4, 2009, pp. 391-396. (in Chinese)
28.XB Yang and HL Dou, Valued Dominance-Based Rough Set Approach to Incomplete Information System, Data and Knowledge Engineering, Vol. 68, No. 11, 2011, pp. 1331-1347. http://dx.doi.org/10.1016/j.datak.2009.07.007
29.GL Gou, GY Wang, J Li, et al., Confidential dominance relation based rough approximation model, Control and Decision, Vol. 29, No. 7, 2014, pp. 1325-1329. (in Chinese)
30.CC Chan, A rough set approach to attribute generalization in data mining, Information Sciences, Vol. 107, No. 1–4, 1998, pp. 169-176. http://dx.doi.org/10.1016/s0020-0255(97)10047-0
31.TR Li, D Ruan, W Geert, et al., A rough sets based characteristic relation approach for dynamic attribute generalization in data mining, Knowledge-Based Systems, Vol. 20, No. 5, 2007, pp. 485-494. http://dx.doi.org/10.1016/j.knosys.2007.01.002
32.JB Zhang, TR Li, D Ruan, et al., Rough sets based matrix approaches with dynamic attribute variation in set-valued information systems, International Journal of Approximate Reasoning, Vol. 53, No. 4, 2012, pp. 620-635. http://dx.doi.org/10.1016/j.ijar.2012.01.001
33.SY Li, TR Li, and D Liu, Incremental updating approximations in dominance-based rough sets approach under the variation of the attribute set, Knowledge-Based Systems, Vol. 40, No. 1, 2013, pp. 17-26. http://dx.doi.org/10.1016/j.knosys.2012.11.002
34.YY Zhang, TR Li, C Luo, et al., Incremental Updating of Rough Approximations in Interval-valued Information Systems under Attribute Generalization, Information Sciences, Vol. 373, 2016, pp. 461-475. http://dx.doi.org/10.1016/j.ins.2016.09.018
35.D Liu, TR Li, and JB Zhang, Incremental updating approximations in probabilistic rough sets under the variation of attributes, Knowledge-Based Systems, Vol. 73, 2015, pp. 81-96. http://dx.doi.org/10.1016/j.knosys.2014.09.008
36.XB Yang, Y Qi, HL Yu, et al., Updating multigranulation rough approximations with increasing of granular structures, Knowledge-Based Systems, Vol. 64, 2014, pp. 59-69. http://dx.doi.org/10.1016/j.knosys.2014.03.021
37.N Shan and W Ziarko, Data-based acquisition and incremental modification of classification rules, Computational Intelligence, Vol. 11, No. 2, 1995, pp. 357-370. http://dx.doi.org/10.1111/j.1467-8640.1995.tb00038.x
38.Z Zheng and GY Wang, RRIA: A rough set and rule tree-based incremental knowledge acquisition algorithm, Fundamenta Informaticae, Vol. 59, No. 3, 2004, pp. 299-313.
39.JB Zhang, TR Li, D Ruan, et al., A parallel method for computing rough set approximations, Information Sciences, Vol. 194, No. 5, 2012, pp. 209-223. http://dx.doi.org/10.1016/j.ins.2011.12.036
40.JB Zhang, TR Li, D Ruan, et al., Neighborhood rough sets for dynamic data mining, International Journal of Intelligent Systems, Vol. 27, No. 4, 2012, pp. 317-342. http://dx.doi.org/10.1002/int.21523
41.C Luo, TR Li, Z Yi, et al., Matrix approach to decision-theoretic rough sets for evolving data, Knowledge-Based Systems, Vol. 99, 2016, pp. 123-134. http://dx.doi.org/10.1016/j.knosys.2016.01.042
42.C Luo, TR Li, HM Chen, et al., Efficient updating of probabilistic approximations with incremental objects, Knowledge-Based Systems, Vol. 109, 2016, pp. 71-83. http://dx.doi.org/10.1016/j.knosys.2016.06.025
43.SY Li, TR Li, and J Hu, Update of approximations in composite information systems, Knowledge-Based Systems, Vol. 83, 2015, pp. 138-148. http://dx.doi.org/10.1016/j.knosys.2015.03.016
44.SY Li, TR Li, and D Liu, Dynamic Maintenance of Approximations in Dominance-Based Rough Set Approach under the Variation of the Object Set, International Journal of Intelligent Systems, Vol. 28, No. 8, 2013, pp. 729-751. http://dx.doi.org/10.1002/int.21599
45.SY Li, TR Li, Z Zhang, et al., Parallel computing of approximations in dominance-based rough sets approach, Knowledge-Based Systems, Vol. 87, 2015, pp. 102-111. http://dx.doi.org/10.1016/j.knosys.2015.05.003
46.J Błaszczynski and R Słowinski, Incremental induction of decision rules from dominance-based rough approximations, Electronic Notes in Theoretical Computer Science, Vol. 82, No. 4, 2003, pp. 40-51. http://dx.doi.org/10.1016/s1571-0661(04)80704-7
47.S Wang, TR Li, C Luo, et al., Efficient updating rough approximations with multi-dimensional variation of ordered data, Information Sciences, Vol. 372, 2016, pp. 690-708. http://dx.doi.org/10.1016/j.ins.2016.08.044
48.AP Zeng, TR Li, J Hu, et al., Dynamical updating fuzzy rough approximations for hybrid data under the variation of attribute values, Information Sciences, Vol. 378, 2017, pp. 363-388. http://dx.doi.org/10.1016/j.ins.2016.07.056
49.HM Chen, TR Li, D Ruan, et al., A rough-set-based incremental approach for updating approximations under dynamic maintenance environments, IEEE Transactions on Knowledge and Data Engineering, Vol. 25, No. 2, 2013, pp. 274-284. http://dx.doi.org/10.1109/tkde.2011.220
50.M Lichman, UCI Machine Learning Repository, University of California, School of Information and Computer Science, Irvine, CA, 2013. [http://archive.ics.uci.edu/ml]
Journal
International Journal of Computational Intelligence Systems
Volume-Issue
10 - 1
Pages
540 - 554
Publication Date
2017/01/01
ISSN (Online)
1875-6883
ISSN (Print)
1875-6891
