Granularity, Multi-valued Logic,
Bayes’ Theorem and Rough Sets
Zdzislaw Pawlak
Institute for Theoretical and Applied Informatics
Polish Academy of Sciences
ul. Baltycka 5, 44 000 Gliwice, Poland
e-mail: zpw@ii.pw.edu.pl
Abstract. Granularity of knowledge has recently attracted the attention of many researchers. This paper concerns this issue from the rough set perspective. Granularity is inherently connected with the foundations of rough set theory. The concept of the rough set hinges on the classification of objects of interest into similarity classes, which form the elementary building blocks (atoms, granules) of knowledge. These granules are employed to define the basic concepts of the theory. In the paper the basic concepts of rough set theory will be defined and their granular structure will be pointed out. Next, the consequences of granularity of knowledge for reasoning about imprecise concepts will be discussed. In particular, the relationship between some ideas of Łukasiewicz's multi-valued logic, Bayes' Theorem and rough sets will be pointed out.
1 Introduction
This paper is an extended version of [15].
Information (knowledge) granulation, discussed recently by Prof. Zadeh [27, 28, 29], seems to be a very important issue for computing science, logic, philosophy and other fields.
In this note we are going to discuss some problems connected with granularity of knowledge in the context of rough sets. A first discussion of granulation of knowledge in connection with rough and fuzzy sets was presented by Dubois and Prade in [8]. More recently, interesting studies of information granulation in the framework of rough sets can be found in Polkowski and Skowron [16] and in Skowron and Stepaniuk [20].
In rough set theory we assume that some information is associated with every object, and that objects can be "seen" only through the accessible information. Hence, objects with the same information cannot be discerned and appear as the same. As a result, indiscernible objects of the universe form clusters of indistinguishable objects (granules, atoms, etc.). Thus, from the rough set point of view, the granularity of knowledge is due to the indiscernibility of objects caused by a lack of sufficient information about them. Consequently, granularity and indiscernibility are closely connected, and the concept of indiscernibility seems to be prior to that of granularity.

The current state of rough set theory and its applications can be found in [19].
Indiscernibility has attracted the attention of philosophers for a long time; its first formulation can be attributed to Leibniz (cf. Forrest [9]) and is known as the principle of "the identity of indiscernibles". The principle says that no two objects have exactly the same properties, or, in other words, that if all properties of objects x and y are the same, then x and y are identical.
But what are "properties of objects", and what does "all properties" mean? Much philosophical discussion has been devoted to answering these questions (cf. e.g., Black [3], Forrest [9]), but we will refrain here from philosophical debate. Let us observe only that Leibniz's approach identifies indiscernibility with identity. The latter is obviously an equivalence relation, i.e., it leads to a partition of the universe into equivalence classes (granules) of objects which are indistinguishable in view of the assumed properties. Thus, in the rough set approach, granulation is a consequence of the Leibniz principle.
It is worthwhile to mention that indiscernibility can also be viewed in a wider context, as pointed out by Williamson [25]: "Intelligent life requires the ability to discriminate, but not with unlimited precision". This is a very interesting issue; however, it lies outside the scope of this paper.
In rough set theory we assume an empiricist approach, i.e., we suppose that properties are simply empirical data which can be obtained as a result of measurements, observations, computations, etc., and are expressed by values of a fixed, finite set of attributes; that is, properties are attribute-value pairs, like (size, small), (color, red), etc. The idea could also be expressed in more general terms by taking as a starting point not a set of specific attributes but an abstract equivalence relation; however, the attribute-based approach seems more intuitive.
An equivalence relation is the simplest formalization of the indiscernibility relation and is sufficient for many applications. However, it seems more interesting to assume that the indiscernibility relation is formalized as a tolerance relation; in this case transitivity of indiscernibility is dropped, for if x is indiscernible from y and y is indiscernible from z, then x is not necessarily indiscernible from z. Many authors have proposed a tolerance relation as a basis for rough set theory (cf. e.g., Skowron and Stepaniuk [19]). This, however, causes some mathematical complications as well as philosophical questions, because it leads to vague granules, i.e., granules without sharp boundaries, closely related to the boundary-line approach to vagueness (cf. e.g., Chatterjee [7], Sorensen [22]).
Besides the tolerance relation, more sophisticated mathematical models of indiscernibility have also been proposed as a basis for rough set theory (cf. e.g., Krawiec, Slowinski, and Vanderpooten [11], Yao and Wong [26], Ziarko [30]). Interested readers are advised to consult the references mentioned above, but for the sake of simplicity we will adhere in this paper to the equivalence relation as a mathematical formalization of the indiscernibility relation.
Since granules of knowledge can be considered as the basic building blocks of knowledge about the universe, it seems that a natural mathematical model of granulated knowledge can be based on ideas similar to those used in the mereology proposed by Leśniewski [12], in which "part of" is the basic relation of the theory. The axioms of mereology, in particular in the version proposed by Suppes [23], seem to be a natural candidate for this purpose. Moreover, rough mereology, an extension of classical mereology proposed by Polkowski and Skowron in [17, 18], seems to be exceptionally well suited to analyzing granules of knowledge without sharp boundaries (cf. Polkowski and Skowron [16], Skowron and Stepaniuk [20]).
It is also worthwhile to mention in this context that granularity of knowledge has also been pursued in quantum physics; its relation to fuzzy sets and rough sets was first mentioned by Cattaneo [5, 6]. Recently, a very interesting study of rough sets, granularity and the foundations of mathematics and physics has been carried out by Apostoli and Kanda [2].
Besides, it is interesting to observe that computations and measurements are very good examples of granularity of information, for they are in fact based not on real numbers but on intervals determined by the accuracy of the computation or measurement.
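For instance (a hypothetical numeric sketch, not taken from the paper), fixing a measurement accuracy partitions the reals into intervals, and values falling into the same interval become indiscernible:

```python
# A tiny illustration of granularity induced by finite measurement accuracy:
# values that land in the same interval of width `accuracy` are indiscernible.
# The numbers below are hypothetical and serve only as an example.

def granule_index(value: float, accuracy: float) -> int:
    """Index of the interval [k*accuracy, (k+1)*accuracy) containing `value`."""
    return int(value // accuracy)

if __name__ == "__main__":
    acc = 0.1  # measurements are reliable only to one decimal place
    print(granule_index(3.141, acc) == granule_index(3.149, acc))  # True: same granule
    print(granule_index(3.141, acc) == granule_index(3.210, acc))  # False: different granules
```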
2 Basic Philosophy of Rough Sets
The rough set philosophy is founded on the assumption that with every object of the universe of discourse we associate some information (data, knowledge). For example, if the objects are patients suffering from a certain disease, the symptoms of the disease form the information about the patients. Objects characterized by the same information are indiscernible (similar) in view of the available information about them. The indiscernibility relation generated in this way is the mathematical basis of rough set theory.
Any set of all indiscernible (similar) objects is called an elementary concept, and forms a basic granule (atom) of knowledge about the universe. Any union of some elementary concepts is referred to as a crisp (precise) concept; otherwise the set is rough (imprecise, vague). Consequently, each rough concept has boundary-line cases, i.e., objects which cannot be classified with certainty either as members of the concept or of its complement. Obviously, crisp concepts have no boundary-line elements at all. This means that boundary-line cases cannot be properly classified by employing the available knowledge.
Thus, the assumption that objects can be "seen" only through the information available about them leads to the view that knowledge has a granular structure. As a consequence, vague concepts, in contrast to precise concepts, cannot be characterized in terms of elementary concepts. Therefore, in the proposed approach we assume that any vague concept is replaced by a pair of precise concepts called the lower and the upper approximation of the vague concept. The lower approximation consists of all elementary concepts which are surely included in the concept, whereas the upper approximation contains all elementary concepts which are possibly included in the concept. Obviously, the difference between the upper and the lower approximation constitutes the boundary region of the vague concept. The lower and the upper approximation are the two basic operations of rough set theory.
3 Indiscernibility and Granularity
As mentioned in the introduction, the starting point of rough set theory is the indiscernibility relation, generated by information about the objects of interest. The indiscernibility relation is intended to express the fact that, due to a lack of knowledge, we are unable to discern some objects using the available information. This means that, in general, we are unable to deal with single objects; instead, we have to consider clusters of indiscernible objects as the fundamental concepts of knowledge.
We now present the above considerations more formally.

Suppose we are given two finite, non-empty sets $U$ and $A$, where $U$ is the universe and $A$ is a set of attributes. With every attribute $a \in A$ we associate a set $V_a$ of its values, called the domain of $a$. The pair $S = (U, A)$ will be called an information system. Any subset $B$ of $A$ determines a binary relation $I_B$ on $U$, which will be called an indiscernibility relation, defined as follows:

$$x \, I_B \, y \ \text{ if and only if } \ a(x) = a(y) \text{ for every } a \in B,$$

where $a(x)$ denotes the value of attribute $a$ for element $x$.
Obviously, $I_B$ is an equivalence relation. The family of all equivalence classes of $I_B$, i.e., the partition determined by $B$, will be denoted by $U/I_B$, or simply $U/B$; the equivalence class of $I_B$ (i.e., the block of the partition $U/B$) containing $x$ will be denoted by $B(x)$.

If $(x, y)$ belongs to $I_B$ we will say that $x$ and $y$ are B-indiscernible. Equivalence classes of the relation $I_B$ (or blocks of the partition $U/B$) are referred to as B-elementary concepts or B-granules.
In the rough set approach the elementary concepts are the basic building
blocks (concepts) of our knowledge about reality.
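To make these definitions concrete, the following minimal sketch (the universe, attributes and values are hypothetical toy data, not taken from the paper) builds an information system $S = (U, A)$ and computes the B-granules, i.e., the equivalence classes of $I_B$:

```python
# A minimal sketch of an information system S = (U, A) and its B-granules.
# All object names, attributes and values are hypothetical toy data.
from collections import defaultdict

U = ["x1", "x2", "x3", "x4", "x5", "x6"]
A = {
    "size":  {"x1": "small", "x2": "small", "x3": "big", "x4": "big",  "x5": "small", "x6": "big"},
    "color": {"x1": "red",   "x2": "red",   "x3": "red", "x4": "blue", "x5": "blue",  "x6": "blue"},
}

def granules(U, A, B):
    """Return U/B, the family of B-elementary concepts (B-granules):
    objects with identical values on every attribute in B form one class."""
    classes = defaultdict(set)
    for x in U:
        signature = tuple(A[a][x] for a in sorted(B))  # the information "seen" about x
        classes[signature].add(x)
    return list(classes.values())

if __name__ == "__main__":
    for g in granules(U, A, {"size", "color"}):
        print(sorted(g))        # e.g. ['x1', 'x2']: x1 and x2 are B-indiscernible
    for g in granules(U, A, {"size"}):
        print(sorted(g))        # a smaller B yields coarser granules
```

Dropping an attribute from $B$ coarsens the partition: fewer attributes mean less information about the objects and hence larger granules.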
4 Approximations and Granularity
Now the indiscernibility relation will be used to define the two basic operations of rough set theory:
$$B_*(X) = \bigcup_{x \in U} \{\, B(x) : B(x) \subseteq X \,\},$$

$$B^*(X) = \bigcup_{x \in U} \{\, B(x) : B(x) \cap X \neq \emptyset \,\},$$

assigning to every $X \subseteq U$ the two sets $B_*(X)$ and $B^*(X)$, called the B-lower and the B-upper approximation of $X$, respectively.
Hence, the B-lower approximation of a concept is the union of all B-
granules that are included in the concept, whereas the B-upper approximation
of a concept is the union of all B-granules that have a nonempty intersection
with the concept. The set
$$BN_B(X) = B^*(X) - B_*(X)$$

will be referred to as the B-boundary region of $X$.
If the boundary region of $X$ is the empty set, i.e., $BN_B(X) = \emptyset$, then $X$ is crisp (exact) with respect to $B$; in the opposite case, i.e., if $BN_B(X) \neq \emptyset$, $X$ is referred to as rough (inexact) with respect to $B$.
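The two approximation operators and the boundary region translate directly into set operations over the B-granules. The sketch below is a hypothetical illustration (the partition $U/B$ and the concept $X$ are toy data) of the definitions above:

```python
# A minimal sketch of the B-lower and B-upper approximations and the boundary
# region of a concept X. The partition U/B and the set X are toy data.

def lower_approximation(granules, X):
    """B_*(X): union of all B-granules entirely included in X."""
    return set().union(*(g for g in granules if g <= X))

def upper_approximation(granules, X):
    """B^*(X): union of all B-granules with a nonempty intersection with X."""
    return set().union(*(g for g in granules if g & X))

def boundary_region(granules, X):
    """BN_B(X) = B^*(X) - B_*(X); empty if and only if X is crisp w.r.t. B."""
    return upper_approximation(granules, X) - lower_approximation(granules, X)

if __name__ == "__main__":
    UB = [{"x1", "x2"}, {"x3"}, {"x4", "x6"}, {"x5"}]   # toy partition U/B
    X = {"x1", "x3", "x4"}                              # a concept of interest
    print(sorted(lower_approximation(UB, X)))   # ['x3']: granules surely in X
    print(sorted(upper_approximation(UB, X)))   # ['x1', 'x2', 'x3', 'x4', 'x6']
    print(sorted(boundary_region(UB, X)))       # non-empty, so X is rough w.r.t. B
```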
Rough sets can also be defined using a rough membership function, defined as

$$\mu_X^B(x) = \frac{\mathrm{card}(B(x) \cap X)}{\mathrm{card}(B(x))}.$$

Obviously,

$$\mu_X^B(x) \in [0, 1].$$
The value of the membership function $\mu_X^B(x)$ is a kind of conditional probability, and can be interpreted as the degree of certainty to which $x$ belongs to $X$ (or $1 - \mu_X^B(x)$, as a degree of uncertainty).
The rough membership function can be used to define the approximations and the boundary region of a set, as shown below:

$$B_*(X) = \{\, x \in U : \mu_X^B(x) = 1 \,\},$$

$$B^*(X) = \{\, x \in U : \mu_X^B(x) > 0 \,\},$$

$$BN_B(X) = \{\, x \in U : 0 < \mu_X^B(x) < 1 \,\}.$$
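The membership-based characterization can be checked with the same toy data as before; the sketch below (hypothetical values only) computes $\mu_X^B(x)$ and recovers the approximations and the boundary region from it:

```python
# A minimal sketch of the rough membership function mu_X^B(x) and of the
# membership-based characterization of the approximations. Toy data only.

def rough_membership(granule_of, X, x):
    """mu_X^B(x) = card(B(x) & X) / card(B(x)), where B(x) is the granule of x."""
    Bx = granule_of[x]
    return len(Bx & X) / len(Bx)

if __name__ == "__main__":
    UB = [{"x1", "x2"}, {"x3"}, {"x4", "x6"}, {"x5"}]   # toy partition U/B
    granule_of = {x: g for g in UB for x in g}          # maps x to its granule B(x)
    X = {"x1", "x3", "x4"}

    mu = {x: rough_membership(granule_of, X, x) for x in sorted(granule_of)}
    print(mu)   # e.g. x3 -> 1.0 (lower), x1 -> 0.5 (boundary), x5 -> 0.0 (outside)

    lower    = {x for x, m in mu.items() if m == 1}
    upper    = {x for x, m in mu.items() if m > 0}
    boundary = {x for x, m in mu.items() if 0 < m < 1}
    print(sorted(lower), sorted(upper), sorted(boundary))
```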
The rough membership function can be generalized as follows (cf. Polkowski and Skowron [17]):

$$\mu(X, Y) = \frac{\mathrm{card}(X \cap Y)}{\mathrm{card}(X)},$$

where $X, Y \subseteq U$, $X \neq \emptyset$, and $\mu(\emptyset, Y) = 1$.

The function $\mu(X, Y)$ is an example of a rough inclusion [14] and expresses the degree to which $X$ is included in $Y$. Obviously, if $\mu(X, Y) = 1$, then $X \subseteq Y$. If $X$ is included in $Y$ to a degree $k$, we will write $X \subseteq_k Y$.

The rough inclusion function can be interpreted as a generalization of the mereological relation "part of", and reads as "part in a degree".
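As a final small sketch (toy sets only, with a hypothetical function name), the rough inclusion is the same quotient applied to arbitrary subsets of $U$:

```python
# A minimal sketch of the rough inclusion mu(X, Y): the degree to which X is
# included in Y, with mu(empty set, Y) = 1 by convention. Toy sets only.

def rough_inclusion(X, Y):
    """mu(X, Y) = card(X & Y) / card(X) for nonempty X; 1.0 for X = empty set."""
    if not X:
        return 1.0
    return len(X & Y) / len(X)

if __name__ == "__main__":
    X = {"x1", "x2", "x3"}
    Y = {"x2", "x3", "x4"}
    print(rough_inclusion(X, Y))      # 0.666...: X is included in Y to degree 2/3
    print(rough_inclusion(X, X))      # 1.0: full inclusion
    print(rough_inclusion(set(), Y))  # 1.0 by convention
```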

References

Forrest, P.: The Identity of Indiscernibles. Stanford Encyclopedia of Philosophy.

Brezillon, P., et al. (eds.): Lecture Notes in Artificial Intelligence. Springer.

Ziarko, W.: Variable precision rough set model.

Skowron, A., Stepaniuk, J.: Tolerance approximation spaces.