In this paper, a form of information entropy is presented, in an axiomatic way, in both the crisp and the fuzzy setting. Information entropy is the unavailability of information about a crisp or fuzzy event. It uses a measure of information defined without any probability or fuzzy measure: for this reason it is called *general information*.

The original setting of entropy was statistical mechanics: in [

Now, we recall this definition. Let $X$ be a discrete random variable taking the values $x_1,\ldots,x_n$ with probabilities $p_1,\ldots,p_n$, $\sum_{i=1}^{n}p_i=1$.

Basic notions and notations can be found in [

Shannon’s entropy is

$$H(X)=-\sum_{i=1}^{n}p_i\log p_i$$

and it is a measure of the uncertainty of the system described by $X$.
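As a concrete aside (ours, not part of the original text), Shannon's entropy of a finite probability vector can be evaluated directly; the function name and the sample distribution below are our own choices:

```python
import math

def shannon_entropy(p, base=2):
    """Shannon's entropy H(X) = -sum_i p_i log(p_i) of a probability vector p."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p_i = 0 contribute nothing, by the convention x log x -> 0.
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# The uniform distribution on 4 outcomes gives the maximal value log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0
```

With base = 2 the entropy is measured in bits; the natural logarithm gives nats.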

Another entropy was introduced by Rényi, called the entropy of order $\alpha$, $\alpha>0$, $\alpha\neq 1$:

$$H_{\alpha}(X)=\frac{1}{1-\alpha}\log\Big(\sum_{i=1}^{n}p_i^{\alpha}\Big),$$

and it was used in many problems [

In generalizing Boltzmann-Gibbs statistical mechanics, Tsallis’s entropy $S_q(X)=\frac{1}{q-1}\big(1-\sum_{i=1}^{n}p_i^{q}\big)$, $q\neq 1$, was introduced [
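As a numerical illustration (our own, with Tsallis’s constant $k$ set to 1), the sketch below evaluates the Rényi and Tsallis entropies and checks that both tend to Shannon’s entropy, in nats, as the order tends to 1:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(X) = log(sum_i p_i^alpha) / (1 - alpha), alpha > 0, alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(X) = (1 - sum_i p_i^q) / (q - 1), with k = 1 and q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
shannon = -sum(pi * math.log(pi) for pi in p)  # Shannon's entropy in nats

# Both families recover Shannon's entropy in the limit of order -> 1.
print(abs(renyi_entropy(p, 1.0001) - shannon) < 1e-3)    # True
print(abs(tsallis_entropy(p, 1.0001) - shannon) < 1e-3)  # True
```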

We note that all entropies above are defined through a probability measure.

In 1967 J. Kampé de Fériet and B. Forte gave a new definition of information for a crisp event, from an axiomatic point of view, without using probability [

In this paper we propose a class of measures for the entropy of the information of a crisp or fuzzy event, without using any probability or fuzzy measure.

We think that avoiding probability and fuzzy measures in the definition of the entropy of the information of an event can be a useful generalization in applications in which the probability is not known.

So, in this note, we use the theory explained by Khinchin in [

The paper is organized as follows. Section 2 contains some preliminaries about general information for crisp and fuzzy sets. The definitions of entropy and its measure are presented in Section 3. Section 4 is devoted to an application. Conclusions are drawn in Section 5.

Let $\Omega$ be an abstract space and $\mathcal{A}$ a $\sigma$-algebra of its subsets. A measure of general information is a mapping $J(\cdot):\mathcal{A}\rightarrow[0,+\infty]$

such that

1) $A_1\subseteq A_2 \Rightarrow J(A_1)\geq J(A_2), \quad \forall A_1,A_2\in\mathcal{A}$;

2) $J(\emptyset)=+\infty, \quad J(\Omega)=0$.
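For illustration only (this toy model is ours, not from the original), a mapping with properties 1) and 2) can be built on the subsets of a finite space without invoking any probability or fuzzy measure:

```python
import math

def general_information(A, omega):
    """A toy general information on subsets of a finite space omega:
    antitone (A1 subset of A2 implies J(A1) >= J(A2)),
    with J(empty set) = +infinity and J(omega) = 0."""
    A, omega = set(A), set(omega)
    assert A <= omega, "A must be a subset of omega"
    if not A:
        return math.inf
    # len(omega) - len(A) decreases as A grows and vanishes on omega itself.
    return float(len(omega) - len(A))

omega = {1, 2, 3, 4}
print(general_information(set(), omega))   # inf
print(general_information(omega, omega))   # 0.0
print(general_information({1, 2}, omega))  # 2.0
```

Any strictly decreasing function of the cardinality would serve equally well; the point is only that properties 1) and 2) do not require a probability measure.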

In an analogous way [ ], a measure of general information for fuzzy sets is a mapping $J(\cdot):\mathcal{F}\rightarrow[0,+\infty]$, where $\mathcal{F}$ is a family of fuzzy sets of $\Omega$,

such that

1) $F_1\subseteq F_2 \Rightarrow J(F_1)\geq J(F_2), \quad \forall F_1,F_2\in\mathcal{F}$;

2) $J(\emptyset)=+\infty, \quad J(\Omega)=0$.

Using the general information recalled in Section 2, in this section a new form of information entropy will be introduced, which will be called general information entropy. Information entropy means the measure of unavailability of a given information.

In the crisp setting as in Section 2, given a measure of general information $J(\cdot)$,

Definition 3.1. General information entropy for crisp sets is a mapping

1) monotonicity:

2) universal values:

The universal values can be considered a consequence of monotonicity.

So, it is sufficient to require only monotonicity: any mapping satisfying 1) will be called general information entropy.

It is possible to extend the definition above to fuzzy sets.

Given a measure of general information for fuzzy sets as in Section 2,

Definition 3.2. General information entropy for fuzzy sets is a mapping

1) monotonicity:

2) universal values:

The universal values can be considered a consequence of monotonicity.

So, also in the fuzzy setting, any mapping satisfying 1) will be called general information entropy.

In this section, an application of information entropy will be presented: it concerns the value of the information entropy of the union of two disjoint crisp sets. The procedure for solving this problem is the following: first, the relevant properties are stated; second, these properties are translated into functional equations; in this way, it is possible to solve the resulting system [

It is possible to extend this application also to the union of two disjoint fuzzy sets.

In the crisp setting as in Section 2, let


Information entropy of the union

where

Setting:

We are looking for a continuous function

Proposition 4.1. A class of solutions of the system

where

Proof. The proof is based on the application of the theorem of Cho-Hsing Ling [

From (1) and (2), the information entropy of the union of two disjoint sets is expressed by

where
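Ling’s theorem represents suitable continuous associative operations through an additive generator $g$, in the form $\Phi(x,y)=g^{-1}(g(x)+g(y))$. The sketch below is our own numerical illustration of this representation; the generator $g(t)=t^{2}$ is an arbitrary choice and is not the class obtained in Proposition 4.1:

```python
def phi(x, y, g=lambda t: t * t, g_inv=lambda s: s ** 0.5):
    """Operation built from an additive generator g, in the spirit of
    Ling's representation theorem: phi(x, y) = g_inv(g(x) + g(y))."""
    return g_inv(g(x) + g(y))

a, b, c = 0.7, 1.2, 2.5
# Commutativity and associativity follow automatically from the generator form.
print(abs(phi(a, b) - phi(b, a)) < 1e-12)                  # True
print(abs(phi(phi(a, b), c) - phi(a, phi(b, c))) < 1e-12)  # True
```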

In an axiomatic way, a new form of information entropy has been introduced, using the information theory without probability given by J. Kampé de Fériet and B. Forte. We call this measure of information entropy general because it does not contain any probability or fuzzy measure; for this measure, a class of measures for the union of two disjoint crisp sets has been given.

This research was supported by research center CRITEVAT of “Sapienza” University of Roma and GNFM of MIUR (Italy).