
In 1961, Rényi introduced a pioneering generalization of the traditional notion of entropy, known as the α-entropies. These functionals share the major properties of Shannon's entropy and can, moreover, be easily estimated using a kernel estimate, which makes them greatly appealing to many researchers in the computer vision community. In this paper, an efficient and fast entropic method for noisy cell image segmentation is presented. The method utilizes the generalized α-entropy to measure the maximum structural information of an image and to locate the optimal threshold required for segmentation. To speed up the proposed method, computations are carried out on the 1D histogram of the image. Experimental results show that the proposed method is efficient and much more tolerant to noise than other state-of-the-art segmentation techniques.

Intuitively, image segmentation is the process of dividing an image into different regions such that each region is homogeneous while the union of any two adjacent regions is not. An additional requirement would be that these regions correspond to real homogeneous regions belonging to objects in the scene [

Due to the increasing number of medical images, taking advantage of computers to facilitate processing and analyzing this huge number of images has become indispensable. In particular, algorithms for the delineation of anatomical structures and other regions of interest are a key component in assisting and automating specific radiological tasks. These algorithms, called image segmentation algorithms, play a fundamental role in many medical imaging applications such as the quantification of tissue volumes [3,4], diagnosis [

There is currently no single segmentation technique that gives satisfactory results for each medical image.

Since the pioneering work of Shannon [16,17] in 1948, entropy has appeared as an appealing tool in many areas of data processing. In 1961, Rényi [

The outline of this paper is as follows. The next section discusses the generalized form of the α-entropies, especially the generalized Rényi entropy. The proposed entropic segmentation method is explained in Section 3. Section 4 presents the experimental results that validate the use of the proposed method. Advantages of our method and concluding remarks are outlined in Section 5.

Entropy first appeared in thermodynamics as a concept intimately related to the internal energy of a system. It has since been applied across physics, information theory, mathematics, and other branches of science and engineering [

Suppose $P = (p_1, p_2, \ldots, p_n)$ is a finite discrete probability distribution, that is, a set of numbers satisfying the conditions

$$p_k \geq 0, \quad k = 1, 2, \ldots, n, \qquad \text{and} \qquad \sum_{k=1}^{n} p_k = 1.$$

The amount of uncertainty of the distribution $P$ is called the entropy of the distribution. The Shannon entropy of the distribution $P$, a measure of uncertainty denoted by $H(P)$, is defined as

It should be noted that the Shannon entropy given by Equation (1) is additive, i.e., it satisfies the following relation:

$$H(P * Q) = H(P) + H(Q) \qquad (2)$$

for any two independent distributions $P$ and $Q$, where $P * Q$ denotes their joint (product) distribution. Equation (2) states one of the most important properties of entropy, namely its additivity: the entropy of a combined experiment consisting of the performance of two independent experiments is equal to the sum of the entropies of these two experiments. The formalism defined by Equation (1) has been shown to be restricted to Boltzmann-Gibbs-Shannon (BGS) statistics. For nonextensive systems, however, some kind of extension becomes necessary. Rényi entropy, which is useful for describing nonextensive systems, is defined as

$$H_{\alpha}(P) = \frac{1}{1-\alpha} \log \sum_{k=1}^{n} p_k^{\alpha}, \qquad (3)$$

(Figure: entropic segmentation for a noisy mammography image.)

where $\alpha > 0$ and $\alpha \neq 1$. The real number $\alpha$ is called the entropic order and characterizes the degree of nonextensivity. This expression reduces to the Shannon entropy in the limit $\alpha \to 1$. We shall see that, in order to obtain a fine characterization of Rényi entropy, it is advantageous to extend the notion of a probability distribution and define entropy for generalized distributions. The characterization of measures of entropy (and information) becomes much simpler if we consider these quantities as defined on the set of generalized probability distributions.
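To make the limit concrete, the following minimal Python sketch (function names are ours) computes the Rényi entropy defined above and checks numerically that it approaches the Shannon entropy as the order tends to 1:

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a complete distribution (natural log)."""
    return -sum(pk * math.log(pk) for pk in p if pk > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1)."""
    return math.log(sum(pk ** alpha for pk in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
# As alpha -> 1, the Rényi entropy approaches the Shannon entropy:
print(renyi_entropy(p, 0.5), renyi_entropy(p, 1.0001), shannon_entropy(p))
```

The choice of natural logarithm only changes the entropy unit (nats instead of bits) and does not affect where the maximum lies.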

Suppose $(\Omega, \mathcal{A}, P)$ is a probability space, that is, $\Omega$ an arbitrary nonempty set, called the set of elementary events, $\mathcal{A}$ a σ-algebra of subsets of $\Omega$, and $P$ a probability measure, that is, a nonnegative and additive set function for which $P(\Omega) = 1$. Let us call a function $\xi(\omega)$ which is defined for $\omega \in \Omega_1$, where $\Omega_1 \in \mathcal{A}$ and $P(\Omega_1) > 0$, a generalized random variable. If $P(\Omega_1) = 1$ we call $\xi$ an ordinary (or complete) random variable, while if

$$0 < P(\Omega_1) < 1,$$

we call $\xi$ an incomplete random variable. Evidently, an incomplete random variable can be interpreted as a quantity describing the result of an experiment depending on chance which is not always observable, but only with probability $P(\Omega_1)$. The distribution of a generalized random variable is called a generalized probability distribution. Thus, a finite discrete generalized probability distribution is simply a sequence $P = (p_1, p_2, \ldots, p_n)$ of nonnegative numbers, obtained by setting $p_k = P(\xi = x_k)$ and taking

$$W(P) = \sum_{k=1}^{n} p_k, \qquad (4)$$

where $W(P)$ is the weight of the distribution and $0 < W(P) \leq 1$. A distribution that has a weight less than 1 will be called an incomplete distribution. Now, using Equation (3) and Equation (4), the Rényi entropy for a generalized distribution $P$ can be written as

$$H_{\alpha}(P) = \frac{1}{1-\alpha} \log \frac{\sum_{k=1}^{n} p_k^{\alpha}}{\sum_{k=1}^{n} p_k}. \qquad (5)$$

Note that the Rényi entropy has a nonextensive property for statistically independent systems, defined by the following pseudo-additivity entropic formula:

$$H_{\alpha}(A + B) = H_{\alpha}(A) + H_{\alpha}(B) + (1 - \alpha)\, H_{\alpha}(A)\, H_{\alpha}(B). \qquad (6)$$

Image segmentation is considered one of the holy-grail challenges of computer vision, especially for noisy images, and it has consequently received considerable attention from researchers in the computer vision community. Many approaches to image segmentation exist; however, these approaches are still inadequate. In this work, we propose an entropic method that achieves the task of segmentation in a novel way. The method not only overcomes image noise but also uses time and memory optimally. It does so by using the Rényi entropy of generalized distributions to measure the structural information of the image and then locate the optimal threshold, relying on the postulate that the optimal threshold corresponds to the segmentation with maximum structure (i.e., maximum information content of the distribution). The implementation steps of the proposed segmentation method are shown in the block diagram of

Preprocessing ultimately aims at improving the image in ways that increase the chances of success of the subsequent processing steps [17,23]. In this step, we apply a Gaussian filter to the input image prior to any other processing in order to reduce the amount of noise in the image.
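As an illustrative sketch of this preprocessing step (function names are ours), a separable Gaussian blur can be implemented with plain NumPy, filtering rows and then columns with a normalized 1D kernel:

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """1D Gaussian kernel, normalized to sum to 1."""
    if radius is None:
        radius = int(3 * sigma)  # +/- 3 sigma captures ~99.7% of the mass
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def gaussian_smooth(image, sigma=1.0):
    """Separable Gaussian blur: filter rows, then columns."""
    k = gaussian_kernel(sigma)
    pad = len(k) // 2
    padded = np.pad(image.astype(float), pad, mode='reflect')
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, padded)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, rows)
    return out[pad:-pad, pad:-pad]

noise = np.random.default_rng(0).standard_normal((32, 32))
smoothed = gaussian_smooth(noise, sigma=1.0)  # noise variance is reduced
```

A production pipeline would use an optimized filter from an image-processing library; this version only shows the operation itself.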

Suppose $p = (p_1, p_2, \ldots, p_L)$ is the gray-level probability distribution of the image, where $L$ is the number of gray levels. At a threshold $t$, this distribution is divided into two sub-distributions, one for the foreground (class $f$) and the other for the background (class $b$), given by $(p_1, p_2, \ldots, p_t)$ and $(p_{t+1}, p_{t+2}, \ldots, p_L)$, respectively. Thus, the generalized Rényi entropies of the two distributions as functions of $t$ are given as

$$H_{\alpha}^{f}(t) = \frac{1}{1-\alpha} \log \frac{\sum_{i=1}^{t} p_i^{\alpha}}{\sum_{i=1}^{t} p_i}, \qquad H_{\alpha}^{b}(t) = \frac{1}{1-\alpha} \log \frac{\sum_{i=t+1}^{L} p_i^{\alpha}}{\sum_{i=t+1}^{L} p_i}. \qquad (7)$$

Thresholding is the technique most often used to distinguish objects from the background. In this step, the input image is converted by thresholding into a binary image so that the objects in the input image can easily be separated from the background. To get the desired optimum threshold value $t^{*}$, we have to maximize the total entropy $H_{\alpha}^{f}(t) + H_{\alpha}^{b}(t)$; the value of $t$ that maximizes this function is believed to be the optimum threshold value [
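A minimal sketch of this maximization over a 1D histogram follows (names are ours). Note one assumption: here each class distribution is normalized by its weight before its Rényi entropy is computed, a common convention in entropic thresholding; the paper's generalized-distribution weighting is closely related but may differ slightly:

```python
import numpy as np

def renyi_threshold(hist, alpha=0.7):
    """Return the split t* maximizing the summed Rényi entropies of the
    foreground [0, t) and background [t, L) classes of a 1D histogram.
    alpha is the tunable entropic order (alpha > 0, alpha != 1)."""
    p = hist.astype(float) / hist.sum()
    best_t, best_h = None, -np.inf
    for t in range(1, len(p)):
        w_f, w_b = p[:t].sum(), p[t:].sum()
        if w_f == 0 or w_b == 0:
            continue  # a valid split needs probability mass on both sides
        h_f = np.log(((p[:t] / w_f) ** alpha).sum()) / (1.0 - alpha)
        h_b = np.log(((p[t:] / w_b) ** alpha).sum()) / (1.0 - alpha)
        if h_f + h_b > best_h:
            best_h, best_t = h_f + h_b, t
    return best_t

# A bimodal toy histogram: modes around bins 2 and 11.
hist = np.array([0, 5, 20, 5, 0, 0, 0, 0, 0, 0, 5, 20, 5, 0, 0, 0])
t_star = renyi_threshold(hist, alpha=0.7)  # falls in the valley between modes
```

Because the criterion only scans the 1D histogram, its cost is O(L^2) at worst for L gray levels (L = 256 for 8-bit images), which is what makes the method fast.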

In image processing, dilation, erosion, closing, and opening are well-known morphological operations. In this step we aim to improve the result of the previous thresholding step. Owing to inconsistency in the color of objects, the resulting binary image may include some holes. By applying the morphological closing operation, we can remove these holes from the binary image. Furthermore, an opening operation with a small structuring element can be used to separate objects that are still connected by a small number of pixels [25,26].
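These operations can be sketched in a few lines of NumPy with a 3x3 structuring element (names and border conventions are ours; a real pipeline would use a morphology library):

```python
import numpy as np

def _shifted(img, pad_value):
    """Yield the nine 3x3-neighborhood views of img, padded with pad_value."""
    p = np.pad(img, 1, mode='constant', constant_values=pad_value)
    h, w = img.shape
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            yield p[dy:dy + h, dx:dx + w]

def dilate(img):
    """Binary dilation: a pixel is set if any 3x3 neighbor is set."""
    out = np.zeros_like(img)
    for s in _shifted(img, 0):
        out |= s
    return out

def erode(img):
    """Binary erosion: a pixel survives only if all 3x3 neighbors are set
    (the border is treated as background)."""
    out = np.ones_like(img)
    for s in _shifted(img, 0):
        out &= s
    return out

def closing(img):
    """Dilation then erosion: fills small holes inside objects."""
    return erode(dilate(img))

def opening(img):
    """Erosion then dilation: removes specks and thin bridges."""
    return dilate(erode(img))
```

Closing a square with a one-pixel hole restores the full square, while opening deletes an isolated one-pixel speck, matching the roles described above.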

In this step, we attempt to remove the overlap between objects that may have arisen from extensively applying the previous morphological operations. To perform this, we first compute the Euclidean Distance Transform (EDT) of the binary image and then apply the well-known watershed algorithm [27,28] to the resulting EDT image. The EDT converts the binary image into one in which each pixel has a value equal to its distance to the nearest background pixel, measured in the Euclidean metric. The peaks of the distance transform are assumed to lie at the centers of the objects, so the overlapping objects can then be easily separated.
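The EDT-and-peaks idea can be illustrated with a deliberately naive O(n^2) reference implementation (ours for illustration; real pipelines use a linear-time EDT and a watershed routine, which we do not reproduce here):

```python
import numpy as np

def brute_force_edt(binary):
    """Euclidean distance of each foreground pixel to the nearest
    background pixel. Quadratic-time reference version for small images."""
    bg = np.argwhere(binary == 0)
    dist = np.zeros(binary.shape, float)
    for y, x in np.argwhere(binary == 1):
        d2 = ((bg - (y, x)) ** 2).sum(axis=1)
        dist[y, x] = np.sqrt(d2.min())
    return dist

# A 5x5 solid square: the EDT peaks at the square's center, which is
# exactly the marker the watershed step would grow regions from.
binary = np.zeros((7, 7), np.uint8)
binary[1:6, 1:6] = 1
d = brute_force_edt(binary)
```

For two overlapping blobs, each blob contributes its own peak, and flooding the negated EDT from those peaks is what splits the merged region into separate objects.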

This step helps remove incorrect objects according to object size. Object sizes are measured relative to the total size of the image. Any tiny noise object whose size is less than a predefined minimum threshold is discarded, and any object whose size is greater than the maximum threshold is removed as well. Note that the size thresholds used here are often application dependent, and so they are treated as user-defined inputs.
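A sketch of this filter (names, connectivity choice, and the fractional thresholds are ours): label 4-connected components, then keep only those whose area fraction lies inside the user-defined band:

```python
from collections import deque
import numpy as np

def label_components(binary):
    """4-connected component labeling via breadth-first search."""
    labels = np.zeros(binary.shape, int)
    count = 0
    for y, x in np.argwhere(binary == 1):
        if labels[y, x]:
            continue  # already assigned to a component
        count += 1
        labels[y, x] = count
        q = deque([(y, x)])
        while q:
            cy, cx = q.popleft()
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                        and binary[ny, nx] == 1 and not labels[ny, nx]):
                    labels[ny, nx] = count
                    q.append((ny, nx))
    return labels, count

def filter_by_size(binary, min_frac, max_frac):
    """Keep only objects whose area fraction of the whole image lies in
    [min_frac, max_frac]; both thresholds are user-defined, as in the text."""
    labels, n = label_components(binary)
    out = np.zeros_like(binary)
    for lab in range(1, n + 1):
        frac = (labels == lab).sum() / binary.size
        if min_frac <= frac <= max_frac:
            out[labels == lab] = 1
    return out
```

For example, with `min_frac=0.05` and `max_frac=0.3` on a 10x10 image, a 1-pixel speck (1%) and a 40-pixel block (40%) are both dropped while a 9-pixel object (9%) survives.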

In this section, the results of the proposed approach are presented. To investigate the proposed approach to image segmentation, we began with different image histograms, each describing the "objects" and the "background". Additionally, to verify the benefit of using the generalized Rényi entropy, we also tried another formula of entropy, the Tsallis entropy, which is given by

$$S_q(P) = \frac{1 - \sum_{i=1}^{n} p_i^{q}}{q - 1}. \qquad (8)$$
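For complete distributions, the two families being compared are monotonically related by $H_{\alpha} = \log\!\big(1 + (1-q)\,S_q\big)/(1-q)$ (with $\alpha = q$), and both reduce to the Shannon entropy as the order tends to 1. The sketch below (helper names are ours) checks both facts numerically:

```python
import math

def shannon(p):
    """Shannon entropy (natural log)."""
    return -sum(pk * math.log(pk) for pk in p if pk > 0)

def renyi(p, a):
    """Rényi entropy of order a."""
    return math.log(sum(pk ** a for pk in p)) / (1.0 - a)

def tsallis(p, q):
    """Tsallis entropy S_q(P) = (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - sum(pk ** q for pk in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
q = 0.7
# q -> 1 limit recovers Shannon entropy:
print(abs(tsallis(p, 1.0001) - shannon(p)))
# Monotone relation between the two families at the same order:
print(abs(renyi(p, q) - math.log(1 + (1 - q) * tsallis(p, q)) / (1 - q)))
```

Because the relation is monotone, any difference in segmentation behavior between the two entropies comes from how each one weights the incomplete class distributions, not from ranking complete distributions differently.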

The segmentation results have testified to the higher efficiency of our entropic segmentation approach, especially when the generalized Rényi entropy is used.

In

In

In this paper, we introduced a new method for cell image segmentation based on generalized α-entropy. The proposed method achieves the task of segmentation in a novel way. It has been shown to provide good results in most cases and to perform well when applied to noisy cell images. The experimental results show that using the generalized Rényi formalism of entropy is more viable than using its Tsallis counterpart in segmenting cell images. The chief advantages of the method are its high speed and its tolerance to image noise.