Journal of Signal and Information Processing, 2013, 4, 120-124
doi:10.4236/jsip.2013.43B021 Published Online August 2013 (http://www.scirp.org/journal/jsip)
Discrete Entropic Uncertainty Relations Associated with
FRFT*
Guanlei Xu1, Xiaotong Wang2, Lijia Zhou1, Limin Shao1, Xiaogang Xu2
1Ocean Department of Dalian Naval Academy, Dalian 116018, China; 2Navigation Department of Dalian Naval Academy, Dalian 116018, China.
Email: xgl_86@163.com
Received March 2013.
ABSTRACT
Based on the definition and properties of the discrete fractional Fourier transform (DFRFT), we introduce the discrete Hausdorff-Young inequality. Furthermore, the discrete Shannon entropic uncertainty relation and the discrete Rényi entropic uncertainty relation are explored. The condition of equality is also derived via Lagrange optimization, which shows that if the two conjugate variables have constant amplitudes equal to the inverse of the square root of their numbers of non-zero elements, then the uncertainty relations reach their lower bounds. In addition, the resolution analysis via the uncertainty relations is discussed as well.
Keywords: Discrete Fractional Fourier Transform (DFRFT); Uncertainty Principle; Rényi Entropy; Shannon Entropy
1. Introduction
The uncertainty principle holds not only for analog signals but also for discrete signals [1,2]. Recently, with the development of the fractional Fourier transform (FRFT), analog generalized uncertainty relations associated with the FRFT have been carefully explored in papers such as [3,4,16], which effectively enrich the theory of the FRFT. However, up to now there has been no reported work covering the discrete generalized uncertainty relations associated with the discrete fractional Fourier transform (DFRFT). From the viewpoint of engineering applications, discrete data are widely used and are often more convenient than analog data. Hence, there is a clear need to explore discrete generalized uncertainty relations. The DFRFT is the discrete version of the FRFT [5,6] and is applied in practical engineering fields. In this article we discuss the entropic uncertainty relations [7,8] associated with the DFRFT and make the following contributions. The first contribution is that we extend the traditional Hausdorff-Young inequality to FRFT domains with finite supports; it is shown that the bounds are connected with the lengths of the supports and the FRFT parameters. The second contribution is that we derive the Shannon entropic uncertainty principle in FRFT domains for the discrete case, based on which we also derive, via Lagrange optimization, the conditions under which these uncertainty relations become equalities. The third contribution is that we derive the Rényi entropic uncertainty principle in FRFT domains for the discrete case. The final contribution is that we discuss the resolution in multiple FRFT domains as a consequence of the above derivations, including new proofs. In a word, there have been no reported papers covering these generalized discrete entropic uncertainty relations for the FRFT.
2. Preliminaries
2.1. FRFT and DFRFT
Before discussing the uncertainty principle, we introduce some relevant preliminaries. Here we first briefly review the definition of the FRFT. For a given analog signal f(t) \in L^1(R) \cap L^2(R) with \|f(t)\|_2 = 1, its FRFT [5,6] is defined as

F_\alpha(u) = F^\alpha[f(t)](u) = \int_{-\infty}^{+\infty} f(t) K_\alpha(t,u)\,dt
= \begin{cases} \sqrt{\frac{1-i\cot\alpha}{2\pi}}\, e^{\frac{iu^2\cot\alpha}{2}} \int_{-\infty}^{+\infty} e^{-\frac{iut}{\sin\alpha}}\, e^{\frac{it^2\cot\alpha}{2}} f(t)\,dt, & \alpha \neq n\pi \\ f(t), & \alpha = 2n\pi \\ f(-t), & \alpha = (2n\pm 1)\pi \end{cases}    (1)

where n \in Z, i is the complex unit, and \alpha is the transform parameter defined as in [5,6].
*This work was fully supported by the NSFC (61002052) and partly supported by the NSFC (61250006).
In addition, F^{-\alpha}[F^{\alpha}[f(t)]] = f(t), i.e., the inverse FRFT reads

f(t) = \int_{-\infty}^{+\infty} F_\alpha(u) K_{-\alpha}(u,t)\,du.

Let X = [x(1), x(2), x(3), \ldots, x(N)] \in C^N be a discrete time series with cardinality N and \|X\|_2 = 1. Assume its DFRFT under the transform parameter \alpha is \hat{X}_\alpha = [\hat{x}_\alpha(1), \hat{x}_\alpha(2), \hat{x}_\alpha(3), \ldots, \hat{x}_\alpha(N)] \in C^N. Then the DFRFT [5] can be written as

\hat{x}_\alpha(k) = \sum_{n=1}^{N} u_\alpha(k,n)\,x(n) = \sqrt{\frac{1-i\cot\alpha}{N}} \sum_{n=1}^{N} e^{\frac{i}{2}k^2\Delta u^2\cot\alpha}\, e^{-\frac{i2\pi kn}{N}}\, e^{\frac{i}{2}n^2\Delta t^2\cot\alpha}\, x(n), \quad 1 \le k, n \le N,    (2)

where \Delta t and \Delta u denote the sampling intervals in the time and FRFT domains with \Delta t\,\Delta u = 2\pi\sin\alpha/N. Also, we can rewrite the definition (2) in matrix form as \hat{X}_\alpha = U_\alpha X, where U_\alpha = (u_\alpha(k,n))_{N\times N}.
Clearly, the DFRFT has the following properties [5]: U_{\alpha+2k\pi} X = U_\alpha X (k \in Z), U_\alpha U_\beta X = U_{\alpha+\beta} X, U_\alpha U_{-\alpha} = U_{-\alpha} U_\alpha = I, and \|\hat{X}_\alpha\|_2 = \|U_\alpha X\|_2 = \|X\|_2 = 1.
In what follows, we will assume that the transform parameter satisfies 0 < \alpha < 2\pi and \alpha \neq \pi, so that \sin\alpha \neq 0. The main difference between the discrete and analog definitions is the support: one is finite and discrete, while the other is infinite and continuous.
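As a numerical illustration of these properties, the following Python sketch builds one convenient unitary realization of a discrete fractional Fourier matrix, namely a spectral fractional power of the orthonormal DFT matrix; this construction and the helper name dfrft_matrix are illustrative assumptions rather than the sampling-type kernel in (2), but the resulting matrices satisfy the unitarity, additivity and norm-preservation properties listed above.

import numpy as np
from scipy.linalg import schur

def dfrft_matrix(N, alpha):
    # Orthonormal DFT matrix.
    n = np.arange(N)
    F = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)
    # Spectral decomposition F = V T V^H (T is diagonal because F is unitary);
    # each unit-modulus eigenvalue exp(i*theta) is replaced by exp(i*theta*2*alpha/pi),
    # so alpha = pi/2 recovers the DFT itself.
    T, V = schur(F, output='complex')
    theta = np.angle(np.diag(T))
    return V @ np.diag(np.exp(1j * theta * 2 * alpha / np.pi)) @ V.conj().T

N, alpha, beta = 16, 0.7, 0.4
Ua, Ub = dfrft_matrix(N, alpha), dfrft_matrix(N, beta)
x = np.random.randn(N) + 1j * np.random.randn(N)
x /= np.linalg.norm(x)                                      # ||X||_2 = 1

print(np.allclose(Ua @ Ua.conj().T, np.eye(N)))             # unitarity
print(np.allclose(Ua @ Ub, dfrft_matrix(N, alpha + beta)))  # additivity U_a U_b = U_{a+b}
print(np.isclose(np.linalg.norm(Ua @ x), 1.0))              # norm preservation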
2.2. Shannon Entropy and Rényi Entropy
For any discrete random variable x_n (n = 1, \ldots, N) with probability density function p(x_n), the Shannon entropy [9] and the Rényi entropy [10] are defined, respectively, as

H(x) = -\sum_{n=1}^{N} p(x_n)\ln p(x_n),

H_\lambda(x) = \frac{1}{1-\lambda}\ln\sum_{n=1}^{N} p^{\lambda}(x_n), \quad \lambda > 0,\ \lambda \neq 1.

Hence, in this article, for any DFRFT \hat{X}_\alpha = [\hat{x}_\alpha(1), \hat{x}_\alpha(2), \hat{x}_\alpha(3), \ldots, \hat{x}_\alpha(N)] \in C^N, the Shannon entropy and the Rényi entropy associated with the DFRFT are defined, respectively, as

H(\hat{x}_\alpha) = -\sum_{n=1}^{N} |\hat{x}_\alpha(n)|^2 \ln|\hat{x}_\alpha(n)|^2,

H_\lambda(\hat{x}_\alpha) = \frac{1}{1-\lambda}\ln\sum_{n=1}^{N} |\hat{x}_\alpha(n)|^{2\lambda}.
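A small sketch of these two entropies, with the probabilities taken as the squared magnitudes |\hat{x}_\alpha(n)|^2 of a unit-energy vector, is given below; the helper names are illustrative only.

import numpy as np

def shannon_entropy(x_hat):
    p = np.abs(x_hat) ** 2
    p = p[p > 0]                          # 0*ln(0) is taken as 0
    return -np.sum(p * np.log(p))

def renyi_entropy(x_hat, lam):
    p = np.abs(x_hat) ** 2
    if np.isclose(lam, 1.0):              # order 1 is the Shannon limit
        return shannon_entropy(x_hat)
    return np.log(np.sum(p ** lam)) / (1.0 - lam)

x_hat = np.fft.fft(np.random.randn(32), norm="ortho")
x_hat /= np.linalg.norm(x_hat)            # unit energy, so |x_hat|^2 sums to 1
print(shannon_entropy(x_hat))
print(renyi_entropy(x_hat, 0.999))        # approaches the Shannon entropy as lam -> 1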
Clearly, as \lambda \to 1, H_\lambda(\hat{x}_\alpha) \to H(\hat{x}_\alpha).
2.3. Discrete Hausdorff-Young Inequality
Associated with DFRFT
Let X = [x(1), x(2), x(3), \ldots, x(N)] \in C^N be a discrete time series with cardinality N, and let \hat{X}_\alpha = [\hat{x}_\alpha(1), \hat{x}_\alpha(2), \hat{x}_\alpha(3), \ldots, \hat{x}_\alpha(N)] \in C^N be its DFRFT with the transform parameter \alpha, i.e., \hat{X}_\alpha = U_\alpha X. Since \|X\|_2 = 1, we have \|\hat{X}_\alpha\|_2 = \|U_\alpha X\|_2 = 1 from Parseval's theorem. Here \|X\|_2 = (\sum_{n=1}^{N}|x_n|^2)^{1/2}. Clearly, we can obtain

\|U_\alpha X\|_\infty \le M(U_\alpha)\|X\|_1, with M(U_\alpha) = \sup_{l,r}|u_\alpha(l,r)| = \sqrt{\frac{1}{N|\sin\alpha|}}.

Then, from Riesz's (Riesz-Thorin) theorem [11,12], we can obtain the discrete Hausdorff-Young inequality

\|U_\alpha X\|_q \le M(U_\alpha)^{\frac{2-p}{p}}\|X\|_p, with 1 \le p \le 2 and \frac{1}{p}+\frac{1}{q}=1.

The same inequality holds for the matrix U_{\alpha-\beta} = U_\alpha U_{-\beta}, i.e., for any Y \in C^N,

\|U_{\alpha-\beta} Y\|_q \le M(U_{\alpha-\beta})^{\frac{2-p}{p}}\|Y\|_p, with M(U_{\alpha-\beta}) = \sqrt{\frac{1}{N|\sin(\alpha-\beta)|}},

where the value of M(U_{\alpha-\beta}) follows from the properties of the DFRFT. Now let Y = U_\beta X; then X = U_{-\beta} Y and U_{\alpha-\beta} Y = U_{\alpha-\beta} U_\beta X = U_\alpha X. Since the value of X can be taken arbitrarily in C^N, Y can also be taken arbitrarily in C^N. Therefore, we can obtain the following lemma.
Lemma 1: For any given discrete time series X = [x(1), x(2), x(3), \ldots, x(N)] \in C^N with cardinality N and \|X\|_2 = 1, let U_\alpha (U_\beta) be the DFRFT transform matrix associated with the transform parameter \alpha (\beta, respectively). Then we can obtain the generalized discrete Hausdorff-Young inequality

\|U_\alpha X\|_q \le (N|\sin(\alpha-\beta)|)^{\frac{p-2}{2p}}\|U_\beta X\|_p

with 1 \le p \le 2 and \frac{1}{p}+\frac{1}{q}=1.
Clearly, this is the discrete version of the Hausdorff-Young inequality. In the next sections, we will use this lemma to prove the new uncertainty relations.
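Lemma 1 can also be checked numerically; the sketch below is an illustrative assumption-level example rather than part of the derivation, and uses the ordinary-DFT special case \beta = 0, \alpha = \pi/2, where U_\alpha is the orthonormal DFT matrix, U_\beta is the identity, and N|\sin(\alpha-\beta)| = N.

import numpy as np

N = 64
x = np.random.randn(N) + 1j * np.random.randn(N)
x /= np.linalg.norm(x)                    # ||U_beta X||_2 = ||X||_2 = 1
x_hat = np.fft.fft(x, norm="ortho")       # U_alpha X (orthonormal DFT)

for p in (1.0, 1.2, 1.5, 2.0):
    q = np.inf if p == 1.0 else p / (p - 1.0)            # 1/p + 1/q = 1
    lhs = np.linalg.norm(x_hat, ord=q)
    rhs = N ** ((p - 2) / (2 * p)) * np.linalg.norm(x, ord=p)
    print(p, lhs <= rhs + 1e-12)          # Lemma 1 with N|sin(alpha-beta)| = N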
3. The Uncertainty Relations
3.1. Shannon Entropic Principle
Theorem 1: For any given discrete time series X = [x(1), x(2), x(3), \ldots, x(N)] \in C^N with cardinality N and \|X\|_2 = 1, let \hat{x}_\alpha (\hat{x}_\beta) be the DFRFT series associated with the transform parameter \alpha (\beta, respectively), and let N_\alpha (N_\beta) count the non-zero elements of \hat{x}_\alpha (\hat{x}_\beta, respectively). Then we can obtain the generalized discrete Shannon entropic uncertainty relation

H(\hat{x}_\alpha) + H(\hat{x}_\beta) \ge \ln(N|\sin(\alpha-\beta)|),    (3)
where

H(\hat{x}_\alpha) = -\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^2\ln|\hat{x}_\alpha(n)|^2, \quad H(\hat{x}_\beta) = -\sum_{m=1}^{N}|\hat{x}_\beta(m)|^2\ln|\hat{x}_\beta(m)|^2,

which are Shannon entropies. The equality in (3) holds iff |\hat{x}_\alpha(n)| = \frac{1}{\sqrt{N_\alpha}} and |\hat{x}_\beta(m)| = \frac{1}{\sqrt{N_\beta}} on the respective supports.
Proof: From Lemma 1, we have

\frac{\left(\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{p}\right)^{\frac{1}{p}}}{(N|\sin(\alpha-\beta)|)^{\frac{2-p}{2p}}\left(\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{\frac{p}{p-1}}\right)^{\frac{p-1}{p}}} \ge 1.

Taking the natural logarithm of both sides of the above inequality, we obtain T(p) \ge 0, where

T(p) = \frac{p-2}{2p}\ln(N|\sin(\alpha-\beta)|) + \frac{1}{p}\ln\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{p} - \frac{p-1}{p}\ln\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{\frac{p}{p-1}}.

Since 1 \le p \le 2 and \|X\|_2 = 1, we know T(2) = 0, while T(p) \ge 0 for 1 \le p \le 2. Hence the left derivative satisfies T'(p) \le 0 at p = 2. Since

T'(p) = \frac{1}{p^2}\ln(N|\sin(\alpha-\beta)|) - \frac{1}{p^2}\ln\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{p} + \frac{1}{p}\,\frac{\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{p}\ln|\hat{x}_\beta(m)|}{\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{p}} - \frac{1}{p^2}\ln\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{\frac{p}{p-1}} + \frac{1}{p(p-1)}\,\frac{\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{\frac{p}{p-1}}\ln|\hat{x}_\alpha(n)|}{\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{\frac{p}{p-1}}},

evaluating T'(2) \le 0 with \|\hat{x}_\alpha\|_2 = \|\hat{x}_\beta\|_2 = 1 gives

T'(2) = \frac{1}{4}\left[\ln(N|\sin(\alpha-\beta)|) - H(\hat{x}_\beta) - H(\hat{x}_\alpha)\right] \le 0,

and we can obtain the final result in Theorem 1.
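A quick numerical check of (3), again an illustrative sketch in the DFT special case \alpha-\beta = \pi/2 (bound \ln N), is given below; a unit impulse attains the bound, which anticipates the equality analysis that follows.

import numpy as np

def shannon_entropy(v):
    p = np.abs(v) ** 2
    p = p[p > 0]
    return -np.sum(p * np.log(p))

N = 32
rng = np.random.default_rng(0)
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x /= np.linalg.norm(x)
impulse = np.zeros(N, dtype=complex)
impulse[3] = 1.0

for sig in (x, impulse):
    total = shannon_entropy(sig) + shannon_entropy(np.fft.fft(sig, norm="ortho"))
    print(total >= np.log(N) - 1e-9, total, np.log(N))   # impulse: total equals ln(N)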
Now consider when the equality holds. From Theorem 1, equality in (3) implies that H(\hat{x}_\alpha) + H(\hat{x}_\beta) reaches its minimum bound, which leads to the problem

Minimize H(\hat{x}_\alpha) + H(\hat{x}_\beta) subject to \|\hat{x}_\alpha\|_2^2 = \|\hat{x}_\beta\|_2^2 = 1,

i.e.,

Minimize -\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^2\ln|\hat{x}_\alpha(n)|^2 - \sum_{m=1}^{N}|\hat{x}_\beta(m)|^2\ln|\hat{x}_\beta(m)|^2

subject to \sum_{n=1}^{N}|\hat{x}_\alpha(n)|^2 = \sum_{n=1}^{N}|\hat{x}_\beta(n)|^2 = 1.

To solve this problem let us consider the following Lagrangian:

L = -\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^2\ln|\hat{x}_\alpha(n)|^2 - \sum_{m=1}^{N}|\hat{x}_\beta(m)|^2\ln|\hat{x}_\beta(m)|^2 + \lambda_1\left(\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^2 - 1\right) + \lambda_2\left(\sum_{n=1}^{N}|\hat{x}_\beta(n)|^2 - 1\right).

In order to simplify the computation, we set p_n = |\hat{x}_\alpha(n)|^2 and \tilde{p}_n = |\hat{x}_\beta(n)|^2. Hence we have

\frac{\partial L}{\partial p_n} = -\ln p_n - 1 + \lambda_1 = 0, \quad \frac{\partial L}{\partial \tilde{p}_n} = -\ln\tilde{p}_n - 1 + \lambda_2 = 0, \quad \sum_{n=1}^{N} p_n = 1, \quad \sum_{n=1}^{N}\tilde{p}_n = 1.

Solving the above equations over the non-zero elements, we finally obtain |\hat{x}_\alpha(n)| = \frac{1}{\sqrt{N_\alpha}} and |\hat{x}_\beta(n)| = \frac{1}{\sqrt{N_\beta}}. From the definition of Shannon entropy, we know that if H(\hat{x}_\alpha) = \ln N_\alpha and H(\hat{x}_\beta) = \ln N_\beta, the equality in (3) holds. In addition, we also have N_\alpha N_\beta \ge N|\sin(\alpha-\beta)|. From the proof we know that the conditions |\hat{x}_\alpha(n)| = \frac{1}{\sqrt{N_\alpha}} and |\hat{x}_\beta(m)| = \frac{1}{\sqrt{N_\beta}} mean that \hat{x}_\alpha(n) and \hat{x}_\beta(m) can be complex; only their amplitudes need to be constant for the equality to hold. Now we can obtain the following corollary from the above analysis.
Corollary 1: For any given discrete time series X = [x(1), x(2), x(3), \ldots, x(N)] \in C^N with cardinality N and \|X\|_2 = 1, let \hat{x}_\alpha (\hat{x}_\beta) be the DFRFT series associated with the transform parameter \alpha (\beta, respectively), and let N_\alpha (N_\beta) count the non-zero elements of \hat{x}_\alpha (\hat{x}_\beta, respectively). If |\hat{x}_\alpha(n)| = \frac{1}{\sqrt{N_\alpha}} and |\hat{x}_\beta(n)| = \frac{1}{\sqrt{N_\beta}}, then we have

N_\alpha N_\beta \ge N|\sin(\alpha-\beta)|

and

H(\hat{x}_\alpha) + H(\hat{x}_\beta) \ge \ln(N|\sin(\alpha-\beta)|).
3.2. Rényi Entropic Principle
Theorem 2: For any given discrete time series X = [x(1), x(2), x(3), \ldots, x(N)] \in C^N with cardinality N and \|X\|_2 = 1, let \hat{x}_\alpha (\hat{x}_\beta) be the DFRFT series associated with the transform parameter \alpha (\beta, respectively), and let N_\alpha (N_\beta) count the non-zero elements of \hat{x}_\alpha (\hat{x}_\beta, respectively). Then we can obtain the generalized discrete Rényi entropic uncertainty relation

H_\lambda(\hat{x}_\alpha) + H_\mu(\hat{x}_\beta) \ge \ln(N|\sin(\alpha-\beta)|), with \frac{1}{2} \le \mu \le 1 and \frac{1}{\lambda} + \frac{1}{\mu} = 2,    (4)

where

H_\lambda(\hat{x}_\alpha) = \frac{1}{1-\lambda}\ln\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{2\lambda}, \quad H_\mu(\hat{x}_\beta) = \frac{1}{1-\mu}\ln\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{2\mu},

which are Rényi entropies.
Proof: In Lemma 1, set q = 2\lambda and p = 2\mu; then \frac{1}{2} \le \mu \le 1 and \frac{1}{\lambda} + \frac{1}{\mu} = 2. From Lemma 1, we obtain

\left(\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{2\lambda}\right)^{\frac{1}{2\lambda}} \le (N|\sin(\alpha-\beta)|)^{\frac{\mu-1}{2\mu}}\left(\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{2\mu}\right)^{\frac{1}{2\mu}}.

Taking the square of the above inequality, we have

\left(\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{2\lambda}\right)^{\frac{1}{\lambda}} \le (N|\sin(\alpha-\beta)|)^{\frac{\mu-1}{\mu}}\left(\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{2\mu}\right)^{\frac{1}{\mu}}.

Raising both sides to the power \frac{\lambda}{\lambda-1} > 0 and using \frac{1}{\lambda} + \frac{1}{\mu} = 2 (so that \frac{\mu-1}{\mu}\cdot\frac{\lambda}{\lambda-1} = -1 and \frac{1}{\mu}\cdot\frac{\lambda}{\lambda-1} = \frac{1}{1-\mu}), we obtain

\left(\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{2\lambda}\right)^{\frac{1}{\lambda-1}} \le (N|\sin(\alpha-\beta)|)^{-1}\left(\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{2\mu}\right)^{\frac{1}{1-\mu}},

i.e.,

\frac{1}{N|\sin(\alpha-\beta)|}\cdot\frac{\left(\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{2\mu}\right)^{\frac{1}{1-\mu}}}{\left(\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{2\lambda}\right)^{\frac{1}{\lambda-1}}} \ge 1.    (5)

Taking the natural logarithm of both sides of (5), we can obtain

\frac{1}{1-\lambda}\ln\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{2\lambda} + \frac{1}{1-\mu}\ln\sum_{m=1}^{N}|\hat{x}_\beta(m)|^{2\mu} \ge \ln(N|\sin(\alpha-\beta)|).
Clearly, as \lambda \to 1 and \mu \to 1, the Rényi entropies reduce to Shannon entropies, and thus the Rényi entropic uncertainty relation (4) reduces to the Shannon entropic uncertainty relation (3). Hence the discussion of equality in Theorem 2 follows directly from the proof of Theorem 1. Note that although the Shannon entropic uncertainty relation can be obtained from the Rényi entropic uncertainty relation, we still discuss them separately for the sake of completeness.
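As with Theorem 1, relation (4) can be checked numerically in the DFT special case (\alpha - \beta = \pi/2, bound \ln N); the sketch below is illustrative only, with the conjugate Rényi orders chosen so that \frac{1}{\lambda} + \frac{1}{\mu} = 2.

import numpy as np

def renyi_entropy(v, order):
    p = np.abs(v) ** 2
    if np.isclose(order, 1.0):
        p = p[p > 0]
        return -np.sum(p * np.log(p))     # Shannon limit
    return np.log(np.sum(p ** order)) / (1.0 - order)

N = 64
x = np.random.randn(N) + 1j * np.random.randn(N)
x /= np.linalg.norm(x)
x_hat = np.fft.fft(x, norm="ortho")

for mu in (0.6, 0.75, 0.9, 0.999):
    lam = mu / (2 * mu - 1)               # enforces 1/lam + 1/mu = 2
    total = renyi_entropy(x_hat, lam) + renyi_entropy(x, mu)
    print(mu, total >= np.log(N) - 1e-9)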
4. Resolution Analysis in Time-Frequency
Domain
In many cases [13-15], we often discuss the data concentration in both the time domain and the frequency domain. Therefore, we adopt a new entropy measure

H_p = p\,H(\hat{x}_\alpha) + (1-p)\,H(\hat{x}_\beta),

where \hat{x}_\alpha (\hat{x}_\beta) is the DFRFT series associated with the transform parameter \alpha (\beta, respectively) for the time series X = [x(1), x(2), x(3), \ldots, x(N)], and H(\hat{x}_\alpha) and H(\hat{x}_\beta) are the Shannon entropies of the DFRFT series \hat{x}_\alpha and \hat{x}_\beta.
Theorem 3: For any given discrete time series X = [x(1), x(2), x(3), \ldots, x(N)] \in C^N with cardinality N and \|X\|_2 = 1, let \hat{x}_\alpha (\hat{x}_\beta) be the DFRFT series associated with the transform parameter \alpha (\beta, respectively), and let N_\alpha (N_\beta) count the non-zero elements of \hat{x}_\alpha (\hat{x}_\beta, respectively). Then we can obtain the generalized discrete entropic uncertainty relation

H_{1/2} \ge \frac{1}{2}\ln(N|\sin(\alpha-\beta)|).    (6)
Proof: For the variable \hat{x}_\alpha we have, for p > 0,

\|\hat{x}_\alpha\|_p = \left(\sum_{n=1}^{N}|\hat{x}_\alpha(n)|^{p}\right)^{\frac{1}{p}},

and we set \|\hat{x}_\alpha\|_\infty = \max_{1\le n\le N}|\hat{x}_\alpha(n)|. Therefore, with w_n = |\hat{x}_\alpha(n)|^{\frac{1}{q}}/\|\hat{x}_\alpha\|_{\frac{1}{q}}^{\frac{1}{q}}, we have

\frac{d}{dq}\ln\|\hat{x}_\alpha\|_{\frac{1}{q}} = -\sum_{n=1}^{N} w_n\ln w_n,    (7)

\frac{d^2}{dq^2}\ln\|\hat{x}_\alpha\|_{\frac{1}{q}} = \frac{1}{q}\left[\sum_{n=1}^{N} w_n\ln^2 w_n - \left(\sum_{n=1}^{N} w_n\ln w_n\right)^2\right] \ge 0.

Taking into account the non-negative second derivative, we know that \ln\|\hat{x}_\alpha\|_{\frac{1}{q}} is convex in q. Moreover, since \|\hat{x}_\alpha\|_2 = 1, we have \ln\|\hat{x}_\alpha\|_{\frac{1}{q}} = 0 at q = \frac{1}{2}, where the weights w_n reduce to |\hat{x}_\alpha(n)|^2, so that

H(\hat{x}_\alpha) = \left.\frac{d}{dq}\ln\|\hat{x}_\alpha\|_{\frac{1}{q}}\right|_{q=\frac{1}{2}} = \lim_{q\to\frac{1}{2}}\frac{\ln\|\hat{x}_\alpha\|_{\frac{1}{q}}}{q-\frac{1}{2}},    (8)

and similarly for \hat{x}_\beta. Since \|\hat{x}_\alpha\|_2 = \|\hat{x}_\beta\|_2 = \|X\|_2 = 1 and \|\hat{x}_\alpha\|_\infty \le \sqrt{\frac{1}{N|\sin(\alpha-\beta)|}}\,\|\hat{x}_\beta\|_1, the Riesz-Thorin theorem [4] gives

\|\hat{x}_\alpha\|_{\frac{1}{1-q}} \le (N|\sin(\alpha-\beta)|)^{\frac{1}{2}-q}\,\|\hat{x}_\beta\|_{\frac{1}{q}}, \quad \frac{1}{2} \le q \le 1.

By applying the negative logarithm to both sides of the above inequality, we obtain the following relation

-\ln\|\hat{x}_\alpha\|_{\frac{1}{1-q}} \ge -\ln\|\hat{x}_\beta\|_{\frac{1}{q}} + \left(q-\frac{1}{2}\right)\ln(N|\sin(\alpha-\beta)|), \quad \frac{1}{2} \le q \le 1.

Taking into account \|\hat{x}_\alpha\|_2 = \|\hat{x}_\beta\|_2 = \|X\|_2 = 1, both sides of the above inequality are equal to zero for q = \frac{1}{2}. Dividing both sides by q - \frac{1}{2} and letting q \to \frac{1}{2}, (7) and (8) imply

\frac{1}{2}H(\hat{x}_\alpha) + \frac{1}{2}H(\hat{x}_\beta) \ge \frac{1}{2}\ln(N|\sin(\alpha-\beta)|).
Thus the proof is completed. Note that this proof could also be obtained in a manner similar to that of Section 3.1; however, we give a new proof here so that the result can be understood from a different aspect. On the other hand, the fact that different proofs yield the same result validates the conclusion derived here.
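The averaged measure in Theorem 3 can also be checked numerically in the DFT special case (bound \frac{1}{2}\ln N); the following sketch is illustrative only.

import numpy as np

def shannon_entropy(v):
    p = np.abs(v) ** 2
    p = p[p > 0]
    return -np.sum(p * np.log(p))

N = 128
rng = np.random.default_rng(1)
for trial in range(5):
    x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
    x /= np.linalg.norm(x)
    x_hat = np.fft.fft(x, norm="ortho")
    H_half = 0.5 * shannon_entropy(x) + 0.5 * shannon_entropy(x_hat)
    print(H_half >= 0.5 * np.log(N) - 1e-9)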
5. Conclusions
In this article, we extended the entropic uncertainty relations to DFRFT domains. We first introduced the generalized discrete Hausdorff-Young inequality. Based on this inequality, we derived the discrete Shannon entropic uncertainty relation and the discrete Rényi entropic uncertainty relation. Interestingly, when the amplitudes of the two conjugate variables are constant, i.e., equal to the inverse of the square root of the respective numbers of non-zero elements, the equality holds in the uncertainty relation. In addition, the product of the two numbers of non-zero elements is no less than N|\sin(\alpha-\beta)|, i.e., N_\alpha N_\beta \ge N|\sin(\alpha-\beta)|.
REFERENCES
[1] R. Ishii and K. Furukawa, “The Uncertainty Principle in
Discrete Signals,” IEEE Trans Circuits and Systems, Vol.
33, No. 10, 1986, pp. 1032-1034.
[2] L. C. Calvez and P. Vilbe, “On the Uncertainty Principle
in Discrete Signals,” IEEE Trans Circuits and Systems-II:
Analog and Digital Signal Processing, Vol. 39, No. 6,
1992, pp. 394-395. doi:10.1109/82.145299
[3] S. Shinde and V. M. Gadre, “An Uncertainty Principle for Real Signals in the Fractional Fourier Transform Domain,” IEEE Transactions Signal Processing, Vol. 49, No. 11, 2001, pp. 2545-2548. doi:10.1109/78.960402
[4] G. L. Xu, X. T. Wang and X. G. Xu, “Generalized En-
tropic Uncertainty Principle on Fractional Fourier trans-
form,” Signal Processing, Vol. 89, No. 12, 2009, pp.
2692-2697. doi:10.1016/j.sigpro.2009.05.014
[5] R. Tao, B. Deng and Y. Wang, “Theory and Application of the Fractional Fourier Transform,” Tsinghua University Press, Beijing, 2009.
[6] S. C. Pei and J. J. Ding, “Eigenfunctions of Fourier and Fractional Fourier Transforms With Complex Offsets and Parameters,” IEEE Trans Circuits and Systems-I: Regular Papers, Vol. 54, No. 7, 2007, pp. 1599-1611.
[7] T. M. Cover and J. A. Thomas, “Elements of Information Theory,” 2nd Edition, John Wiley & Sons, Inc., 2006.
[8] H. Maassen, “A Discrete Entropic Uncertainty Relation,”
Quantum Probability and Applications, V,
Springer-Verlag, New York, 1988, pp. 263-266.
[9] C. E. Shannon, “A Mathematical Theory of Communication,” The Bell System Technical Journal, Vol. 27, 1948, pp. 379-423, 623-656.
[10] A. Rényi, “On Measures of Information and Entropy,” Proceedings of the Fourth Berkeley Symposium on Mathematics, Statistics and Probability, 1960, p. 547.
[11] G. Hardy, J. E. Littlewood and G. Pólya, “Inequalities,” 2nd Edition, Cambridge University Press, 1951.
[12] A. Dembo, T. M. Cover and J. A. Thomas, “Information Theoretic Inequalities,” IEEE Trans Information Theory, Vol. 37, No. 6, 1991, pp. 1501-1518.
[13] D. Victor, O. Murad and P. Tomasz, “Resolution in
Time–Frequency,” IEEE Transactions on Signal Proc-
essing, Vol. 47, No. 3, 1999, pp. 783-788.
doi:10.1109/78.747783
[14] T. Przebinda, V. DeBrunner and M. Özaydın, “The Optimal Transform for the Discrete Hirschman Uncertainty Principle,” IEEE Transactions on Information Theory, Vol. 47, No. 5, 2001, pp. 2086-2090. doi:10.1109/18.930948
[15] X. D. Zhang, “Modern Signal Processing,” 2nd Edition, Tsinghua University Press, Beijing, 2002.
[16] G. L. Xu, X. T. Wang and X. G. Xu, “The Logarithmic,
Heisenberg’s and Short-time Uncertainty Principles As-
sociated with Fractional Fourier Transform,” Signal Proc-
essing, Vol. 89, No. 3, 2009, pp. 339-343.
doi:10.1016/j.sigpro.2008.09.002