
IR-UWB radar can avoid these problems. Unfortunately, research on gesture recognition using IR-UWB radar has so far been rare. Given the features that IR-UWB technology offers, however, gesture recognition with IR-UWB radar is very likely to be achievable. In this paper, a gesture recognition algorithm using an IR-UWB radar sensor is presented. The algorithm mainly observes and analyzes the moving direction, the moving distance, and the frontal surface area of the hand towards the IR-UWB radar sensor. The algorithm can recognize 6 different hand gestures and, according to the experimental results, performs quite well.
In this paper, we describe the signal model in Section 2. In Section 3, we introduce our gesture recognition
algorithm. The experimental results are analyzed in Section 4. Finally, we conclude the paper.
2. Signal Model
The signal received from the IR-UWB radar sensor can be represented as a summation of the multipath signals,
target signals, and additive white Gaussian noise (AWGN) as follows [1]:
r(t) = \sum_{n=0}^{N_f - 1} a_n s(t - \tau_n) + n(t). (1)
In Equation (1), a_n and \tau_n represent the amplitude and the time delay of the nth received signal, respectively, n(t) is the white Gaussian noise of the channel, s(t) is the transmitted template signal, and N_f is the number of signal paths.
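As a minimal illustration of Equation (1), the Python sketch below synthesizes one received frame. The Gaussian-monocycle template, the amplitudes, the delays, and the noise level are all assumed values for illustration and are not taken from our measurement setup.

import numpy as np

def template_pulse(t, sigma=0.5e-9):
    # Gaussian monocycle used here as an assumed transmitted template s(t).
    return (t / sigma) * np.exp(-(t / sigma) ** 2)

def received_signal(t, amplitudes, delays, noise_std=0.01):
    # r(t) = sum_n a_n * s(t - tau_n) + n(t), as in Equation (1).
    r = np.zeros_like(t)
    for a_n, tau_n in zip(amplitudes, delays):
        r += a_n * template_pulse(t - tau_n)
    r += np.random.normal(0.0, noise_std, size=t.shape)  # AWGN term n(t)
    return r

# Example: N_f = 2 signal paths sampled over a 20 ns fast-time window.
t = np.linspace(0.0, 20e-9, 512)
r = received_signal(t, amplitudes=[1.0, 0.4], delays=[5e-9, 9e-9])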
3. Gesture Recognition Algorithm
The gesture recognition algorithm in this paper is based on the moving direction of the hand and on the change of the frontal surface area of the hand towards the IR-UWB radar sensor. The algorithm defines different gestures performed with a single hand and, while this hand is moving, analyzes the location change and the energy change of the hand portion of the signal received from the radar, which we call the raw signal. After removing the clutter from the raw signal, we obtain the background-subtracted signal.
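We do not detail the clutter-removal step here; a common choice for IR-UWB radar is a running-average background estimate, and the sketch below illustrates one such assumed implementation (the forgetting factor alpha is an illustrative value, not a parameter of our system).

import numpy as np

def background_subtract(frames, alpha=0.95):
    # frames: 2-D float array, one radar frame (fast-time samples) per row.
    # alpha:  forgetting factor of the running background estimate (assumed).
    background = np.zeros(frames.shape[1])
    subtracted = np.zeros_like(frames)
    for k, frame in enumerate(frames):
        background = alpha * background + (1.0 - alpha) * frame
        subtracted[k] = frame - background
    return subtracted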
To enter the main part of the gesture recognition algorithm, a person first has to stand still for a few seconds (to be distinguished from people who merely walk by). After that, the person has to hold out a hand towards the radar sensor for a while, arm straight or bent, palm facing the radar sensor. This is treated as a ready action, which distinguishes the user from people who simply stand in the observation range without doing anything. After these steps, the user can start performing the further gestures discussed below. All of the gestures designed with the proposed algorithm are 2-step gestures; a sketch of this detection sequence as a simple state machine is given below.
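The sequence of conditions described above can be viewed as a simple state machine. The sketch below is only an illustration of that flow; the state names and the boolean inputs (human_still, hand_ready, gesture_started) are hypothetical placeholders for the detection tests described in the rest of this section.

from enum import Enum, auto

class State(Enum):
    IDLE = auto()            # nothing detected yet
    HUMAN_DETECTED = auto()  # person has stood still for a few seconds
    READY = auto()           # hand held out towards the radar (ready action)
    GESTURE = auto()         # 2-step gesture in progress

def next_state(state, human_still, hand_ready, gesture_started):
    # Advance the detection sequence one observation at a time.
    if state is State.IDLE and human_still:
        return State.HUMAN_DETECTED
    if state is State.HUMAN_DETECTED and hand_ready:
        return State.READY
    if state is State.READY and gesture_started:
        return State.GESTURE
    return state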
Detecting the presence of a human and of a hand is the first important part of the gesture recognition algorithm. Here we compare the signal with an empirically chosen threshold. If a moving object appears in the observation range, the corresponding part of the signal exceeds the threshold, and we declare that something moving has been detected. If this persists for a while and the portion of the signal exceeding the threshold is roughly the size of a human body, we declare that a human body has been detected. When analyzing the change of the frontal surface area of the hand towards the radar sensor, we use information from the raw signal such as the peak amplitude and the energy. By comparing the location change and the energy change of the moving object in the raw signal, the gesture recognition algorithm can distinguish 6 kinds of gestures. A minimal sketch of this thresholding step is given below.
kinds of gestures. The logic flow of the gesture recognition algorithm is shown as follows:
1) Human detection. Keep comparing the received signal with the threshold. When the moving object stays at one place for more than a few seconds, we recognize this object as a human body and then go to step 2.
2) Ready-action detection. Ignore the signals of the human body and of the region behind it. Then, using the same threshold comparison as in step 1, if the hand ready action is detected, go to step 3.
3) Hand gesture recognition. The 6 kinds of gestures we propose are all 2-step gestures, which means that after the ready action is detected, the hand has to move two more times for the whole gesture to be recognized by the gesture recognition system.
a) In the first sub-step, we analyze the moving direction of the hand. The hand may move forwards, move backwards, or stay at the same place during the next few seconds (keeping the palm towards the radar the whole time).
b) In the second sub-step, we analyze the change of the frontal surface area of the hand (more details are given in the example below).
4) By combining the changes mentioned in step 3 we may create many multi-step gestures; here in this paper