Z. TOMORI ET AL. 335
3. Experiment
We used the described HRT system; as samples we took droplets of liquid crystal
(6CB or 8CB) dyed with Nile red and dispersed in water. The droplets were
manipulated by two laser beams with a total power of 3 W. The NUI module was
programmed so that an open hand represents an inactive trap, while closing the
hand activates a trap at the given position. Moving the closed hand intuitively
corresponds to moving the trapped object. A closed hand with the index finger
raised increases the sensitivity of detection. The angle between the thumb and
the index finger can intuitively suggest the function of real tweezers; one
possibility is to exploit this gesture for focusing. If the angle between the
index finger and the thumb is minimal, the system uses only the XY coordinates
to move objects. If this angle exceeds some threshold, the system also
evaluates the Z coordinate and changes the focus according to the distance of
the hand from the screen. We found that fingertip navigation is more sensitive
than traditional mouse control in experiments where we excited whispering
gallery modes [12] in the LC droplets by navigating the laser beam precisely
onto the droplet edge.
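The gesture-to-trap mapping described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the HandFrame fields, the Trap class, and the 25-degree threshold are all assumptions (the text only says the angle must exceed "some threshold").

```python
from dataclasses import dataclass

# Illustrative threshold; the actual value would be tuned experimentally.
ANGLE_THRESHOLD_DEG = 25.0

@dataclass
class HandFrame:
    """One frame of hand-tracking data (field names are hypothetical)."""
    is_closed: bool            # hand closed into a fist
    index_up: bool             # index finger raised while the hand is closed
    thumb_index_angle: float   # angle between thumb and index, in degrees
    x: float                   # palm position in screen coordinates
    y: float
    z: float                   # distance of the hand from the screen

class Trap:
    """Minimal stand-in for the optical-trap controller."""
    def __init__(self):
        self.active = False
        self.position = None     # (x, y) of the trap
        self.focus = None        # Z focus, set only by the angle gesture
        self.sensitive = False   # finer-detection mode

    def update(self, frame: HandFrame):
        """Apply the gesture rules from one tracking frame."""
        if not frame.is_closed:
            self.active = False             # open hand -> trap inactive
            return
        self.active = True                  # closing the hand activates the trap
        self.position = (frame.x, frame.y)  # moving the fist moves the object
        self.sensitive = frame.index_up     # index up -> increased sensitivity
        if frame.thumb_index_angle > ANGLE_THRESHOLD_DEG:
            self.focus = frame.z            # wide angle -> Z controls focus

# Usage: a closed hand at a small thumb-index angle moves the trap in XY only;
# widening the angle additionally maps hand distance to focus.
trap = Trap()
trap.update(HandFrame(True, False, 10.0, 1.0, 2.0, 3.0))
trap.update(HandFrame(True, True, 40.0, 1.0, 2.0, 3.0))
```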
When both hands were busy manipulating objects, voice commands were very
helpful for controlling the other system functions described in Table 1. We
placed the clicker window over the RAMAN button of the control dialog, so the
voice command “click” switches the device to the Raman spectra measurement
mode. We could easily have added commands for fast movement of the
micropositioning stage, corresponding to SHIFT+arrow keys; however, this
function was rarely used in experiments.
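A voice-command layer of this kind reduces to a dispatch table from recognized words to UI actions. The sketch below is our illustration, not the actual implementation: only the "click" command is taken from the text, the stage commands are hypothetical, and in a real system the recognized word would arrive from a speech-recognition engine rather than a direct call.

```python
# Sketch of a voice-command dispatcher; recognized words would come from a
# speech-recognition engine, here we call dispatch() directly for illustration.

def make_dispatcher(actions):
    """Build a dispatcher that routes a recognized word to its action."""
    def dispatch(word):
        action = actions.get(word.strip().lower())
        if action is None:
            return False      # unrecognized words are simply ignored
        action()
        return True
    return dispatch

log = []  # records what each command did, for demonstration

commands = {
    # "click" presses whatever button lies under the movable clicker
    # window -- in the setup above it was placed over the RAMAN button.
    "click": lambda: log.append("press button under clicker window"),
    # Hypothetical stage commands, equivalent to SHIFT+arrow keys:
    "left":  lambda: log.append("fast stage move: left"),
    "right": lambda: log.append("fast stage move: right"),
}

dispatch = make_dispatcher(commands)
dispatch("Click")   # matching is case-insensitive
```

Keeping the table as data makes it trivial to add or rename commands between experiments without touching the dispatch logic.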
4. Conclusions
Extremely fast progress in NUI technology has triggered an intensive search
for possible applications in various areas. In our opinion, one such area is
optical micromanipulation of microobjects, where the three-dimensional
positions of trapped objects are intuitively controlled by fingertip positions
combined with gestures and voice commands. For this purpose we exploited very
recent technology (the Gesture Camera and the Leap Motion sensor). Unlike the
solution based on the Microsoft Kinect sensor [7], ours allows convenient work
in a sitting position with the elbows supported by the table.
A detailed comparison of the two sensors is beyond the scope of this paper and
would require more extensive testing. However, our experiments showed that the
Leap Motion is more precise, faster, and more reliable, and has a simpler SDK.
On the other hand, the Gesture Camera and SDK from Intel offer a broader range
of NUI functions: the camera generates color images and depth maps (not only
fingertip coordinates), and the SDK exploits the OpenCV library, which is well
suited to image-processing applications. A comparison of prices ($70 vs. $150)
is not decisive in this application.
Voice commands are especially helpful when both hands are occupied. Our
proof-of-concept experiment showed that NUI increases the efficiency of
tweezers control roughly twofold compared with mouse-based trapping. This
factor may grow further with experimental experience; the efficiency is also
image- and task-dependent. In any case, the application of NUI methods is a
way to improve interactive micromanipulation techniques with respect to the
expected standardization in this area.
Future Work
Our software was designed to remain open to future improvements. Further
experiments should determine an optimal set of gestures and voice commands. We
plan to extend the software to a full network version allowing remote control
of the tweezers ("NUI teletweezing"). This extension assumes streaming live
images from the microscope camera to the client; semi-automated
optical-trapping methods based on image analysis would then become possible.
We also plan additional testing of other NUI software tools in order to
achieve better control, and we will try to define a set of gestures specific
to optical tweezers.
5. Acknowledgements
This work was supported by the Slovak research grant agencies APVV (grant
0526-11) and VEGA (grant 2-191-11), by the Slovak Academy of Sciences in the
frame of CEX NANOFLUID, and by the Agency for Structural Funds of the EU
(projects 26220120033 and 26220220061). We thank Leap Motion for providing the
prototype of the sensor.
REFERENCES
[1] K. C. Neuman and S. M. Block, “Optical Trapping,” Re-
view of Scientific Instruments, Vol. 75, No. 9, 2004, pp.
2787-2809. doi:10.1063/1.1785844
[2] R. Bowman, D. Preece, G. Gibson and M. Padgett,
“Stereoscopic Particle Tracking for 3D Touch, Vision and
Closed-loop Control in Optical Tweezers,” Journal of
Optics, Vol. 13, No. 4, 2011, p. 044003.
doi:10.1088/2040-8978/13/4/044003
[3] J. E. Curtis, B. A. Koss and D. G. Grier, “Dynamic Holo-
graphic Optical Tweezers,” Optics Communications, Vol. 207,
No. 1-6, 2002, pp. 169-175.
doi:10.1016/S0030-4018(02)01524-9
[4] G. Whyte, G. Gibson, J. Leach, M. Padgett, D. Robert,
and M. Miles, “An Optical Trapped Microhand for Ma-
nipulating Micron-sized Objects,” Optics Express, Vol.
14, No. 25, 2006, pp. 12497-12502.
doi:10.1364/OE.14.012497
Copyright © 2013 SciRes. OPJ