common HCI tasks such as mouse movement, click actions and text entry (in conjunction with the “On Screen Keyboard”). Based on the quantitative results presented, head-based HCI cannot be regarded as a substitute for the conventional mouse, since the speed and accuracy of performing most HCI tasks fall below the standards achieved with a conventional mouse. However, in most cases the performance of the proposed system is comparable to that obtained with the touch pads of portable computer systems. Even though the accuracy and speed of accomplishing various HCI tasks with a touch pad are lower than with a conventional mouse, a significant number of computer users use touch pads regularly. We are convinced that computer users will also find the proposed hands-free computing approach useful.
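As an illustration of the cursor-control side of these tasks, the sketch below shows one way head-tracker output could be turned into operating-system pointer events. The paper does not describe its implementation at this level, so the Win32 route, the move_cursor and left_click helpers and the gain value are all assumptions made for illustration.

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32
MOUSEEVENTF_LEFTDOWN = 0x0002
MOUSEEVENTF_LEFTUP = 0x0004

def move_cursor(dx, dy, gain=8):
    # translate a tracked head displacement (dx, dy) into cursor motion;
    # 'gain' is an assumed sensitivity factor, not a value from the paper
    pt = wintypes.POINT()
    user32.GetCursorPos(ctypes.byref(pt))
    user32.SetCursorPos(pt.x + int(gain * dx), pt.y + int(gain * dy))

def left_click():
    # emit a left-button click, e.g. when a "sound click" is detected
    user32.mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0)
    user32.mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0)
```

A text-entry task would then reduce to steering the cursor over the On Screen Keyboard and issuing left_click on the desired key.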
The proposed system does not require person-specific training: during the initialisation phase the system adapts to and learns the visual characteristics of the features to be tracked. Person-specific training is needed only when the “Voice Command” mode is used, and in that case the training procedure takes about 20 minutes to complete.
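To make the initialisation step concrete, the following sketch captures the appearance of a facial feature once at start-up and then relocates it in later frames by template matching. OpenCV, the matching method and the feature box coordinates are assumptions for illustration; the paper does not name the internals of its tracker.

```python
import cv2

def init_template(frame, x, y, w, h):
    # store the grey-level patch around the chosen feature as its model
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return grey[y:y + h, x:x + w]

def locate_feature(frame, template):
    # relocate the stored patch by normalised cross-correlation
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    scores = cv2.matchTemplate(grey, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(scores)
    return top_left, score  # a low score could trigger re-learning

cap = cv2.VideoCapture(0)
ok, first = cap.read()
if ok:
    template = init_template(first, 300, 220, 40, 40)  # assumed feature box
    ok, frame = cap.read()
    if ok:
        (x, y), score = locate_feature(frame, template)
        print("feature at", (x, y), "confidence", round(score, 2))
```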
The proposed system is particularly useful for paraplegics with limited or no hand mobility. Such users are able to operate a computer system using only head movements and speech input. During the development phase we provided the system to members of the Cyprus Paraplegics Organization, who tested it and offered valuable feedback on its overall operation and performance. Currently a number of paraplegic computer users are using the hands-free system described in this paper.
An important feature of the system is the provision of alternative methods for performing a task, so that at any time the user can choose the most appropriate way to carry out an action. For example, a user who wishes to run Internet Explorer can do so using only head movements, using speech commands, or using a combination of the two input media (i.e., moving the cursor to the appropriate icon with head movements and activating the application with sound clicks).
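The sketch below illustrates this redundant-input idea; it is not the authors' code, and all names and handlers in it are hypothetical. Events from the different modalities resolve to one shared table of actions, so the same action stays reachable whichever input medium the user prefers.

```python
# every modality maps its own events onto one shared table of actions
ACTIONS = {
    "open_browser": lambda: print("launching browser ..."),  # placeholder for a real launch
    "left_click":   lambda: print("click at cursor position"),
}

SPEECH_COMMANDS = {"open browser": "open_browser"}  # recogniser output -> action
SOUND_EVENTS    = {"click": "left_click"}           # sound-click detector -> action

def dispatch(modality, event):
    table = SPEECH_COMMANDS if modality == "speech" else SOUND_EVENTS
    action = table.get(event)
    if action is not None:
        ACTIONS[action]()

dispatch("speech", "open browser")  # voice-only route
dispatch("sound", "click")          # head moves the cursor, a sound click selects
```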
In the future we plan to upgrade the Voice Command mode to allow text entry based on speech input. We also plan to stage a large-scale evaluation in order to draw concrete conclusions about the performance of the proposed system. Since the hands-free system is primarily directed towards paraplegics, we plan to include results from paraplegic users in our quantitative evaluation.
ACKNOWLEDGEMENTS
The work described in this paper was supported by
the Cyprus Research Promotion Foundation. We are
grateful to members of the Cyprus Paraplegics
Organization for their valuable feedback and
suggestions.