rithm stops and we must re-initialize the procedure by
re-projecting the fiducials onto the current image using
the previous image homography.
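This re-initialization step amounts to mapping each fiducial point through the last known homography. A minimal sketch in Python/NumPy (the function name and array layout are our own illustrative choices, not from the paper):

```python
import numpy as np

def reproject_with_homography(points, H):
    """Re-project 2D fiducial points through a 3x3 homography H.

    points: (N, 2) array of pixel coordinates from the previous image.
    Returns the (N, 2) predicted coordinates in the current image.
    """
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
    proj = pts_h @ H.T
    return proj[:, :2] / proj[:, 2:3]  # divide out the projective scale
```

For a pure-translation homography, for instance, each point is simply shifted by the translation component.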
Figure 9: Robust fiducial tracking.
8 CONCLUSION
In this paper, we addressed robust, real-time
tracking of fiducials in AR applications. We used
bar-coded target objects to identify fiducials
in images and extract their feature points. We then
established the relationship between these feature
points and the perspective camera model, and we used
two pose estimators, the EKF and the OI algorithm, to
determine the correct location of the fiducials in the
image. The performance of the OI algorithm was clearly
better than that of the EKF in terms of both error and
execution time. This performance thoroughly justified
the use of the OI algorithm for the experiments, the
OI algorithm being initialized with pose parameters
computed by an analytic pose estimator.
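As a rough illustration of the Orthogonal Iteration (OI) scheme of Lu et al. (2000) used here, the following Python/NumPy sketch alternates the closed-form optimal translation with an absolute-orientation (Procrustes) update of the rotation. It assumes normalized image coordinates (intrinsics already removed) and is our own simplified rendering, not the paper's implementation:

```python
import numpy as np

def oi_pose(model_pts, image_pts_norm, R0, n_iters=50):
    """Orthogonal Iteration pose refinement (after Lu et al., 2000).

    model_pts: (N, 3) 3D points in the object frame.
    image_pts_norm: (N, 2) normalized image coordinates.
    R0: initial rotation guess, e.g. from an analytic pose estimator.
    """
    n = len(model_pts)
    v = np.hstack([image_pts_norm, np.ones((n, 1))])          # line-of-sight rays
    # Projection matrices V_i = v v^T / (v^T v) onto each ray.
    V = np.einsum('ni,nj->nij', v, v) / np.einsum('ni,ni->n', v, v)[:, None, None]
    I = np.eye(3)
    T = np.linalg.inv(I - V.mean(axis=0)) / n

    def optimal_t(R):
        # Closed-form optimal translation for a given rotation.
        return T @ ((V - I) @ (model_pts @ R.T)[..., None]).sum(axis=0).ravel()

    R = R0
    for _ in range(n_iters):
        t = optimal_t(R)
        # Project the transformed model points onto the image rays.
        q = (V @ (model_pts @ R.T + t)[..., None]).squeeze(-1)
        # Rotation update: absolute orientation (Procrustes) model -> q.
        pc, qc = model_pts - model_pts.mean(0), q - q.mean(0)
        U, _, Wt = np.linalg.svd(qc.T @ pc)
        R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Wt)]) @ Wt
    return R, optimal_t(R)
```

Starting from a reasonable analytic initialization, the iteration drives the object-space collinearity error toward zero.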
We first proposed a feature-point tracking
algorithm for real-time image sequences based on
object identification and pose computation. We then
showed how to extend this method and make it robust
to partial or total occlusion of the target by
using an additional algorithm based on the RANSAC
estimator.
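The RANSAC-based robustification can be sketched as follows: repeatedly sample minimal sets of point correspondences, fit a homography to each, and keep the model with the largest consensus set, so that occluded or mismatched fiducial points are rejected as outliers. A Python/NumPy sketch (function names, iteration count, and threshold are our own illustrative choices):

```python
import numpy as np

def fit_homography(src, dst):
    """Direct Linear Transform: homography from >= 4 correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)  # null-space vector, unit norm

def project(H, pts):
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]

def ransac_homography(src, dst, n_iters=200, thresh=2.0, rng=None):
    """Homography estimate robust to occluded or mismatched fiducials."""
    rng = np.random.default_rng(rng)
    best_inliers = None
    for _ in range(n_iters):
        idx = rng.choice(len(src), 4, replace=False)   # minimal sample
        H = fit_homography(src[idx], dst[idx])
        err = np.linalg.norm(project(H, src) - dst, axis=1)
        inliers = err < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on the full consensus set for a more stable estimate.
    return fit_homography(src[best_inliers], dst[best_inliers]), best_inliers
```

Points belonging to an occluded part of the target produce large reprojection errors and fall outside the consensus set, so the final homography is computed from visible fiducials only.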
The tracking results obtained were precise and
robust, and demonstrated the validity of the approach.
As future work, we will combine the camera with
an inertial measurement unit in order to localize this
hybrid system when the two target objects are com-
pletely occluded.
REFERENCES
Ababsa, F. and Mallem, M. (2004). Robust camera pose es-
timation using 2D fiducials tracking for real-time aug-
mented reality systems. In Proceedings of the ACM SIG-
GRAPH International Conference on Virtual-Reality
Continuum and its Applications in Industry (VRCAI
2004).
Canny, J. (1986). A computational approach to edge de-
tection. IEEE Trans. Pattern Anal. Mach. Intell.,
8(6):679–698.
Chen, J. H., Chen, C. S., and Chen, Y. S. (2003). Fast algo-
rithm for robust template matching with m-estimators.
IEEE Transactions on Signal Processing, 51(1):36–
45.
Comport, A. I., Marchand, E., and Chaumette, F. (2003).
A real-time tracker for markerless augmented reality.
ACM/IEEE Int. Symp. on Mixed and Augmented Real-
ity (ISMAR 2003), pages 36–45.
Fischler, M. A. and Bolles, R. C. (1981). Random sample
consensus: a paradigm for model fitting with appli-
cations to image analysis and automated cartography.
Commun. ACM, 24(6):381–395.
Gennery, D. B. (1992). Visual tracking of known three-
dimensional objects. International Journal of Com-
puter Vision, 7(3):243–270.
Harris, C. and Stennett, C. (1990). RAPID – a video rate ob-
ject tracker. In Proc. 1st British Machine Vision Conf.,
pages 73–78.
Lowe, D. G. (1992). Robust model-based motion tracking
through the integration of search and estimation. Int.
J. Comput. Vision, 8(2):113–122.
Lu, C.-P., Hager, G. D., and Mjolsness, E. (2000). Fast
and globally convergent pose estimation from video
images. IEEE Transactions on Pattern Analysis and
Machine Intelligence, 22(6):610–622.
Maidi, M., Ababsa, F., and Mallem, M. (2005). Vision-
inertial system calibration for tracking in augmented
reality. Proceedings of the Second International Con-
ference on Informatics in Control, Automation and
Robotics, ICINCO 2005, 3:156–162.
Naimark, L. and Foxlin, E. (2002). Circular data ma-
trix fiducial system and robust image processing for
a wearable vision-inertial self-tracker. IEEE Interna-
tional Symposium on Mixed and Augmented Reality
(ISMAR 2002), pages 27–36.
Welch, G. and Bishop, G. (2004). An introduction to the
Kalman filter. Technical Report TR 95-041, De-
partment of Computer Science, University of North
Carolina.
Zhang, Z. (1998). A flexible new technique for camera cal-
ibration. Technical Report MSR-TR-98-71, Microsoft
Research.