sumption in these devices. This increase in memory consumption can be attributed to the extra CUDA modules that TensorFlow loads to accelerate the inference process. At the same time, while memory usage on the Jetson boards is 548.15% higher than on the Raspberry Pi, the performance tradeoff favors the Jetson boards, which deliver 2413% more frames per second.
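The percentage figures above follow the usual relative-increase formula, (value − baseline) / baseline × 100. As a minimal sketch, the snippet below applies it to illustrative placeholder measurements; the numbers are NOT the raw data behind the paper's figures.

```python
def relative_increase(baseline, value):
    """Relative increase of `value` over `baseline`, in percent."""
    return (value - baseline) / baseline * 100.0

# Illustrative placeholder measurements (NOT the paper's raw data):
# memory footprint in MB and inference throughput in FPS.
raspberry = {"memory_mb": 300.0, "fps": 2.0}
jetson = {"memory_mb": 1900.0, "fps": 50.0}

mem_overhead = relative_increase(raspberry["memory_mb"], jetson["memory_mb"])
fps_gain = relative_increase(raspberry["fps"], jetson["fps"])
print(f"memory: +{mem_overhead:.2f}%  fps: +{fps_gain:.2f}%")
```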
6.1 Future Work
In future iterations of this work, we intend to measure and evaluate the energy consumption of the embedded devices. Comparing the energy efficiency of edge-AI machines is crucial, as most edge applications rely on batteries, which makes energy consumption a critical constraint when inferencing neural networks at the edge. Therefore, to verify the actual applicability of our neural network at the edge, measuring its energy consumption is necessary to complete our evaluation.
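One way such an evaluation could be carried out is to sample the board's power draw periodically during inference and integrate the samples into an energy estimate in joules. The sketch below assumes a hypothetical `read_power_watts` helper standing in for a board-specific sensor reader (e.g., an onboard power monitor); here it is simulated with a constant value so the example runs anywhere.

```python
import time

def read_power_watts():
    # Hypothetical stand-in for a board-specific power sensor reader;
    # a real implementation would query the device's power monitor.
    return 5.0  # simulated constant 5 W draw

def measure_energy(duration_s, interval_s=0.01):
    """Integrate sampled power over time (trapezoidal rule) into joules."""
    samples = [(time.monotonic(), read_power_watts())]
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        time.sleep(interval_s)
        samples.append((time.monotonic(), read_power_watts()))
    joules = 0.0
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        joules += (p0 + p1) / 2.0 * (t1 - t0)
    return joules

# Energy consumed over a 0.1 s measurement window (~0.5 J at 5 W).
energy = measure_energy(0.1)
```

In a real benchmark, the measurement window would wrap the inference loop itself, so the estimate reflects the workload rather than idle draw.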
ACKNOWLEDGEMENTS
The authors would like to thank FAPEMIG, CAPES,
CNPq, and the Federal University of Ouro Preto
for supporting this work. This work was partially
funded by CAPES (Finance Code 001) and CNPq
(308219/2020-1).
ICEIS 2022 - 24th International Conference on Enterprise Information Systems