Bau, O., Poupyrev, I., Israr, A., and Harrison, C. (2010). TeslaTouch: Electrovibration for touch surfaces. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology, UIST ’10, pages 283–292, New York, NY, USA. Association for Computing Machinery.
Bourne, R. R., Adelson, J., Flaxman, S., Briant, P., Bottone, M., Vos, T., Naidoo, K., Braithwaite, T., Cicinelli, M., Jonas, J., et al. (2020). Global prevalence of blindness and distance and near vision impairment in 2020: progress towards the Vision 2020 targets and what the future holds. Investigative Ophthalmology & Visual Science, 61(7):2317–2317.
Granquist, C., Sun, S. Y., Montezuma, S. R., Tran, T. M., Gage, R., and Legge, G. E. (2021). Evaluation and comparison of artificial intelligence vision aids: OrCam MyEye 1 and Seeing AI. Journal of Visual Impairment & Blindness, 115(4):277–285.
He, L., Wang, R., and Xu, X. (2020). PneuFetch: Supporting blind and visually impaired people to fetch nearby objects via light haptic cues. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, pages 1–9.
Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., and Adam, H. (2017). MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv preprint arXiv:1704.04861.
Kim, Y., Harders, M., and Gassert, R. (2015). Identification of vibrotactile patterns encoding obstacle distance information. IEEE Transactions on Haptics, 8(3):298–305.
Lim, J., Yoo, Y., and Choi, S. (2019). Guidance-based two-dimensional haptic contour rendering for accessible photography. In 2019 IEEE World Haptics Conference (WHC), pages 401–406.
Lin, T.-Y., Goyal, P., Girshick, R., He, K., and Dollár, P. (2017). Focal loss for dense object detection. In Proceedings of the IEEE International Conference on Computer Vision, pages 2980–2988.
Lin, T.-Y., Maire, M., Belongie, S., Hays, J., Perona, P., Ramanan, D., Dollár, P., and Zitnick, C. L. (2014). Microsoft COCO: Common objects in context. In European Conference on Computer Vision, pages 740–755. Springer.
Lugaresi, C., Tang, J., Nash, H., McClanahan, C., Uboweja, E., Hays, M., Zhang, F., Chang, C.-L., Yong, M. G., Lee, J., et al. (2019). MediaPipe: A framework for building perception pipelines. arXiv preprint arXiv:1906.08172.
Nyman, S. R., Dibb, B., Victor, C. R., and Gosney, M. A. (2012). Emotional well-being and adjustment to vision loss in later life: a meta-synthesis of qualitative studies. Disability and Rehabilitation, 34(12):971–981.
Palani, H. P., Tennison, J. L., Giudice, G. B., and Giudice, N. A. (2018). Touchscreen-based haptic information access for assisting blind and visually-impaired users: Perceptual parameters and design guidelines. In International Conference on Applied Human Factors and Ergonomics, pages 837–847. Springer.
Röijezon, U., Prellwitz, M., Ahlmark, D. I., van Deventer, J., Nikolakopoulos, G., and Hyyppä, K. (2019). A haptic navigation aid for individuals with visual impairments: Indoor and outdoor feasibility evaluations of the LaserNavigator. Journal of Visual Impairment & Blindness, 113(2):194–201.
Sadic, A., Ayyildiz, M., and Basdogan, C. (2017). Haptic perception of 2D equilateral geometric shapes via electrovibration on touch screen. In 2017 21st National Biomedical Engineering Meeting (BIYOMUT), pages i–iv.
Soviak, A. (2015). Haptic gloves prototype for audio-tactile web browsing. In Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility, ASSETS ’15, pages 363–364, New York, NY, USA. Association for Computing Machinery.
NCTA 2022 - 14th International Conference on Neural Computation Theory and Applications