another well-known solution to prevent membership inference in aggregated datasets. Differentially private mechanisms introduce noise so that the output distribution changes only marginally when a single user is added or removed. However, differential privacy typically cannot be applied to data collections as small as lifelogging datasets (Section 1).
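To illustrate why the noise matters at small scale, the following is a minimal sketch (with hypothetical numbers) of the Laplace mechanism for a count query; a count has sensitivity 1, since adding or removing one user changes it by at most 1:

```python
import numpy as np

def laplace_count(true_count, epsilon, rng):
    """Release a count query under epsilon-differential privacy.

    A count has sensitivity 1 (one user changes it by at most 1),
    so Laplace noise with scale 1/epsilon suffices.
    """
    return true_count + rng.laplace(0.0, 1.0 / epsilon)

rng = np.random.default_rng(0)
# Hypothetical query: "how many users logged over 10,000 steps today?"
large = laplace_count(412, epsilon=0.5, rng=rng)  # large cohort: noise is relatively small
small = laplace_count(9, epsilon=0.5, rng=rng)    # lifelogging-sized cohort: noise can dominate
```

With epsilon = 0.5 the noise has standard deviation 2√2 ≈ 2.8 — negligible against a count of 412, but a sizable fraction of a count of 9, which is the obstacle for datasets of lifelogging size.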
More recent works have identified adversarial neural networks as a solution to protect time series data collected from smartphones (Malekzadeh et al., 2019). Such networks train a release mechanism that is used to “sanitize” the samples, concealing personal information. Their effectiveness on mobile sensor data suggests that they may also be used to anonymize fitness records from wearables.
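The underlying idea can be sketched without neural networks. The toy release mechanism below — a linear projection, a crude stand-in for the adversarially trained sanitizers cited above, with entirely synthetic data — removes the direction that best separates two users, so a simple re-identification adversary falls back to chance:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
# Synthetic sensor features for two users: identity shows up
# as a shift along one fixed direction in feature space.
identity_dir = rng.standard_normal(d)
identity_dir /= np.linalg.norm(identity_dir)
user_a = rng.standard_normal((200, d)) + 2.0 * identity_dir
user_b = rng.standard_normal((200, d)) - 2.0 * identity_dir
X = np.vstack([user_a, user_b])
y = np.array([0] * 200 + [1] * 200)

def sanitize(X, y):
    """Release mechanism: project out the direction separating the users' means."""
    w = X[y == 0].mean(axis=0) - X[y == 1].mean(axis=0)
    w /= np.linalg.norm(w)
    return X - np.outer(X @ w, w)

def reid_accuracy(X, y):
    """Nearest-centroid re-identification (a stand-in adversary trained on labels)."""
    c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    pred = (np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1)).astype(int)
    return (pred == y).mean()

before = reid_accuracy(X, y)               # identity is easily recovered
after = reid_accuracy(sanitize(X, y), y)   # near chance after sanitization
```

An adversarially trained mechanism generalizes this: instead of a fixed linear projection, the sanitizer and the adversary are both learned, each optimizing against the other.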
6 CONCLUSION
We demonstrated that an adversary can de-anonymize the records of anonymous users in an aggregated data collection, and uniquely re-identify minority individuals within the datasets based on their gender, height, and BMI.
We also showed that an adversary can de-anonymize all users (minority or majority) in the dataset based on their daily routine with 93.5% accuracy, provided she has access to some of their fitness data.
Finally, we discussed how applying k-anonymity to quasi-identifiers (i.e., physical characteristics) would not guarantee users’ privacy, since the adversary can still glean information about those attributes through the presented inference model.
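This pitfall is related to the homogeneity problem that motivated l-diversity (Machanavajjhala et al., 2007). The sketch below, with hypothetical records, is illustrative of the general insufficiency of k-anonymity rather than of our inference model: generalizing gender, height, and BMI achieves k = 3, yet every equivalence class still reveals its sensitive attribute exactly.

```python
from collections import defaultdict

# Hypothetical records: (gender, height_cm, bmi, sensitive_attribute)
records = [
    ("F", 162, 21.4, "low daily activity"),
    ("F", 165, 22.1, "low daily activity"),
    ("F", 168, 23.0, "low daily activity"),
    ("M", 181, 26.2, "high daily activity"),
    ("M", 183, 27.5, "high daily activity"),
    ("M", 184, 25.9, "high daily activity"),
]

def generalize(record):
    """Coarsen the quasi-identifiers into bands."""
    gender, height, bmi, sensitive = record
    decade = (height // 10) * 10
    height_band = f"{decade}-{decade + 9} cm"
    bmi_band = "normal" if bmi < 25 else "overweight"
    return (gender, height_band, bmi_band), sensitive

# Group records by their generalized quasi-identifiers.
groups = defaultdict(list)
for record in records:
    key, sensitive = generalize(record)
    groups[key].append(sensitive)

k = min(len(members) for members in groups.values())  # k-anonymity holds
# A group is "homogeneous" when all its members share the sensitive value,
# so knowing someone's quasi-identifiers reveals that value outright.
homogeneous = [key for key, vals in groups.items() if len(set(vals)) == 1]
```

Here both equivalence classes are homogeneous, so an adversary who knows only a target’s (generalized) physical characteristics still learns the sensitive attribute with certainty.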
REFERENCES
CCS Insight (2021). Healthy outlook for wearables as users
focus on fitness and well-being.
Christovich, M. M. (2016). Why should we care what Fitbit shares? A proposed statutory solution to protect sensitive personal fitness information. Hastings Comm. & Ent. LJ, 38:91.
Dong, Y., Hoover, A., Scisco, J., and Muth, E. (2012). A new method for measuring meal intake in humans via automated wrist motion tracking. Applied Psychophysiology and Biofeedback, 37(3):205–215.
Dwork, C. (2006). Differential privacy. In International
Colloquium on Automata, Languages, and Program-
ming, pages 1–12. Springer.
Furberg, R., Brinton, J., Keating, M., and Ortiz, A.
(2016). Crowd-sourced Fitbit datasets 03.12.2016-
05.12.2016.
Harris, J. A. and Benedict, F. G. (1918). A biometric study
of human basal metabolism. Proceedings of the Na-
tional Academy of Sciences of the United States of
America, 4(12):370.
Hilts, A., Parsons, C., and Knockel, J. (2016). Every step
you fake: A comparative analysis of fitness tracker
privacy and security. Open Effect Report, 76(24):31–
33.
Kelly, D., Curran, K., and Caulfield, B. (2017). Automatic prediction of health status using smartphone-derived behavior profiles. IEEE Journal of Biomedical and Health Informatics, 21(6):1750–1760.
Machanavajjhala, A., Kifer, D., Gehrke, J., and Venkita-
subramaniam, M. (2007). l-diversity: Privacy beyond
k-anonymity. ACM Transactions on Knowledge Dis-
covery from Data (TKDD), 1(1):3–es.
Malekzadeh, M., Clegg, R. G., Cavallaro, A., and Haddadi,
H. (2018). Protecting sensory data against sensitive
inferences. In Proceedings of the 1st Workshop on
Privacy by Design in Distributed Systems, pages 1–6.
Malekzadeh, M., Clegg, R. G., Cavallaro, A., and Haddadi, H. (2019). Mobile sensor data anonymization. In Proceedings of the International Conference on Internet of Things Design and Implementation, pages 49–58.
Marchioro, T., Kazlouski, A., and Markatos, E. (2021).
User identification from time series of fitness data. In
International Conference on Security and Cryptogra-
phy (SECRYPT), pages 806–811.
OpenHumans (2016). Open Humans Fitbit connection. https://www.openhumans.org/activity/fitbit-connection.
Parate, A., Chiu, M.-C., Chadowitz, C., Ganesan, D., and Kalogerakis, E. (2014). RisQ: Recognizing smoking gestures with inertial sensors on a wristband. In Proceedings of the 12th Annual International Conference on Mobile Systems, Applications, and Services, pages 149–161.
Samarati, P. (2001). Protecting respondents’ identities in microdata release. IEEE Transactions on Knowledge and Data Engineering, 13(6):1010–1027.
Sathyanarayana, A., Joty, S., Fernandez-Luque, L., Ofli, F.,
Srivastava, J., Elmagarmid, A., Arora, T., and Taheri,
S. (2016). Sleep quality prediction from wearable
data using deep learning. JMIR mHealth and uHealth,
4(4):e125.
Sweeney, L. (2002). k-anonymity: A model for protecting
privacy. International Journal of Uncertainty, Fuzzi-
ness and Knowledge-Based Systems, 10(05):557–570.
Thambawita, V., Hicks, S., Borgli, H., Pettersen, S. A., Johansen, D., Johansen, H., Kupka, T., Stensland, H. K., Jha, D., Grønli, T.-M., et al. (2020). PMData: A sports logging dataset.
Torre, I., Sanchez, O. R., Koceva, F., and Adorni, G. (2018).
Supporting users to take informed decisions on pri-
vacy settings of personal devices. Personal and Ubiq-
uitous Computing, 22(2):345–364.
Vitak, J., Liao, Y., Kumar, P., Zimmer, M., and Kritikos,
K. (2018). Privacy attitudes and data valuation among
fitness tracker users. In International Conference on
Information, pages 229–239. Springer.
Disabled World (2017, December 1). Height chart of men and women in different countries. www.disabled-world.com/calculators-charts/height-chart.php. Online; retrieved May 2, 2022.
SECRYPT 2022 - 19th International Conference on Security and Cryptography