Who Watches the Watchers: A Multi-Task Benchmark for Anomaly Detection

Phil Demetriou, Ingolf Becker, Stephen Hailes

2022

Abstract

A driver in the rise of IoT systems has been the relative ease with which it is possible to create specialized-but-adaptable deployments from cost-effective components. Such components tend to be relatively unreliable and resource-poor, but are increasingly widely connected. As a result, IoT systems are subject both to component failures and to the attacks that are an inevitable consequence of wide-area connectivity. Anomaly detection systems are therefore a cornerstone of effective operation; however, in the literature, there is no established common basis for the evaluation of anomaly detection systems for these environments. No common set of benchmarks or metrics exists, and authors typically provide results for just one scenario. This is profoundly unhelpful to designers of IoT systems, who need to make a choice about anomaly detection that takes into account both ease of deployment and likely detection performance in their context. To address this problem, we introduce Aftershock, a multi-task benchmark. We adapt and standardize an array of datasets from the public literature into anomaly detection-specific benchmarks. We then apply a diverse set of existing anomaly detection algorithms to our datasets, producing a set of performance baselines for future comparisons. Results are reported via a dedicated online platform located at https://aftershock.dev, allowing system designers to evaluate the general applicability and practical utility of various anomaly detection models. This approach of public evaluation against common criteria is inspired by the immensely useful community resources found in areas such as natural language processing, recommender systems, and reinforcement learning. We collect, adapt, and make available 10 anomaly detection tasks, which we use to evaluate 6 state-of-the-art solutions as well as common baselines.
We offer researchers a submission system to evaluate future solutions in a transparent manner, and we are actively engaging with academic and industry partners to expand the set of available tasks. Moreover, we are exploring options to add hardware-in-the-loop evaluation. As a community contribution, we invite researchers to train their own models (or those reported by others) on the public development datasets available on the online platform, submitting them for independent evaluation and reporting results against others.

Paper Citation


in Harvard Style

Demetriou P., Becker I. and Hailes S. (2022). Who Watches the Watchers: A Multi-Task Benchmark for Anomaly Detection. In Proceedings of the 8th International Conference on Information Systems Security and Privacy - Volume 1: ICISSP, ISBN 978-989-758-553-1, pages 579-586. DOI: 10.5220/0010915000003120


in Bibtex Style

@conference{icissp22,
author={Phil Demetriou and Ingolf Becker and Stephen Hailes},
title={Who Watches the Watchers: A Multi-Task Benchmark for Anomaly Detection},
booktitle={Proceedings of the 8th International Conference on Information Systems Security and Privacy - Volume 1: ICISSP},
year={2022},
pages={579-586},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010915000003120},
isbn={978-989-758-553-1},
}


in EndNote Style

TY - CONF

JO - Proceedings of the 8th International Conference on Information Systems Security and Privacy - Volume 1: ICISSP
TI - Who Watches the Watchers: A Multi-Task Benchmark for Anomaly Detection
SN - 978-989-758-553-1
AU - Demetriou P.
AU - Becker I.
AU - Hailes S.
PY - 2022
SP - 579
EP - 586
DO - 10.5220/0010915000003120