Non-parametric sequence-based learning approach for outlier detection in IoT
Published on Nov 1, 2017 in Future Generation Computer Systems
· DOI: 10.1016/j.future.2017.11.021
Abstract Although outlier detection techniques have long been an area of active research, few of those works relate to an Internet of Things (IoT) environment. In the last few years, with the advent of IoT and its numerous smart objects, the data generated by sensors have increased exponentially. Since many critical decisions are taken on the basis of these data, it is necessary to ensure their accuracy, correctness and integrity before any processing starts. Most past algorithms assume an outlier to always be an Error caused by a malfunctioning sensor. However, this is not always true, because an outlier could also be an important event that should not be neglected. This stresses the need to devise algorithms suited to an IoT environment that consider both an Error, which is the result of a faulty sensor, and an Event, which is an indication of an abnormal phenomenon. This paper proposes a sequence-based learning approach for outlier detection that works for both Errors and Events. Simulations are performed on a few benchmark datasets, a medical dataset and a real-world dataset obtained through an experimental test bed. The results reveal exceptionally high accuracies of up to 99.65% for Error detection and 98.53% for Event detection.