Kullback-Leibler Divergence for Masquerade Detection
Published: 2013
Author(s) Name: Geetha Ranjini Viswanathan, Richard M. Low, Mark Stamp
Author(s) Affiliation: Department of Computer Science, San Jose State University, San Jose, California, USA
Abstract
A masquerader is an attacker who gains access to a legitimate user's credentials and pretends to be that user so as to evade detection. Several statistical techniques have been applied to the masquerade detection problem, including hidden Markov models (HMM) and one class naïve Bayes (OCNB). In addition, Kullback-Leibler (KL) divergence has been used in an effort to improve detection rates. In this paper, we analyze masquerade detection techniques that employ HMMs, OCNB, and KL divergence. Detailed statistical analysis is provided to compare the effectiveness of these various approaches.
Keywords: Masquerade Detection, Kullback-Leibler Divergence, One Class Naive Bayes, Hidden Markov Models, Intrusion Detection
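As a minimal illustrative sketch (not taken from the paper), KL divergence can score how far a test block of commands deviates from a user's profile by comparing relative-frequency distributions over commands. The command sequences, smoothing constant, and helper names below are hypothetical; the paper's actual experimental setup may differ.

import math
from collections import Counter

def kl_divergence(p, q, epsilon=1e-10):
    # D_KL(P || Q) over the union of commands seen in either distribution.
    # epsilon is a crude smoothing term to avoid log(0) and division by zero
    # for commands unseen in one distribution (the result is approximate).
    commands = set(p) | set(q)
    divergence = 0.0
    for c in commands:
        p_c = p.get(c, 0.0) + epsilon
        q_c = q.get(c, 0.0) + epsilon
        divergence += p_c * math.log(p_c / q_c)
    return divergence

def to_distribution(commands):
    # Convert a command sequence into a relative-frequency distribution.
    counts = Counter(commands)
    total = sum(counts.values())
    return {cmd: n / total for cmd, n in counts.items()}

# Hypothetical example: a profile built from a user's past commands,
# compared against a newly observed block of commands.
profile = to_distribution(["ls", "cd", "ls", "vi", "make", "ls"])
test_block = to_distribution(["wget", "chmod", "wget", "ls", "nc"])

score = kl_divergence(test_block, profile)
print(f"KL divergence (test || profile): {score:.4f}")
# A larger score indicates command usage that deviates more from the
# user's profile, suggesting a possible masquerader.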