
Kullback-Leibler Divergence for Masquerade Detection

Journal of Applied Information Science

Volume 1 Issue 1

Published: 2013
Authors: Geetha Ranjini Viswanathan, Richard M. Low, Mark Stamp
Affiliation: Department of Computer Science, San Jose State University, San Jose, California, USA

Abstract

A masquerader is an attacker who gains access to a legitimate user's credentials and pretends to be that user so as to evade detection. Several statistical techniques have been applied to the masquerade detection problem, including hidden Markov models (HMM) and one-class naive Bayes (OCNB). In addition, Kullback-Leibler (KL) divergence has been used in an effort to improve detection rates. In this paper, we analyze masquerade detection techniques that employ HMMs, OCNB, and KL divergence. Detailed statistical analysis is provided to compare the effectiveness of these various approaches.
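To illustrate the KL divergence mentioned in the abstract, the following is a minimal sketch of how a divergence score might be computed between a user's historical command distribution and an observed session. The distributions and command names here are hypothetical, not taken from the paper; the paper's actual experimental setup should be consulted for how scores are derived and thresholded.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions,
    given as dicts mapping events to probabilities. Assumes q[x] > 0
    for every x with p[x] > 0 (e.g., via smoothing)."""
    return sum(p[x] * math.log(p[x] / q[x]) for x in p if p[x] > 0)

# Hypothetical command-usage frequencies: a stored user profile (P)
# versus an observed session (Q). A large divergence suggests the
# session deviates from the user's normal behavior.
profile = {"ls": 0.5, "cd": 0.3, "cat": 0.2}
session = {"ls": 0.2, "cd": 0.2, "cat": 0.6}

score = kl_divergence(profile, session)
```

In a detection setting, a session whose score exceeds some tuned threshold would be flagged as a possible masquerade; how that threshold is chosen, and how KL divergence is combined with HMM or OCNB scores, is the subject of the paper's analysis.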

Keywords: Masquerade Detection, Kullback-Leibler Divergence, One Class Naive Bayes, Hidden Markov Models, Intrusion Detection

