



Framework to Process High Frequency Trading Using Complex Event Processing

International Journal of Knowledge Based Computer Systems

Volume 5 Issue 1

Published: 2017
Author(s) Name: A. Acharya, N. S. Sidnal | Author(s) Affiliation: KLS GIT, Gogte Institute of Technology, Belagavi, Karnataka, India.


The financial services industry has always been a data-intensive industry. From insurance to capital markets, data has played a pivotal role in applications such as financial modeling, portfolio optimization, asset/liability matching, fraud detection and risk modeling. The big data revolution has provided many options for innovation and improved efficiency in this domain. At the same time, it has thrown up a new set of challenges that must be overcome for future growth and sustainability in the financial services industry. In recent times, the securities trading market has undergone dramatic changes, resulting in the growth of high velocity data. Velocity, one of the Vs of big data, presents a unique set of challenges to the capital markets. The traditional approach of using Business Intelligence (BI) no longer scales, especially with respect to the velocity of data. During the previous decade, most firms in the capital markets made significant investments in their ability to collect, store, manage and, to some extent, analyze large amounts of data. Building on the benefits offered by big data analytics, financial services firms are now able to provide highly personalized, real-time, location-based services rather than only the product-based services that were possible earlier. The rise of electronic trading and the availability of real-time stock prices and real-time currency trading make real-time risk analysis a necessity. Market participants who can analyze data in real time will be able to garner a disproportionate share of the available profit pool. The sheer volume of financial data, the high rate of data generation, and the heterogeneity of financial data make it difficult to capture, process and perform timely analysis of that data.
Traditional financial systems are not designed to cope with a wide variety of data, especially unstructured data from Twitter, news, social media, blogs, etc., which affect market dynamics in real time. Traditional data warehousing and BI techniques such as extract, transform and load (ETL) take a huge amount of time (often days) to process large volumes of data and are thus ill-suited to real-time analytics. This paper discusses the implications of the rise of big data, and especially of high velocity data, in the domain of High Frequency Trading (HFT), a growing niche of securities trading. We first take a brief look at the intricacies of HFT, including some of the strategies commonly used by HFT traders. The technological challenges in processing HFT data and responding to real-time changes in market conditions are also discussed. Some of the potential technological solutions to the issues thrown up by HFT are analyzed for their effectiveness in addressing the real-time performance requirements of HFT. We identify Complex Event Processing (CEP) as a candidate to address the HFT problem. The paper is divided into three parts: Part A deals with understanding HFT and the challenges it poses to technological processing; Part B looks at Complex Event Processing (CEP) and the types of problems it can be applied to; and Part C presents a framework to process HFT using techniques derived from CEP.
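The core idea behind CEP is to detect a higher-level "complex event" (for example, a sudden price spike) from a stream of low-level tick events. A minimal sketch of this pattern in plain Python is shown below; the event fields, window size and spike threshold are illustrative assumptions, and a production HFT system would rely on a dedicated CEP engine rather than an in-memory loop.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Tick:
    ts: int        # event timestamp in milliseconds (assumed field)
    symbol: str
    price: float

def detect_spikes(ticks, window_ms=1000, threshold=0.02):
    """Emit a complex event whenever a symbol's price rises by more than
    `threshold` (as a fraction) above its low within a sliding time window.

    The window size and threshold are hypothetical parameters chosen
    purely for illustration.
    """
    windows = {}   # symbol -> deque of ticks inside the current window
    alerts = []
    for t in ticks:
        w = windows.setdefault(t.symbol, deque())
        w.append(t)
        # evict ticks that have aged out of the sliding window
        while w and t.ts - w[0].ts > window_ms:
            w.popleft()
        low = min(x.price for x in w)
        if low > 0 and (t.price - low) / low > threshold:
            alerts.append((t.symbol, t.ts, low, t.price))
            w.clear()  # reset after firing so one spike yields one alert
    return alerts
```

The same pattern-over-a-stream logic is what a CEP engine expresses declaratively (match, window, aggregate, fire), which is what makes CEP a natural fit for the real-time detection requirements of HFT described above.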

Keywords: High frequency trading, Complex event processing, Big data processing.

