Managing the Complexity of Processing Financial Data at Scale -- an Experience Report
Sebastian Frischbier, Mario Paic, Alexander Echler, and Christian Roth
Papers from arXiv.org
Abstract:
Financial markets are extremely data-driven and heavily regulated. Participants rely on notifications about significant events and on background information that meets their requirements for timeliness, accuracy, and completeness. As one of Europe's leading providers of financial data and regulatory solutions, vwd processes a daily average of 18 billion notifications from 500+ data sources for 30 million symbols. Our large-scale, geo-distributed systems handle daily peak rates of 1+ million notifications/sec. In this paper, we give practical insights into the different types of complexity we face regarding the data we process, the systems we operate, and the regulatory constraints we must comply with. We describe the volume, variety, velocity, and veracity of the data we process, the infrastructure we operate, and the architecture we apply. We illustrate the load patterns created by trading and how the markets' attention to the Brexit vote and similar events stressed our systems.
Date: 2019-08
New Economics Papers: this item is included in nep-big
Downloads: http://arxiv.org/pdf/1908.03206 (application/pdf, latest version)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:1908.03206