
Trading Smarter: Leveraging Advanced Technology for Algorithmic Trading - Part I


High-frequency trading? Yes, please! This post discusses trends in algorithmic trading markets and the technologies available for visionary firms to make that quantum leap and emerge as an industry leader.


Low latency and advanced analytic technologies are the primary influences in today's capital markets. This is the first in a series of blog posts discussing trends in algorithmic trading markets and the technologies available for visionary firms to make that quantum leap and emerge as an industry leader: one that not only trades faster, but makes better trading decisions faster. In short, a leader that is fast and smart.

A Few Key Facts:

Fact 1

Algorithmic trading continues to capture the majority share of trading volume.


Fact 2

While HFT and passively managed, low-cost funds continue to generate the majority of the profits, even these sectors are experiencing margin pressures.


By any measure, HFT techniques are the dominant component in the current market structure and likely to affect nearly all aspects of its performance.

Profit Is King

The noticeable compression of profits is the result of several factors, the most obvious of which are the emergence of new low-cost passive funds and the growing number of firms engaged in low-latency, high-frequency trading. Amid the quest for ever-faster transactions, the cost of speed-trading infrastructure has risen significantly in recent years.

As a result of the margin squeeze, many firms are looking to reduce costs. “Cost per simulation” is the edict of the day: ultra-low-cost commodity hardware, open-source analytics, and even offshoring some of the quantitative research personnel. I suggest this delivers only a short-term benefit while substantially increasing the risk to the statistical validity of the quantitative models.

Need for Speed — Perhaps Not?

HFT is a subset of algorithmic trading, and the need for speed is unquestionable. According to one widely cited industry estimate, a one-millisecond advantage can be worth $100 million to a major brokerage firm.

In an effort to minimize delays caused by physical distance, trading firms vie for valuable real estate in exchange data centers to shave off microseconds. While speed is critical, saving an incremental millisecond can be very expensive, and the advantage may be short-lived. Combine the high expense of saving a few microseconds with the 350-microsecond speed bump proposed by IEX and endorsed by the SEC, and visionary trading firms will need to expand beyond low-latency, high-frequency trading alone.

Big Data: World of Possibilities

Expanding beyond traditional market and options data presents both opportunities and technological obstacles for traders. Big data solutions are used for datasets that are too large and complex to manipulate or interrogate with customary methods or tools. Everyone wants to use big data to identify and test new trading signals, reducing the development cycle time of new algorithmic trading strategies. Yet many firms have no idea how to ingest and interpret the vast array of disparate information sources available to them, much less how to derive useful insights from market data.
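
As a rough, hypothetical illustration of what "identify and test a new trading signal" can look like, the sketch below screens a simple moving-average crossover on simulated minute bars with pandas. The data, window lengths, and the signal itself are assumptions made for the example, not anything prescribed in this article.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Simulated one-minute price bars standing in for ingested market data (assumption).
prices = pd.Series(
    100 * np.exp(np.cumsum(rng.normal(0, 0.0005, 10_000))),
    index=pd.date_range("2017-01-02 09:30", periods=10_000, freq="min"),
    name="price",
)

# Candidate signal: fast/slow moving-average crossover (chosen purely for illustration).
fast = prices.rolling(20).mean()
slow = prices.rolling(120).mean()
signal = np.sign(fast - slow).shift(1)   # act on the next bar to avoid look-ahead bias

# Quick screen: does the signal show any edge worth a full backtest?
bar_returns = prices.pct_change()
signal_returns = (signal * bar_returns).dropna()
print("mean return per bar:", signal_returns.mean())
print("hit rate:", (signal_returns > 0).mean())
```

Even a crude screen like this, repeated across thousands of candidate signals, symbols, and alternative data sources, is exactly the workload that quickly outgrows customary tools.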

The size and speed of big data will continue to grow exponentially as more internet-enabled devices come online (think Internet of Things). Visionary firms that can inspect this data quickly, with minimal latency, should be able to develop a competitive advantage. Time is critical: as soon as you discover a new profitable algorithm, your window of opportunity to capitalize on it may be only weeks or months before other firms discover the same signal.

The latency associated with the variety, volume, and velocity of big data has led many firms to create job-specific analytic silos. As a result, we have seen the rise of a quantitative analytic jungle, with the output of one analytic process being ingested into yet another. The result is a firm with multiple, separate processes in which data is replicated and moved repeatedly. In short, you are adding latency and cost to the organization with no improvement in the confidence of your models.

Smarter Algorithmic Models

Today, we can exploit a variety of data sources and leverage new technologies to develop new trading strategies faster, identify new trading signals, and conduct more effective yet streamlined backtesting. I am speaking about the possibility of improving the profitability ratio by a few points. The profit potential could be immense, and firms may finally be able to create portfolios with sustainable alpha.
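
To make the "few points" claim concrete, here is a minimal, hypothetical sketch: it simulates a year of symmetric-payoff trades at two hit rates and compares the resulting P&L. The trade count, payoff size, and hit rates are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n_trades = 100_000   # trades per year for an active strategy (assumed)
payoff = 10.0        # average win/loss per trade in dollars (assumed, symmetric)

def annual_pnl(hit_rate: float) -> float:
    """Simulated P&L for n_trades symmetric-payoff trades at a given hit rate."""
    wins = rng.random(n_trades) < hit_rate
    return np.where(wins, payoff, -payoff).sum()

for hr in (0.51, 0.53):
    print(f"hit rate {hr:.0%}: simulated annual P&L ~ ${annual_pnl(hr):,.0f}")
```

Moving the hit rate from 51 to 53 percent roughly triples the expected edge per trade, which is why a modest improvement in model quality can matter more than another expensive round of shaving microseconds.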

Big data may present us with the opportunity to develop new, smarter models, yet most traditional technologies cannot scale and lack the bandwidth needed to take advantage of the opportunity. There are several components to creating smarter models. We will address one of the following technologies every two weeks in this “Trading Smarter” series; a brief machine-learning sketch follows the list as a small preview.

  • Addressing the I/O bottleneck
  • Developing a scalable infrastructure with lower cost per simulation
  • Addressing memory-centric analytics applications
  • The new memory-storage hierarchy for HPDA
  • Big data to fast data
  • Scalable infrastructure that facilitates development of a FinTech informatics pipeline
  • GPUs and compute accelerators
  • Machine learning
  • Deep learning with neural networks
  • Graph database technology to discover new relationships between disparate data sources faster with no ETL
  • High-performance parallel file systems
  • Agile and scalable analytic infrastructure based on 100 percent open-source technology
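
As a preview of the "machine learning" item above, here is a small, hypothetical sketch using scikit-learn: it engineers a few simple features from a simulated price series and evaluates a classifier on next-bar direction with a walk-forward split. The features, the model, and the simulated data are all assumptions for illustration, not a recommendation from this series.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

rng = np.random.default_rng(0)

# Simulated price series standing in for real market data (assumption for the sketch).
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.001, 5_000))))
returns = prices.pct_change()

# A few simple engineered features; real pipelines would draw on many more sources.
features = pd.DataFrame({
    "ret_1": returns,
    "ret_5": prices.pct_change(5),
    "vol_20": returns.rolling(20).std(),
    "mom_20": prices / prices.rolling(20).mean() - 1,
})

# Label: did the next bar close up? The final row has no "next bar" and is dropped.
next_ret = returns.shift(-1)
data = pd.concat([features, next_ret.rename("next_ret")], axis=1).dropna()
X = data[features.columns]
y = (data["next_ret"] > 0).astype(int)

# Walk-forward evaluation: each fold trains only on data that precedes its test window.
cv = TimeSeriesSplit(n_splits=5)
scores = cross_val_score(GradientBoostingClassifier(), X, y, cv=cv, scoring="accuracy")
print("out-of-sample accuracy per fold:", np.round(scores, 3))
```

On random-walk input the accuracy should hover around 50 percent, which is exactly the point: the hard part is not fitting a model but finding data and features that carry a genuine, persistent edge.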


Topics:
high frequency trading, supercomputing

Published at DZone with permission of Bob Gaines, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
