How to Implement Artificial Intelligence and Machine Learning Technology in Capital Markets

Machine learning and artificial intelligence are exciting buzzwords right now, used frequently, and perhaps carelessly at times, to describe a nebulous sort of innovation that an organization is exploring.

It is easy to get caught up in the excitement of modern trends and keywords rather than focusing on the applicability of a feature and how it can benefit your organization. Artificial intelligence and machine learning can provide enormous benefits when implemented in a disciplined and directed way; otherwise, organizations can be led down a wild and expensive rabbit hole.

Trying to force a machine learning use case on your organization can lead to confusion and high costs, so we've put together guidelines on what you will need before taking a deep dive.


Developing technology for the sake of technology, though admirable, may be ultimately misaligned with the end goals of most companies. Organizations like NASA or Google can sustain heavy research across a wide breadth of fields and reap benefits when future innovations that build upon that research eventually trickle back in.

Rocket science during the Sputnik era, for example, enabled NASA scientists to use satellites to track shifts in radio signals, eventually leading to the development of GPS. Most organizations, however, end up incurring high costs for limited gains when developing technologies with an intangible end goal.

Here are some direct use cases we've seen organizations implement, using the power of AI to great benefit in the capital markets space:

Full Depth Book Replay Visualization for Canadian Equities

Currently, many quantitative researchers still struggle to use machine learning to outperform existing trading models. The use case here isn't to boil the ocean with a costly, multi-year, black-box algorithm that automates trading on thousands of signals from diverse datasets. Many organizations may reach that point someday, but today incremental improvements deliver the most bang for the buck.

The machine learning use case for most organizations right now is as a support tool for traders. For example, quants with a point-in-time book reconstruction product can pull data through APIs and design deep learning algorithms to find predictors and trends, at other times throughout the day, that look similar to the one they are already examining. Quants can then efficiently cycle through these matched predictors to validate their intuition and refine their methodology for how to trade.

A deep learning algorithm built on a book replay solution can be trained on some combination of inputs: time on the NBBO, slope of executed prices over time, time-weighted average spread analytics, or traded value, to name a few ways to match predictors.
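Before reaching for deep learning, the matching idea can be sketched with plain feature vectors: summarize each replayed window of the book into the kinds of inputs listed above, then rank historical windows by distance to the window the quant is studying. This is a minimal illustration, not any vendor's API; the field layouts and function names are assumptions for the sketch.

```python
import numpy as np

def book_window_features(bids, asks, trades):
    """Summarize one time window of a replayed book as a feature vector.

    bids/asks: arrays of (timestamp, price, size) best-level quotes.
    trades:    array of (timestamp, price, size) executions.
    These layouts are illustrative, not a real book-replay schema.
    """
    spread = asks[:, 1] - bids[:, 1]
    dt = np.diff(bids[:, 0], append=bids[-1, 0])
    # time-weighted average spread over the window
    twa_spread = np.sum(spread * dt) / max(np.sum(dt), 1e-9)
    # notional value traded in the window
    traded_value = float(np.sum(trades[:, 1] * trades[:, 2]))
    # slope of executed prices over time (least-squares fit)
    slope = np.polyfit(trades[:, 0], trades[:, 1], 1)[0] if len(trades) > 1 else 0.0
    return np.array([twa_spread, traded_value, slope])

def most_similar_windows(query, candidates, k=3):
    """Rank candidate windows by normalized Euclidean distance to the query."""
    q = book_window_features(*query)
    feats = np.array([book_window_features(*c) for c in candidates])
    # normalize each feature so no single scale dominates the distance
    mu, sd = feats.mean(axis=0), feats.std(axis=0) + 1e-9
    dists = np.linalg.norm((feats - mu) / sd - (q - mu) / sd, axis=1)
    return np.argsort(dists)[:k]
```

A production system would replace the hand-built distance with a learned embedding, but the workflow is the same: featurize, match, and let the quant review the top-ranked windows.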


Regulators, trade surveillance groups, and compliance teams at banks have a responsibility to proactively detect instances of spoofing in the markets. Spoofing is a form of disruptive market manipulation that uses high-speed programs to feign interest in a financial instrument without actually following through on the trade.

Traditionally, a spoofing trader enters a large bid or ask order and then cancels it before it is filled. Other market participants, seeing higher liquidity on one side of the book, react accordingly, buying or selling shares without knowing this newfound liquidity is "fake". The resulting artificial market movement is what the spoofer capitalizes on after cancelling the initial order.

DXC Technology, a company formed in 2017 from the merger of Computer Sciences Corporation (CSC) and the Enterprise Services business of Hewlett Packard Enterprise, has built a multidimensional feature set that identifies spoofing, layering, non-compliant trades, and fraudulent transactions using machine-learning-based analytics. It identifies and logs fast spikes and cancellations of liquidity in the order book to achieve this.
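As a minimal illustration of the kind of signal such surveillance systems look for (not DXC's actual method), a rule-based pass over the order event stream can flag large orders that are cancelled shortly after entry. The event model and thresholds below are assumptions for the sketch.

```python
from dataclasses import dataclass

@dataclass
class OrderEvent:
    order_id: str
    ts: float        # event time in seconds
    action: str      # "new" or "cancel" (simplified event model)
    size: int        # order size in shares

def flag_fast_cancellations(events, min_size=5000, max_lifetime=0.5):
    """Flag order IDs whose large resting orders were cancelled quickly.

    This captures the core spoofing pattern described above: a burst of
    liquidity entered and withdrawn before it can trade. The size and
    lifetime thresholds are illustrative, not calibrated values.
    """
    entered = {}   # order_id -> (entry time, size)
    flagged = []
    for ev in sorted(events, key=lambda e: e.ts):
        if ev.action == "new":
            entered[ev.order_id] = (ev.ts, ev.size)
        elif ev.action == "cancel" and ev.order_id in entered:
            t0, size = entered.pop(ev.order_id)
            if size >= min_size and ev.ts - t0 <= max_lifetime:
                flagged.append(ev.order_id)
    return flagged
```

A real system layers machine learning on top of features like these, since fixed thresholds alone generate false positives on legitimate fast cancellations, but the flagged events show what the model is asked to score.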

To provide trade surveillance analytics, DXC uses a big data platform capable of efficient historical queries and near-real-time streaming on quality data.


In the first case, for quantitative researchers, big data processing is required to reconstruct a point-in-time order book and to train algorithms that find matched trends in it. In the second case, for compliance groups, data storage and near-real-time processing capabilities are required to produce complex trade surveillance analytics.

There is a common thread between these use cases: building useful machine learning implementations in the capital markets space requires a powerful foundation of data management.

*TickVault has been renamed to the GOLD Platform


