Make Shared Data the Cornerstone of your Architecture for FRTB
Calling all trading firms and banks… we’re here to help!
As the Fundamental Review of the Trading Book (FRTB) implementation deadline draws closer (December 31, 2019*, to be exact), it is important for banks to understand what technological and business strategies they need to change in order to meet compliance requirements. In this post, we provide a summary of what FRTB is, how it will impact you, and what we can do to help.
The good news is that Big Data is here to save the day and the even better news is that Big Data technology is what we specialize in!
WHAT IS FRTB?
Essentially, FRTB is a new set of regulations released by the Bank for International Settlements, meant to strengthen and standardize risk management methodologies and reporting requirements for financial institutions. In other words, regulators want to make sure that banks don’t screw up and cause another 2008 crisis. They felt the existing framework was inadequate and needed to be revisited.
IMPACT OF FRTB?
There are a multitude of changes designed to reinforce control over banks’ activities. We will highlight three of the most consequential changes that banks will have to adjust to before the FRTB deadline.
Moving away from VaR (value-at-risk): The new rules require banks to move from the commonly used value-at-risk calculation to an “Expected Shortfall” (ES) metric for calculating capital requirements. The annoying thing is that ES demands more challenging computations and a massive increase in the amount of data generated and collected. The primary reason for the change is to capture tail risk more effectively: where VaR only reports the loss at a given quantile, ES averages the losses beyond that quantile.
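To make the VaR-versus-ES difference concrete, here is a minimal historical-simulation sketch. The 97.5% confidence level is the one FRTB prescribes for ES; the P&L figures are made-up toy data, not real trading results:

```python
# Illustrative comparison of VaR and Expected Shortfall (ES) on a
# sample of daily P&L outcomes. FRTB replaces 99% VaR with 97.5% ES.
def var_and_es(pnl, confidence=0.975):
    """Historical-simulation VaR and ES, reported as positive losses."""
    losses = sorted((-x for x in pnl), reverse=True)  # largest losses first
    n_tail = max(1, int(len(losses) * (1 - confidence)))
    var = losses[n_tail - 1]            # loss at the quantile boundary
    es = sum(losses[:n_tail]) / n_tail  # average loss beyond the quantile
    return var, es

# 200 hypothetical daily P&L figures (deterministic toy data)
pnl = [((i * 37) % 101 - 50) * 1000 for i in range(200)]
var, es = var_and_es(pnl)
print(f"97.5% VaR: {var:,.0f}  97.5% ES: {es:,.0f}")
```

Note that ES is always at least as large as VaR at the same confidence level, since it averages the worst outcomes rather than stopping at the quantile boundary.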
Market liquidity horizons: Instead of the one-size-fits-all approach of the current framework, the new market liquidity metrics take differing levels of liquidity for various products into account. The liquidity horizons in FRTB range from 10 to 120 days, depending on the complexity of the asset type. Again, regulators want to ensure banks have enough ‘cushion money’ in case large amounts of assets become impaired. Validating liquidity horizons will require better ways to manage and analyze pricing data.
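The idea can be sketched in a few lines. The horizon buckets (10, 20, 40, 60, 120 days) come from the FRTB text, but the risk-factor mapping below is abbreviated and illustrative, not a reproduction of the full regulatory table, and the square-root-of-time scaling of a 10-day ES is a common simplification rather than the exact FRTB cascade formula:

```python
from math import sqrt

# Abbreviated, illustrative mapping of risk-factor categories to
# FRTB liquidity horizons (days); the full regulatory table is longer.
LIQUIDITY_HORIZONS = {
    "large_cap_equity_price": 10,
    "small_cap_equity_price": 20,
    "corporate_ig_credit_spread": 40,
    "corporate_hy_credit_spread": 60,
    "structured_credit_spread": 120,
}

def scale_es(es_10_day, horizon_days):
    """Square-root-of-time scaling of a 10-day ES to a longer horizon."""
    return es_10_day * sqrt(horizon_days / 10)

es_10 = 1_000_000  # hypothetical 10-day ES in dollars
for factor, lh in LIQUIDITY_HORIZONS.items():
    print(f"{factor:30s} {lh:4d}d  scaled ES = {scale_es(es_10, lh):,.0f}")
```

The point of the exercise: the same position attracts more capital the less liquid its risk factors are, which is exactly why pricing-data quality matters for validating the horizons.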
P&L attribution process: Trading desks will need to demonstrate that their risk models pass P&L attribution and backtesting requirements. Banks can use the accounting or front-office P&L, but they must compare that hypothetical P&L against the risk-theoretical P&L produced by their risk models in order to measure model quality.
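As a sketch, the comparison can be expressed as two test ratios on the “unexplained” P&L, following the original FRTB proposal (later consultation drafts moved to Spearman correlation and a Kolmogorov–Smirnov test, so treat the metrics as illustrative). The sample P&L figures are made up:

```python
# Illustrative P&L attribution check: front-office hypothetical P&L
# (HPL) versus the risk model's risk-theoretical P&L (RTPL).
from statistics import mean, pvariance

hpl  = [120, -80, 45, 200, -150, 60, -30, 90]   # hypothetical P&L (toy data)
rtpl = [110, -70, 50, 180, -140, 55, -25, 95]   # risk-theoretical P&L (toy data)

unexplained = [h - r for h, r in zip(hpl, rtpl)]

# Original FRTB proposal: mean of unexplained P&L relative to the
# standard deviation of HPL, and variance of unexplained P&L relative
# to the variance of HPL.
mean_ratio = mean(unexplained) / (pvariance(hpl) ** 0.5)
var_ratio = pvariance(unexplained) / pvariance(hpl)

print(f"mean ratio: {mean_ratio:.4f}  variance ratio: {var_ratio:.4f}")
```

A desk stays eligible for the internal model approach only while ratios like these remain inside the regulatory thresholds, which is why the comparison has to be run continuously rather than once.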
HOW CAN BIG DATA TECHNOLOGY HELP?
Now that we have established an understanding of what FRTB is and why it is such a cause for concern, let’s talk about solutions.
Big Data seems to be the answer to everything these days, and once again, it is! You don’t necessarily NEED a big data platform to meet the FRTB requirements, but having one will definitely save you a lot of stress, time, and money. Hadoop is especially effective for FRTB because it is scalable: Hadoop ecosystems can manage and consolidate massive amounts of structured and unstructured data, and can distribute processing and analysis to handle the complex computations while producing the FRTB compliance reports required for end-of-day and intraday trading analysis.
HOW CAN WE HELP?
TickVault*, our flagship platform, manages and consolidates tick history from various sources into a single repository. The ingested data is transformed, enriched, normalized, and made available for internal use by risk and compliance departments.
To make this solution even more enticing, TickVault* can be integrated with your existing risk system through its REST API or Python API. This allows you to extract all the asset classes needed for calculating the “Expected Shortfall” (ES) metric under either the internal model approach (IMA) or the standardized approach (SA).
Furthermore, all complex calculations required for FRTB can be facilitated by leveraging TickSmith’s Big Data platform tools and the Hadoop ecosystem. This avoids transferring huge volumes of data between your systems and improves your global SLA reporting. The final results can be pushed back into your risk system or displayed in a dashboard.
It is important to realize that TickSmith’s solution, the TickVault* platform, was built to cut costs and time. Trying to build or rebuild your own market data repository will not only be expensive but will also cut it close to the December 2019* deadline. We have a solution that integrates into your existing infrastructure.
*The deadline has been extended to January 2022.
*Our TickVault Platform is ever evolving and is now called the GOLD Platform