XBRL News about Auditchain, IASB and LEI

Here are the three most relevant developments in the world of structured reporting we became aware of in the course of last week. 1. Auditchain announces mainnet deployment of its staking contracts: Auditchain will deploy its Pacioli validating node staking contracts to Polygon Mainnet on March 7, 2022, as part of its preparations […]

The post XBRL News about Auditchain, IASB and LEI appeared first on Daily Fintech.

Corda powered SWIFT GPI Link could be a game-changer in global trade finance

In September, SWIFT, the inter-bank messaging firm, announced the successful proof of concept (PoC) of the “GPI Link” platform in collaboration with R3. The SWIFT Global Payments Innovation (GPI) platform had previously trialled Hyperledger without much luck. However, with R3’s growing network of corporates, the pilot seems to have gone better. The pilot also […]

The post Corda powered SWIFT GPI Link could be a game-changer in global trade finance appeared first on Daily Fintech.

Two live Blockchain use cases in Mutual Funds administration and four pilots

In the Blockchain world, everybody wants to be ‘the world’s first’. The term started as a must in white papers; now it is all over social media, with announcements about the world’s first tokenized equity, the world’s first STO, the world’s first regulated Crypto bank, the world’s first Initial Wallet Offering, the world’s first Regulated ATS […]

The post Two live Blockchain use cases in Mutual Funds administration and four pilots appeared first on Daily Fintech.

FCA pioneers digitising regulatory reporting using DLT and NLP

Too many TLAs (Three Letter Acronyms), I agree. Earlier this week the Financial Conduct Authority (FCA) published the results of a pilot programme called Digital Regulatory Reporting. It was an exploratory effort to understand the feasibility of using Distributed Ledger Technology (DLT) and Natural Language Processing (NLP) to automate regulatory reporting at scale.


Let me describe the regulatory reporting process that banks and regulators go through. That will help in understanding the challenges (and hence the opportunities) with regulatory reporting.

  1. Generally, on a pre-agreed date, the regulators release templates of the reports that banks need to provide them.
  2. Banks have an army of analysts going through these templates, documenting the data items required in the reports, and then mapping them to internal data systems.
  3. These analysts also work out how the bank’s internal data can be transformed to arrive at the report as the end result.
  4. These reports are then developed by the technology teams, and then submitted to the regulators after stringent testing of the infrastructure and the numbers.
  5. Every time the regulators change the structure of the report or the data required on it, the analysis and the build process have to be repeated.
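The mapping and transformation work in steps 2 and 3 can be sketched in code. This is a toy illustration only – the template items, system names and fields below are all invented, not taken from any real regulator or bank:

```javascript
// Hypothetical sketch of an analyst's template-to-internal-data mapping,
// written down as code. All names are invented for illustration.

// Step 1: the regulator's template, as a list of required data items.
const template = [
  { item: "totalRetailDeposits", unit: "GBP" },
  { item: "tier1Capital", unit: "GBP" },
];

// Step 2: the analyst's mapping from template items to internal systems.
const mapping = {
  totalRetailDeposits: { system: "coreBanking", field: "retail_dep_bal" },
  tier1Capital: { system: "financeLedger", field: "t1_cap" },
};

// Step 3: transforming internal data into the report figures.
function buildReport(internalData) {
  const report = {};
  for (const { item } of template) {
    const { system, field } = mapping[item];
    report[item] = internalData[system][field];
  }
  return report;
}

// Toy internal data pulled from two source systems.
const internalData = {
  coreBanking: { retail_dep_bal: 1200 },
  financeLedger: { t1_cap: 300 },
};

console.log(buildReport(internalData));
// { totalRetailDeposits: 1200, tier1Capital: 300 }
```

Step 5 is then easy to see: every change to the template or the mapping forces the whole analyse-map-build cycle to run again.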

I have greatly simplified the process; even so, it helps to identify areas where things could go wrong.

  1. Regulatory reporting requirements are often quite generic and high level. Interpreting them and breaking them down into terms that banks’ internal data experts and IT teams understand is quite a challenge, and often error-prone.
  2. Even if the interpretation is right, data quality in banks is so poor that analysts and data experts struggle to identify the right internal data.
  3. Banks’ systems and processes carry so much legacy that even the smallest change to these reports, once developed, takes a long time.
  4. Regulatory projects invariably have time and budget constraints, which means they are built with just one purpose – getting the reports out of the door. Functional scalability of the regulatory reporting system is not a priority for the decision makers in banks. So, when a new yet related reporting requirement comes in from the regulators, banks end up redoing the entire process.
  5. Manual involvement introduces errors, and firms often incur punitive regulatory fines if they get their reports wrong.
  6. From a regulator’s perspective, it is hard to make sure that the reports coming in from different banks contain the right data. There is no inter-bank verification of the data quality of the reports.

Now, to the exciting bits. The FCA conducted a pilot called “Digital Regulatory Reporting” with six banks: Barclays, Credit Suisse, Lloyds, Nationwide, NatWest and Santander. The pilot involved the following:

  1. Developing a prototype of a machine-executable reporting system – this would mitigate the risks of manual involvement.
  2. Defining a standardised set of financial data definitions across all banks, to ensure consistency and enable automation.
  3. Creating machine-executable regulation – a Domain Specific Language (DSL), a special set of semantics, was trialled to achieve this. The aim was to rewrite regulatory texts into stripped-down, structured, machine-readable formats. A small subset of the regulatory text was also converted to executable code based on this framework.
  4. Coding the logic of the regulation in JavaScript and executing it using DLT-based smart contracts.
  5. Using NLP to parse through regulatory texts and automatically populate databases that regulatory reports run on.
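To make items 3 and 4 concrete, here is a minimal sketch of what “machine-executable regulation” could look like: a made-up rule written in a structured, DSL-like form, plus a trivial “compiler” that turns it into executable JavaScript. The pilot’s actual DSL and code are not public here, so the rule, field names and thresholds are all assumptions:

```javascript
// Hypothetical sketch of machine-executable regulation.
// The rule below is invented: "a firm must file report LQ-1 when
// retail deposits exceed 1000 (GBP m)."

// Structured, machine-readable form of the rule (DSL-like):
const rule = {
  id: "LQ-1",
  subject: "retailDeposits",
  operator: ">",
  threshold: 1000,
};

// A trivial "compiler" from the structured rule to executable code,
// standing in for the DSL-to-code step the pilot trialled.
function compile(rule) {
  const ops = { ">": (a, b) => a > b, "<": (a, b) => a < b };
  return (firmData) => ({
    report: rule.id,
    required: ops[rule.operator](firmData[rule.subject], rule.threshold),
  });
}

const mustFileLQ1 = compile(rule);
console.log(mustFileLQ1({ retailDeposits: 1200 })); // { report: 'LQ-1', required: true }
console.log(mustFileLQ1({ retailDeposits: 800 }));  // { report: 'LQ-1', required: false }
```

In the pilot’s vision, logic like this would run inside DLT-based smart contracts rather than on a bank’s own servers.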

If the above streams of effort had been completely successful, we would have a world of regulators creating regulations using DSL standards. These would be automatically converted to machine-executable code and executed on a Blockchain using smart contracts. NLP algorithms would feed data into the reporting database, which would be ready with the data when the smart contracts were executed. On execution, the reports would be sent from the banks to the regulators in a standardised format.

This would have meant a few billion pounds in savings for UK banks. On average, UK banks spend £5 Billion per year on regulatory programmes. However, like most pilots, only part of the programme could be termed successful. Banks didn’t have the resources to complete all the above aspects of the pilot successfully. They identified the following drawbacks.

  1. Creating regulatory text in DSL, so that machines can automatically create and execute code, may not be scalable enough for the regulators. Also, if the generated code is defective, it would be hard to hold someone accountable for error-prone reports.
  2. NLP required a lot of human oversight to reach the desired level of accuracy in understanding regulatory texts. So human intervention is still required to convert regulation to code.
  3. Standardising data elements specific to a single regulator was not a viable option, and the costs involved in doing so are prohibitive.
  4. While the pilot had quite a few positive outcomes and learnings, moving from pilot to production would be expensive.

The pilot demonstrated that:

  1. A system where regulators could just change some parameters at their end and re-purpose a report would enable automated regulatory reporting.
  2. Centralising processes that banks currently carry out locally creates significant efficiencies.
  3. The time and cost of regulatory reporting change could be dramatically reduced.
  4. Using DLT could reduce the amount of data being transferred between parties and create a secure infrastructure.
  5. When data is standardised into machine-readable formats, ambiguity and the need for human interpretation are removed, effectively improving the quality of the data and the reports.

In a recent article on Robo-Regulators, I highlighted the possibilities of AI taking over the job of a regulator. That was perhaps more radical blue-sky thinking. However, using NLP and DLT to create automated regulatory reporting definitely sounds achievable. Will banks and the regulators be willing to take the next steps in moving to such a system? Watch this space.


Arunkumar Krishnakumar is a Venture Capital investor at Green Shores Capital focusing on Inclusion and a podcast host.

Get fresh daily insights from an amazing team of Fintech thought leaders around the world. Ride the Fintech wave by reading us daily in your email


Are Stock exchanges fast and efficient?


The Austrian school of economics view is that

Stock Exchanges are the fastest and most efficient data-processing large scale system that we humans have designed so far.

Stock exchanges need roughly 15 minutes of trading to determine the effect of a piece of news – political, scientific, ecological, societal etc. – on the prices of shares.

Whether and when this will change with DLT technology is up in the air. For now, we have old and powerful institutions running these data-processing systems, and it won’t be easy to steal their Cheese.

The Frankfurt Stock Exchange is over 400 years old, with a market cap putting it in 10th position globally[1]. The London Stock Exchange (LSE) and the New York Stock Exchange (NYSE) are both over 200 years old and rank 3rd and 1st respectively by market cap. Just a few blocks away from the front runner is Nasdaq, only 45 years old and ranked 2nd by market cap.

The 29-year-old Australian Securities Exchange (ASX), ranking 14th in size, is actually the bravest, in that it was the first to explore DLT technology for its settlement and post-trade activities. Digital Asset has been its partner since 2015, helping it design a replacement for its clearing system CHESS, which ASX actually owns (not the case for other stock exchanges). The full launch has been pushed out again, from 2020 to 2021.

The architecture of this system maintains the messaging-based interaction with its participants and does not require them to run a node on the network in order to participate.

“We are often told by many, including other market infrastructures, ‘You’re so brave that you’re going first, you’re using DLT’ — we actually genuinely consider it brave to embark upon a large transformation program and not adopt this technology,” said Cliff Richards[2], ASX’s executive general manager of Equity Post-Trade Services.

Nasdaq is the most active stock exchange, being involved in several different, albeit recent, DLT initiatives. In Spring 2016, in a post about Fintech in action on Western stock exchanges, I mentioned Linq, a private blockchain company focused on private securities issuance. Linq allowed unlisted private companies to represent their share ownership digitally and securely. Later, Linq and Chain[3], a blockchain services provider, used DLT to digitally register ownership of private shares.

In May 2017, Nasdaq partnered with CitiConnect for Blockchain and took Linq to the next level. They went through a seamless end-to-end transactional process for private company securities.  Payment and reconciliation magic via DLT.

In October 2018, Nasdaq also partnered with Microsoft’s Azure blockchain service[4], aiming to integrate it in order to improve buyer-seller matching and the management of delivery and payment. The key advantage they present is that this deployment will allow interoperability with customers using various blockchains.

What really caught my attention is Nasdaq’s use of DLT technology in its newswire services. It is starting to use smart contracts for time-sensitive data like corporate announcements, press releases, regulatory filings, etc., and the associated valuable metadata. Nasdaq seems to have filed for a patent around this – Nasdaq Gets Patent for Blockchain Newswire to Solve Audit Trail Gaps and Errors[5]!

For me, this latest use case can be big. Distributing meta-data through smart contracts and giving access to it on a pay-as-you-go basis, will be a huge business in financial markets and Nasdaq can dominate in this. If this then gets integrated into their market analytics business, then bingo.

[1] Data source from the Visual Capitalist as of April 2017 – Comparing the largest Stock exchanges

[2]Here’s what to expect from ASX’s blockchain-based CHESS replacement

[3] Chain was acquired by Stellar in Sep 2018

[4] Microsoft to Integrate Blockchain Offering Into Nasdaq Services Following New Partnership

[5] Nasdaq Wins Patent for Blockchain-Based Newswire Service

Efi Pylarinou is the founder of Efi Pylarinou Advisory and a Fintech/Blockchain influencer. 
