The importance of agile data architecture in financial services

Traditional financial organisations have been working for years to present a modern interface to customers, through digital transformation programmes, new apps and services, and back-office modernisation.

They have been obliged to modernise because of regulations such as PSD2, consumer expectations, and the rising tide of fintech startups that threaten their market share.

The raw material for this transformation is data. Financial organisations are major employers of data scientists and use the models they create to optimise their services, create new products, assess risk and reduce fraud. They also have a great deal of data at their disposal, and make greater use of streaming technologies than other sectors.

In view of the vast volumes of data flowing through financial organisations, the need for real-time information and the many regulations with which they must comply, the infrastructure that channels data to where it's needed is of paramount importance.

Unsurprisingly, given their inevitable technical debt, the quality of data infrastructure in these venerable financial organisations is a mixed bag, according to a recent Computing survey of 150 IT leaders, most of whom expressed only a middling degree of confidence in it.

Linked to confidence is the ability to make the best possible use of the data at their disposal. Again, while some financial organisations are well advanced in this regard, others are not. Fifty-three per cent make use of less than half of the available data, with a fifth utilising less than 30 per cent.

Blockers to the sort of data agility that most desire include integration issues, concerns about security, and the sheer complexity of their infrastructure – again, not surprising in a traditional, highly regulated sector.

Looking more closely at integration issues, financial organisations were much more likely than those in other sectors to cite poor data quality. A whopping 74 per cent of the financial services companies represented here raised it as a problem, compared with 49 per cent of companies as a whole. Data quality is an ongoing bugbear for finance precisely because it is such a heavily regulated sector, and much resource is expended in cleaning and validating data on arrival.

The issue of data quality is closely related to the second and third most frequently raised issues, namely data platforms and formatting. Third-party data can also be a source of trouble, as one respondent explained:

“Data quality is the main challenge as we rely on master data sent by external partners – we have no control over the quality of this data, and we regularly see issues where mandatory data elements are missing or incorrectly formatted. These cause issues in day-to-day operational situations and subsequently in the data warehouse.”
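
To make that failure mode concrete, here is a minimal sketch of the kind of on-arrival validation such records need. The field names and format rules (account_id, IBAN, settlement date) are illustrative assumptions for the example, not details drawn from the survey or from any respondent's systems.

```python
import re
from datetime import datetime

# Illustrative mandatory fields; a real schema would come from the partner contract.
MANDATORY_FIELDS = ("account_id", "iban", "settlement_date")
IBAN_PATTERN = re.compile(r"^[A-Z]{2}\d{2}[A-Z0-9]{11,30}$")

def validate_record(record: dict) -> list:
    """Return a list of data-quality issues for one record; empty means it passes."""
    issues = [f"missing mandatory field: {f}"
              for f in MANDATORY_FIELDS if not record.get(f)]
    iban = (record.get("iban") or "").replace(" ", "").upper()
    if iban and not IBAN_PATTERN.match(iban):
        issues.append("incorrectly formatted IBAN")
    date = record.get("settlement_date")
    if date:
        try:
            datetime.strptime(date, "%Y-%m-%d")
        except ValueError:
            issues.append("settlement_date not in YYYY-MM-DD format")
    return issues

incoming = [
    {"account_id": "A-1001", "iban": "GB82WEST12345698765432",
     "settlement_date": "2021-06-30"},
    {"account_id": "A-1002", "iban": "not-an-iban"},  # bad IBAN, missing date
]
for rec in incoming:
    problems = validate_record(rec)
    # Failing records would be quarantined for follow-up with the partner
    # rather than loaded straight into the warehouse.
    print(rec["account_id"], "OK" if not problems else problems)
```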

Integration issues also extended to the cloud, where IT leaders in finance felt they were behind the curve.

Taken together, these responses make a strong case for automating ETL (and/or ELT) to clean, deduplicate and transform data into the necessary format – preferably, in view of the ongoing skills shortage, via a tool with a simple UI and low-code functionality. These tools should be designed for the streaming era, rather than the batch processing of yesterday.
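
As a rough illustration of the difference, the sketch below applies cleaning and deduplication record by record as data arrives, rather than in a nightly batch. The transaction_id deduplication key and the normalisation rules are assumptions made for the example, not features of any particular product.

```python
def clean_stream(records):
    """Deduplicate and normalise records one at a time, as they arrive.

    Unlike a nightly batch job, each record is processed as soon as it is
    read, so downstream consumers see near-real-time data.
    """
    seen = set()  # in production this would be a bounded or persistent store
    for rec in records:
        key = rec.get("transaction_id")  # assumed deduplication key
        if not key or key in seen:
            continue  # drop duplicates and records missing the key
        seen.add(key)
        yield {
            "transaction_id": key,
            # normalise amounts to minor units (pence) to avoid float rounding
            "amount_minor": round(float(rec.get("amount", "0")) * 100),
            "currency": (rec.get("currency") or "GBP").upper(),
        }

feed = [
    {"transaction_id": "t1", "amount": "10.50", "currency": "gbp"},
    {"transaction_id": "t1", "amount": "10.50", "currency": "gbp"},  # duplicate
    {"transaction_id": "t2", "amount": "3.99"},  # currency defaulted
]
for out in clean_stream(feed):
    print(out)
```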

Data-driven strategies in financial services require agile data architectures and trusted data; ingestion, integration and preparation platforms need to align with those goals.

For the UK financial sector, the alternative is to fall further behind in our increasingly data-driven recovery, and risk eventually being outrun altogether.

Download the Computing research white paper, sponsored by Qlik: How Agile Data Architectures Will Power Recovery in Financial Services