Regulations have always demanded a lot from financial services firms, but in recent years compliance has become more complex: the number of regulations firms must comply with is growing; the burden of proof those regulations demand is rising; and regulators require ever more granular data.
Due to their very nature, financial services firms generate and store vast amounts of data. That data is held in a variety of disparate legacy and new systems and is generally only joined as needed to satisfy specific business outcomes. With data volumes growing rapidly, making sense of data from different systems is becoming an increasingly complex and time-consuming task.
To a certain extent it is understandable that, to date, most organisations have taken a reactive approach to regulation, setting up projects to meet the specific needs of each regulatory requirement (e.g. Stress Testing, IFRS 9, CRD IV, BCBS 239) as it has arisen. This siloed approach to data management, however, is fraught with danger:
- It results in inefficiencies for the submitting organisation in collating the required data, as no advantage is taken of extensive and robust approaches already implemented elsewhere in the organisation;
- It is difficult for the submitting organisation to be 100% confident that data submitted for one regulatory requirement is consistent with data submitted for other requirements;
- Where such inconsistencies are spotted by the regulator, there is the potential for reputational damage, financial penalties and, at the very least, a call on already stretched resources to explain the differences.
While regulators would argue that sufficient time is allowed between regulatory requirements being finalised and data submissions being required, the implementation process at organisations (translating the requirements and assessing how they apply to their own working practices; setting up projects to deliver against the requirements; producing, testing and supplying the required data submissions) often results in last-minute first submissions. It is also not unusual for those first submissions to depend on a number of manual processes.
Significantly, once an initial data submission has been made, organisations are very reluctant to change the mechanisms that produce subsequent refreshes of the same data, for fear of introducing discrepancies and then having to spend time explaining away the differences.
While organisations may perceive current regulatory requirements as onerous, they are in fact relatively light, as regulators currently accept data submissions in the form of management information reports rather than granular data. However, it will not be long before regulators have the ability to receive, hold and process individual data entities. Irrespective of the level at which data is required, it is fundamental that organisations be able to trust data quality and prove data lineage for each individual data item, whether that data is provided discretely, as an aggregate or as a component of a calculation.
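To make the lineage requirement concrete, the sketch below shows one way each value can carry its provenance through an aggregation, so an aggregated figure in a submission remains traceable back to its source records. This is a hypothetical illustration, not a prescribed implementation: the `DataItem` structure, the `aggregate` helper and the source references are our own inventions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataItem:
    """A single data value plus the lineage needed to prove its origin."""
    value: float
    lineage: tuple  # e.g. ("loans_db.exposures:row=1042",)

def aggregate(items):
    """Sum values while merging lineage, so the total stays traceable."""
    total = sum(item.value for item in items)
    combined = tuple(ref for item in items for ref in item.lineage)
    return DataItem(value=total, lineage=combined)

# Two exposures drawn from different source systems.
a = DataItem(12_500.0, ("loans_db.exposures:row=1042",))
b = DataItem(7_300.0, ("legacy_mainframe.facility:id=88-XA",))

total = aggregate([a, b])
print(total.value)    # 19800.0
print(total.lineage)  # every contributing source record is still identifiable
```

The same principle applies whether the lineage references point at database rows, file extracts or upstream calculations: the submitted number never loses sight of where it came from.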
A best practice approach to developing a data management strategy for regulatory compliance rests on five pillars: data integration, data quality, master data management, data governance and data analytics. Each pillar supports financial services firms in meeting regulatory objectives around information access, processing and storage; the data quality pillar, for example, can be expressed as executable validation rules, as sketched below.
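As a minimal illustration of the data quality pillar (field names and rules here are hypothetical, and this is a sketch rather than a production rules engine), checks can be declared once and applied uniformly to every record destined for a regulatory submission:

```python
# Minimal data quality checks: each rule is a name plus a predicate.
RULES = {
    "exposure_non_negative": lambda r: r.get("exposure", 0) >= 0,
    "currency_is_iso": lambda r: len(r.get("currency", "")) == 3,
    "counterparty_present": lambda r: bool(r.get("counterparty_id")),
}

def validate(record):
    """Return the names of all rules the record fails."""
    return [name for name, check in RULES.items() if not check(record)]

records = [
    {"exposure": 250_000.0, "currency": "GBP", "counterparty_id": "CP-001"},
    {"exposure": -50.0, "currency": "GB", "counterparty_id": ""},
]

for record in records:
    failures = validate(record)
    print("OK" if not failures else f"FAILED: {failures}")
```

Declaring rules in one place, rather than re-implementing them per regulatory project, is precisely what avoids the inconsistencies between submissions described above.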
Financial firms must transform their data management approach to unlock data from its silos, incorporate unstructured information from non-traditional sources, and integrate information whether it sits on-premises or in the cloud, and whether it arrives in batch or in real time.
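The sketch below illustrates that integration point in miniature (the source shapes and field names are assumptions for the example): records arriving from a nightly batch extract and from a real-time feed are normalised into one canonical shape before any downstream reporting or analytics.

```python
def from_batch(row):
    """Normalise a row from a nightly CSV extract (columns are assumed)."""
    return {"counterparty_id": row["cpty"], "exposure": float(row["amt"])}

def from_stream(event):
    """Normalise an event from a real-time feed (keys are assumed)."""
    return {"counterparty_id": event["counterpartyId"],
            "exposure": event["exposureAmount"]}

batch_rows = [{"cpty": "CP-001", "amt": "250000.00"}]
stream_events = [{"counterpartyId": "CP-002", "exposureAmount": 99_000.0}]

# Both sources converge on one canonical record shape.
canonical = ([from_batch(r) for r in batch_rows]
             + [from_stream(e) for e in stream_events])
print(canonical)
```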
The new battleground for meeting these varied regulations is best practice data management: it enables companies to manage the complete data lifecycle effectively and establishes a foundation for rapid and reliable compliance initiatives. In addition, a compliance foundation built on a solid data management strategy reduces the cost of managing new regulations and can result in a clear and competitive capital advantage.
As stated earlier, organisations are reluctant to change procedures and processes once initial data submissions have been made, almost irrespective of how inefficient those processes are at producing the required regulatory datasets. The good news is that, with experienced practitioners who understand both Credit Risk business functions and IT, it is possible to take an evolutionary approach to data management for regulatory compliance, resulting not only in more efficient, consistent and trusted data submissions but also in the foundations of more efficient, consistent and trusted data from which to gain commercial advantage.
To find out how to overcome the data management challenge, download our data management best practice guide.
Our industry-leading data management approach brings data sources and systems together and lets you cleanse, transform and shape data for all your business purposes, whether for Business Intelligence (BI), regulatory compliance or risk management.