Many banks are modernising their analytics estates. The irony is that they often end up rebuilding the same old problems: the very controls, costs, and limitations they set out to escape.
Modern platforms speed up the development of models for tasks such as credit risk and fraud. But then the whole process grinds to a halt. Why? Because the essential checks and approvals (governance, assurance, and risk management) still rely on slow, manual steps, such as assembling documentation and securing procedural sign-offs. The old way of working is simply carried over to the new technology.
The success of modernisation comes down to one thing: if governance isn't built directly into the execution process, your analytics transformation will just carry over old problems instead of creating a truly new approach.
This article draws on experience from regulated analytics modernisation programmes across credit risk, fraud and financial crime, with input from Alex, CEO of trac.
The problem no one planned for: the cost of regulated analytics

Expectations for regulated analytics have increased. Regulators now demand greater transparency, clearer explanations, and strong auditing for credit risk, fraud, and financial crime models. Companies must now prove the outcomes of their models, as well as show exactly how those results were created, managed, and reviewed.
What’s changed far less is how many analytics environments are actually run. Core tooling and operating models have often remained largely unchanged, even as regulatory demands have risen. This mismatch has left firms under pressure to evidence control in ways their platforms were never designed to support.
And the response to this has been largely manual. For instance, additional controls, documentation packs, offline checks and procedural approvals have been added around existing workflows. Each control may make sense on its own. Over time, however, they have created heavier processes and growing operational overhead.
The impact of this shows up most clearly in resourcing. Teams expand to manage controls, produce evidence and reconstruct model behaviour after the event. What is described as increased compliance effort is more accurately understood as technical debt from legacy analytics environments converted into ongoing operating cost.
Tooling has changed. Operating models have not
Across regulated analytics, most institutions now run mixed estates. SAS remains embedded in many credit and fraud environments, while Python is increasingly used for new development and model rebuilds. Modernisation rarely involves a clean transition from one platform to another.
Historically, SAS provided more than analytics capability. It came with an implicit operating model for regulated use cases, shaping how models were developed, promoted, controlled and evidenced. Governance processes and assurance practices grew around these assumptions over time.
Python-based environments are different. They offer flexibility, integration with enterprise stacks and faster development cycles, but they do not define how analytics should be governed once deployed. Decisions about control, assurance and evidencing sit outside the language itself.
Problems emerge when existing governance frameworks are carried forward unchanged. Whether models run in SAS, Python, or both, legacy assurance approaches are often reapplied without adjustment. Manual controls, procedural approvals and documentation-heavy processes persist, driving cost and complexity regardless of the underlying technology.

Where most modernisation programmes are getting stuck
As institutions move regulated analytics onto modern technology stacks, most programmes tend to follow one of two approaches:
The first is the adoption of modern data science platforms. These platforms offer strong development environments, scalable execution and mature CI/CD capabilities. Productivity improves, and alignment with enterprise technology strategies is usually good.
The second approach is to build bespoke execution environments. This often happens where existing platforms do not meet current governance or control requirements. Engineering teams respond by creating custom frameworks designed to support regulated use cases.
Despite their differences, both approaches frequently lead to similar outcomes. And in many cases, governance is enforced through familiar mechanisms:
- environment segregation
- manual approval steps
- extensive documentation and evidencing
- offline checks and sign-off processes
These controls mirror those used in legacy analytics processes. As a result, the operating model many institutions were aiming to move away from is reproduced on new infrastructure.
This has two consequences. Cost and complexity are carried forward into the next generation of platforms. And in some cases, the controls themselves are weaker than before, as manual processes struggle to keep pace with more dynamic analytics environments.
The operating model insight: governance has to move closer to execution
The problem is, traditional assurance frameworks were designed for a very different analytics environment. Models were treated as relatively static assets, reviewed periodically and controlled through human-mediated processes. Validation, sign-off and audit relied heavily on documentation produced alongside the model rather than evidence generated by its execution.
Those assumptions are increasingly misaligned with how analytics now operates. Models change more frequently, development cycles are shorter, and analytics is embedded more deeply into decisioning and operations. Under these conditions, governance that sits outside execution becomes harder to sustain.
Modern analytics environments require a different set of foundations:
- results that are repeatable by default
- versioning and evidencing generated as part of execution
- auditability that does not depend on reconstruction after the event
Essentially, this is a change in how analytics is governed across its lifecycle. Bringing governance closer to execution moves assurance from a manual, retrospective activity to something that is inherent in how analytics runs day to day.
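To make the idea concrete, here is a minimal sketch of what "versioning and evidencing generated as part of execution" could mean in practice. Everything in it is illustrative: `governed_run`, `pd_model` and the record fields are hypothetical names, not the API of any real platform, and a production implementation would persist records to a controlled store rather than return them in memory.

```python
import hashlib
import json
import time
import uuid

def governed_run(model_fn, params, model_version):
    """Execute a model function and capture an audit record as a side effect.

    The record is produced by the execution itself, so evidence never has to
    be reconstructed after the event.
    """
    run_id = str(uuid.uuid4())
    started = time.time()
    result = model_fn(**params)
    record = {
        "run_id": run_id,
        "model": model_fn.__name__,
        "model_version": model_version,
        "parameters": params,
        "started_at": started,
        "finished_at": time.time(),
        # Hash of parameters + version acts as a reproducibility fingerprint:
        # the same inputs and code version always hash to the same value.
        "fingerprint": hashlib.sha256(
            json.dumps({"params": params, "version": model_version},
                       sort_keys=True).encode()
        ).hexdigest(),
    }
    return result, record

# A trivial stand-in for a credit risk model, for illustration only.
def pd_model(balance, utilisation):
    return 0.02 + 0.05 * utilisation + 0.000001 * balance

score, evidence = governed_run(
    pd_model, {"balance": 12000, "utilisation": 0.4}, model_version="1.3.0"
)
```

The point of the sketch is not the wrapper itself but where the control sits: the audit trail is a by-product of running the model, not a separate manual task.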
Tip: For risk, validation and transformation leaders, this is the core of model modernisation. Technology choices are important, but without an operating model designed for modern analytics, the same pressures on cost, control and confidence will continue to surface.
What “good” starts to look like in practice
Rather than enforcing control through layers of manual process wrapped around analytics, governance is being built into how analytics is executed in production.
At the centre of this is the idea of a governed execution layer. Instead of constraining how models are authored upstream, or relying on manual processes to meet governance requirements, this approach embeds controls into how analytics is executed within the enterprise stack.
When governance is embedded at this point, several benefits follow:
- results are auditable and reproducible by default
- versioning, parameters and execution context are captured automatically
- documentation is generated as part of running analytics, rather than assembled afterwards
Of course, this reduces the need for manual controls, offline checks, and post-hoc reconstruction. Assurance becomes less about recreating what happened and more about relying on evidence produced as analytics executes.
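The "self-documenting" benefit can be sketched as follows: a human-readable evidence entry is rendered directly from a record captured at execution time, rather than written up afterwards. The function name `render_evidence` and the record fields are hypothetical and for illustration only.

```python
import json

def render_evidence(record):
    """Render one captured execution record as a human-readable evidence entry."""
    return "\n".join([
        f"Run {record['run_id']}: {record['model']} v{record['model_version']}",
        f"Parameters: {json.dumps(record['parameters'], sort_keys=True)}",
        f"Executed: {record['started_at']} -> {record['finished_at']}",
    ])

# An illustrative record as it might be captured at run time.
record = {
    "run_id": "7f3a",
    "model": "pd_model",
    "model_version": "1.3.0",
    "parameters": {"balance": 12000, "utilisation": 0.4},
    "started_at": "2024-05-01T09:00:00Z",
    "finished_at": "2024-05-01T09:00:02Z",
}
entry = render_evidence(record)
```

Because the entry is derived mechanically from execution records, it stays consistent with what actually ran, which is the property manual documentation packs struggle to guarantee.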
Crucially, it also aligns more closely with modern enterprise analytics environments. Modern analytics development can continue to evolve using standard tools and platforms, while regulated use cases benefit from consistent execution and built-in evidencing.
One example of this is trac, which operates as a governed execution layer that makes regulated analytics auditable, repeatable and self-documenting by design. Used in this way, technology supports the operating model rather than defining it, allowing institutions to modernise analytics without carrying forward the constraints of the legacy estate.

What this means for risk, MRM, and transformation leaders
The bottom line is, analytics modernisation can disappoint in highly regulated functions when the operating model that governs how models are used, assured and evidenced is simply ported onto a new platform or modelling language.
For risk and model risk management leaders, this creates a familiar tension. Development accelerates, but assurance frameworks remain static. Controls become harder to maintain, confidence erodes, and manual effort grows rather than falls.
Addressing this requires coordinated change. Risk, validation and technology teams need to modernise together, with governance designed to support how analytics now operates. It’s best to treat this as a redesign of the operating model that underpins regulated analytics, rather than a platform migration exercise.
Modernisation is an operating model decision
Most enterprise analytics platforms were designed for general analytical and operational use cases rather than regulated model governance. As a result, many firms modernising analytics still rely on manual controls, documentation and procedural assurance wrapped around the technology.
Firms should also think more critically about what the future operating model for regulated analytics needs to look like. If governance can be built more directly into analytics execution, technology can handle more of the control, audit and evidencing requirements that currently rely on manual processes.
Firms that change both the technology and the operating model are less likely to carry legacy cost and complexity into modern analytics environments.
If you’re reviewing how regulated analytics is governed as part of a model modernisation programme, we can help. Jaywing works with risk, validation and transformation teams to redesign operating models that deliver control without recreating legacy cost. Get in touch if you’d like to learn more.