16 May 2025

15 minutes with… Naureen Hussain, AI Governance Lead

With over 20 years of experience spanning law, data and business transformation, Naureen Hussain brings a rare depth of insight—most recently leading data governance and privacy strategy through the multi-billion pound Virgin Media O2 merger. At Jaywing, she’s helping clients design frameworks that embed accountability, support AI innovation, and build lasting confidence in data.

In this Q&A, she shares where firms are struggling, what “good” will look like in 2025, and how leaders can move beyond compliance to create real value.

Looking to the next 12–18 months, what emerging risks or blind spots should leaders be addressing now?

Focusing on AI: as with other transformative socio-technologies, there will come a time when the collective social consciousness wakes up to the harmful sides of AI, especially GenAI tools and agents. This is not to ignore or under-appreciate the immense benefits AI is bringing to society at large, as well as at an organisational level. However, the never-before-seen pace at which GenAI technology is being adopted puts trust at risk.

I predict society will go through a similar experience to social media—we’re only just now waking up to the significant harm these platforms expose people to, especially young and vulnerable individuals. Attitudes are shifting, and trust is no longer a given.

Yes, further legislation and compliance standards will come (the EU AI Act was just the first out of the blocks), but if organisations want to assure their future brand and the trust of their customers and stakeholders, they must not wait for legislation. Governance and practical risk management should be baked into AI strategies and programmes from the start.

You've seen how privacy, governance, and AI intersect in large organisations. Where do you think firms are over-engineering their efforts—and where are they underinvesting?

I’m an advocate for taking an agile approach to governance—this manifests in two ways.

Firstly, ensure you are governing the risks (and opportunities) that matter to your organisation, hence the need to start by defining your governance strategy. Many governance programmes lead to overwhelm and to delayed or invisible value realisation. Take a 'future-back' approach instead: work back from the endpoint to a minimum viable product, with a flexible but credible roadmap for how you reach it. Start small, focus on early value realisation (to keep stakeholders and budget holders on side), and move through fast iteration cycles.

Secondly, effective governance is about 'felt accountability'; it's the culture shift that makes the difference, not more process and documentation. Obviously, there are minimum records and process requirements, but so often I see an overload of bureaucracy and paperwork causing unhelpful friction, usually to compensate for unclear or unaccepted accountabilities and an insecure risk culture.

Investment should be balanced between tooling and process development (which tends to be the focus) and strategy and culture development (which tends to be an afterthought).

How should firms balance the tension between regulatory compliance and innovation, especially with new technologies like generative AI?

I do not see regulation and innovation as contrary concepts. Good regulation enables safe and sustainable innovation. We’ve seen with social media and other digital platforms what happens when technology and the organisations that shape it are under-regulated—we reach a point where the risks manifest, but regulation after the event is extremely challenging. The horse has already bolted.

In my view, there should be 'healthy' tension in any organisational system to ensure that optimal decisions are taken, risk and opportunity are balanced, and innovation is sustainable. To keep that tension healthy, rather than letting it become an unjustified blocker, governance frameworks need to be anchored in risk and value, with differentiated pathways for business teams to follow according to the risk and value profile of their product, project, or activity.

This takes effort, but the benefit over a one-size-fits-all approach is that business teams who understand why a certain set of controls is required, or why something sits outside acceptable risk thresholds, are much more likely to engage positively and focus on finding a design- or engineering-led solution rather than become frustrated with the governance process (or the governance teams).

What are you most looking forward to in your role at Jaywing?

Firstly, the opportunity to work with a group of super talented, friendly, and supportive people. I’ve worked with the Jaywing team (when I was client side) for a long while, and I’ve always enjoyed working with them—even in the most challenging of contexts, we’ve laughed our way through.

Beyond that, Jaywing has a stellar client portfolio, and I’m excited to work with those organisations, bringing my experience and skills to add value.

Final thoughts

From privacy and AI governance to culture, strategy and stakeholder alignment, Naureen's perspective is pragmatic and refreshingly honest: if firms want to unlock the full value of data, governance can't sit on the sidelines. It must be integrated, flexible, and led with purpose.

Want to connect with Naureen? Reach out to her on LinkedIn or find out more about how Jaywing can support your risk strategy here.

Stay ahead of risk news and insights. Sign up for our newsletter here to get expert insights straight to your inbox.