Strategic by Design: Using AI to Drive Business Decisions

07 Oct 2025


  • Event: Finance Forum 25
  • Date: 7 October 2025
  • Speaker: Rosemary Nunn, Founder and Managing Director, I&K the Innovation and Knowledge Agency
  • Estimated read time: 7-8 minutes

 


 

Quick summary

This session explored how finance leaders can approach AI as a strategic capability, not a technology project.

Rather than focusing on tools, Rosemary Nunn framed AI as a design challenge, one that starts with business questions, decision intent, and governance. The discussion emphasised that value comes when AI is aligned to organisational priorities, embedded into decision making, and shaped by ethical and operational constraints from the outset.

For finance leaders, the shift is from asking what AI can do to deciding where it should be trusted, where it should not be, and how its outputs are translated into better judgement.

 


 

AI strategy starts with intent, not technology

Rosemary began by positioning AI in plain terms. She made clear that AI is not a finance system, nor a reporting shortcut, but a decision support capability that must be designed around purpose.

The central risk she highlighted was organisations adopting AI because it is available, rather than because a decision problem has been clearly defined. Without clarity on intent, AI simply accelerates noise.

For finance leaders, this reframes the conversation. The starting point is not vendor selection or automation potential, but identifying which decisions genuinely benefit from prediction, pattern recognition, or scenario testing, and which still rely on human judgement and accountability.

 

Designing AI around decisions, not data

A key theme was the difference between data-driven and decision-led design. Rosemary argued that many AI initiatives fail because they begin with available data, rather than with the decision that needs to be improved.

From a finance perspective, this matters because the same dataset can support very different outcomes depending on how it is framed. Forecasting, risk assessment, and capital allocation all require different assumptions, tolerances, and confidence thresholds.

Designing AI around decisions forces earlier conversations about materiality, bias, and acceptable error. It also makes explicit who owns the outcome when AI informs, but does not make, a decision.

 

The role of governance in AI-driven decision making

Governance was treated as an enabler, not a brake. Rosemary described governance as the structure that allows organisations to trust AI outputs without over-relying on them.

This includes clarity on where AI is advisory versus determinative, how models are monitored over time, and how changes in context are handled. She stressed that AI systems reflect the conditions in which they were trained, and those conditions rarely stay static.

For finance leaders, this aligns closely with existing controls thinking. AI governance becomes an extension of risk management, not a separate technical discipline.

“If you cannot explain why a system reached an answer, you should be very careful how you use it.”
Rosemary Nunn, Founder and Managing Director, I&K the Innovation and Knowledge Agency

 

Ethics as a practical design constraint

Rather than treating ethics as abstract, Rosemary grounded it in operational reality. Ethical considerations shape which data is appropriate to use, how outputs are applied, and where human intervention remains essential.

She noted that ethical AI is less about perfect fairness and more about transparency, challenge, and proportionality. In practice, this means being clear about limitations, documenting assumptions, and creating feedback loops when outcomes do not match expectations.

For finance teams, this reinforces the importance of narrative. Numbers and models need explanation, especially when AI is involved in shaping recommendations to boards or executives.

 

Moving from reporting to decision support

Throughout the session, AI was framed as part of the broader shift from reporting to decision support. Automation can remove manual effort, but the strategic gain comes when finance uses AI to explore options, stress test assumptions, and surface early signals.

Rosemary cautioned that AI does not remove responsibility. Instead, it raises expectations around judgement. When insights are faster and richer, leaders are expected to ask better questions, not accept outputs at face value.

This places finance at the centre of AI-enabled decision making, acting as interpreter, challenger, and steward of how insight is used.

 

What good looks like: practical actions for finance leaders

This session translated into several concrete actions for senior finance teams working with AI strategy.

Questions to ask before adopting AI
  • Which specific decisions are we trying to improve, accelerate, or de-risk?
  • What level of confidence do we need from AI outputs for this decision?
  • Where must human judgement remain decisive, regardless of model accuracy?
  • How will we explain AI-informed recommendations to non-technical stakeholders?

Signals to watch inside the organisation
  • AI projects framed around tools rather than decision outcomes
  • Limited clarity on accountability when AI informs a poor decision
  • Models that perform well initially but degrade without monitoring
  • Teams treating AI outputs as answers rather than inputs

Pitfalls to avoid
  • Assuming more data automatically improves decisions
  • Treating ethics as a compliance task instead of a design principle
  • Deploying AI without clear ownership across finance and the business
  • Over-automating decisions that require context and judgement

What good looks like in practice

AI-driven decision making works when finance leaders define the questions, shape the guardrails, and translate outputs into insight. AI becomes a trusted component of the operating model, not a black box, supporting better conversations and more confident choices.

 

Conclusion: AI strategy is a leadership discipline

The session’s core message was that AI success is less about capability and more about design. Organisations that gain value treat AI as a strategic asset, shaped by intent, governance, and ethics.

For finance leaders, this represents an opportunity to lead. By anchoring AI in decision quality, not automation alone, finance can help the organisation move faster while staying in control, using AI to support judgement rather than replace it.

 


 

Speakers

Rosemary Nunn

Founder and Managing Director, I&K the Innovation and Knowledge Agency. An economist and innovation leader, Rosemary leads partnerships with business for the University of Southampton and advises organisations on strategy, knowledge exchange, and responsible innovation.

 
