I have sat in boardrooms that could not have been more different. Some were polished rooms with long tables and bottled water lined in neat rows. Others were held together by unstable connections and virtual screens, with tired faces scattered across time zones. Yet no matter the setting, the same truth kept surfacing: boards are not remembered for the policies they sign off or the budgets they approve. They are remembered for what they failed to see coming.
And today, nothing challenges boards more than artificial intelligence.
AI is no longer something that might arrive in the future. It is here, shaping how decisions are made, how services are delivered, and how people experience power. In the humanitarian world, it influences how crises are analysed, how risks are modelled, and even how funding is prioritised. Yet, in too many boardrooms, AI still appears as a line item under “digital transformation,” tucked between cybersecurity updates and data protection reviews.
AI is not a tool to be tucked away. It is an infrastructure that reshapes the entire ecosystem around us. It determines the flow of information, the speed of decisions, and the balance between opportunity and risk. Like electricity or the internet, it is not an option to ignore. It is the environment in which organisations now exist.
The first principles
When faced with a shift of this magnitude, governance has to return to first principles. It is not about the technology itself, but about the questions it forces us to ask.
What problem are we trying to solve?
Whose voices are shaping the design, and whose are excluded?
Who owns the data that feeds our models?
What lines must never be crossed, even if the system allows it?
These are not questions for developers alone. They are the questions that belong at the heart of the board table.
The temptation to move fast
Technology circles are built on the mantra of moving fast and breaking things. That energy has delivered innovation, but it has also left behind broken trust, unsafe systems, and unintended consequences that still ripple today.
In humanitarian and social impact work, moving fast without reflection can be devastating. But moving too slowly is also dangerous. Communities cannot wait for governance processes that take years to catch up.
This is the paradox for boards. We must allow executives to move with speed, to test, to adapt, to use tactical innovations that bring immediate value. But we must also hold the line on values, ensuring that dignity, accountability, and human agency are not casualties of progress.
Oversight is not enough
Too many boards are still trapped in a cycle of oversight alone. They review budgets, sign off policies, and receive quarterly updates. But AI requires more than oversight. It requires foresight.
Oversight asks whether the strategy was delivered. Foresight asks whether the landscape itself is changing and whether the organisation is prepared for what is coming next. Oversight measures. Foresight imagines.
Boards that understand this distinction become more than brakes. They become compasses. They slow down the rush long enough to ask questions that no one else can afford to pause for. And then they clear the path for the organisation to move forward, not blindly, but with intention.
Thinking in ecosystems
This shift requires boards to think beyond the organisation. AI is not a single product; it is a force that reshapes entire ecosystems. Regulations, infrastructures, partnerships, communities, and donors are all affected. Boards must see across these layers.
There is the regulatory layer, where compliance with GDPR and emerging AI laws determines legitimacy.
There is the ethical layer, where commitments to agency, diversity, and rights are tested daily.
There is the operational layer, where AI can improve efficiency but also create dependencies.
And there is the community layer, where affected people live with the consequences of data-driven decisions.
Boards that fail to take an ecosystem view will find themselves making narrow choices while the world shifts around them.
Building the infrastructure of trust
From my own experience leading investments in ERP systems, automation, and early AI applications, I learned that the technology itself is rarely the hardest part. The real challenge is building an infrastructure of trust.
I built frameworks around stewardship, ownership, and communities of practice. I chaired governance boards that brought together business process owners, developers, and end users. I saw how trust could turn resistance into adoption, and how the absence of trust could sink even the most sophisticated system.
This is where boards have an irreplaceable role. Technology can only succeed when it is trusted. And trust cannot be coded. It must be governed.
Quantum moments for boards
I sometimes call this a quantum moment for governance. Not because it involves physics, but because AI creates ripples that move in many directions at once.
A decision taken in a European boardroom about how data is stored can change the reality for a displaced family in the Middle East. A model designed in Silicon Valley can affect how aid is distributed in Lagos or Kabul. AI collapses distance. It makes boardroom choices more immediate, more consequential, and more complex.
Boards must be able to think in layers and in possibilities. They must hold the macro scale of ecosystems and regulations, while zooming in to the micro when ethics, privacy, or security are at stake. They must live in contradiction, balancing speed with reflection, innovation with values, ambition with humility.
The courage of foresight
None of this is easy. But boards are not meant to be easy places. They are meant to be places of courage.
Courage to ask the uncomfortable questions.
Courage to tell executives when ambition runs ahead of values.
Courage to imagine not only the future that technology promises, but also the futures it could destroy if left unchecked.
AI will not replace boards. But boards that ignore AI will drift into irrelevance. The organisations they serve will either stumble forward without direction, or worse, be directed by forces they never paused to understand.
The responsibility of boards in this moment is not only oversight. It is foresight. It is the ability to look ahead, to see beyond the quarterly agenda, and to anchor decision-making in first principles that do not bend with convenience.
The boards that succeed will be those that embrace this discipline of foresight. They will not fear AI, nor worship it. They will engage it with humility and imagination. They will remember that technology is powerful, but people are the purpose.
Because in the end, governance is not only about systems. It is about humanity.
Ali Al Mokdad