When it comes to artificial intelligence, the biggest risk for strategy leaders is not whether the technology is “ready” or “overhyped.” The real danger is allowing ourselves to be carried away by the pendulum swing between awe and dismissal. One moment AI is framed as an unstoppable force that will transform everything overnight; the next, it is written off as a trivial tool with more marketing than substance. Neither view provides a foundation for serious strategy.
Effective strategy has always required leaders to hold nuance, to work with uncertainty, and to discern meaning from signals that are often messy. AI is no exception. Progress in the field is neither linear nor uniform, and it often appears incremental until it suddenly reframes what is possible. This is why any AI strategy worth its name must resist the temptation to live at the extremes. It must instead be rooted in disciplined sense-making: the ability to observe, test, and adapt as new realities emerge.
For decades, technology roadmaps were guided by predictable curves like Moore’s Law. Leaders could model timeframes for capacity, scale, and cost with a degree of confidence. AI is now disrupting that comfort zone. Its trajectory is not bound by steady doubling of hardware performance alone but by new architectures, learning efficiencies, and emergent behaviors that move faster than traditional forecasting frameworks anticipate. This creates a strategic challenge: how do you design systems, investments, and organizational capabilities when the underlying pace of change no longer fits the old playbook?
The answer is not to freeze in skepticism or leap blindly into hype-driven adoption. Instead, strategy needs to focus on balance. This means designing AI initiatives with both flexibility and rigor. Build for exploration, but within guardrails that prevent distraction. Invest in pilots that generate learning, but tie them back to clear organizational priorities. Create governance models that are not static checklists but living frameworks, capable of evolving as the technology evolves. Above all, anchor AI strategy in the same principles that have guided enduring organizations through past technological shifts: clarity of purpose, adaptability in execution, and humility in decision making.
AI will not replace wisdom, but it will challenge how we arrive at it. Strategy leaders have the responsibility to shape that path consciously rather than reactively. Those who fall into extremes, whether as believers or skeptics, risk missing the opportunity to design strategies that are resilient, grounded, and future-facing. Those who embrace the messy middle, who are willing to continuously learn and adjust, will be better positioned to turn uncertainty into advantage.
AI strategy, in other words, is not about predicting the future perfectly. It is about preparing organizations to engage with the future wisely.
Manu Sharma
https://manusharma.ca

