Prediction Is Cheap, Strategy Is Key
At the beginning of 2025, the World Economic Forum hosted a conversation titled “The Day After AGI.” The speakers weren’t pundits or academic observers; they were the two men arguably closest to building Artificial General Intelligence: Demis Hassabis of Google DeepMind and Dario Amodei of Anthropic. If anyone could predict when AGI would arrive, it should be them.
The result? Amodei suggested that by 2026 or 2027, we could see an AI model capable of Nobel-level performance across multiple fields. He noted that within Anthropic, many engineers were already shifting from writing code to editing code generated by models, predicting that within 6 to 12 months, models could handle the vast majority of an engineer’s work end-to-end.
Hassabis took a different view. He estimated a 50% chance of systems achieving full human cognitive capabilities by the end of the century. He emphasized that while AI has made staggering progress in “verifiable” domains like coding and math, the challenges in the natural sciences are significantly higher. The pinnacle of scientific creativity—asking the right questions or formulating entirely new hypotheses—is a capability AI still distinctly lacks.
Same question. Same access to first-hand information. One says two years; the other says seventy-five.
This isn’t a margin of error. This is a fundamental divergence.
It reminds me of something a boss told me years ago when I worked in finance: “Prediction is cheap.” He meant that almost anyone can make a prediction, especially if they have no skin in the game. It costs nothing to say what might happen, and everyone can find ample reasons to justify their forecast.
But for most of us, what truly matters isn’t the prediction itself. It’s the decision—the strategy—we make in its wake. After all, we still have to eat; we still have to live. We cannot wait until the future is crystal clear before we act.
The Problem with Prediction
The issue with prediction isn’t that it’s often wrong—though it usually is. Even if Amodei or Hassabis turns out to be precisely right, for the vast majority of us, that knowledge changes nothing about our immediate reality. The real danger lies in the false certainty that prediction provides.
If you believe Amodei, you might panic and upend your career tomorrow. If you believe Hassabis, you might choose to stay the course and change nothing. But regardless of who you trust, you are placing all your eggs in one basket. That isn’t strategy; that is gambling.
Consider the stock market. If I predict the market will rise 20% next year, what use is that prediction? It tells me nothing about the volatility in between. What do I do if it crashes 10% in February? What if it skyrockets beyond my target in June? We cannot base our entire course of action on a single point forecast. This applies to investing, and it applies even more to the complex decisions of our lives and careers.
In The Black Swan, Nassim Nicholas Taleb points out that we live in “Extremistan”—a world where rare, unpredictable events (Black Swans) are systematically underestimated, yet are the very forces that define history. It’s not that we aren’t smart enough; it’s that complex systems are inherently unpredictable. When the two leading experts on AGI cannot agree, it is not a failure of their intellect. It is the nature of the problem.
The deeper question is this: Even if you had a perfect prediction, would you know how to act?
This explains why so many people appear busy—furiously learning new AI tools, chasing trending skills, signing up for every course. In the face of AI-driven change, this anxiety is understandable. But if you haven’t clarified what you actually want and how these actions help you get there, this busyness is merely “tactical diligence masking strategic laziness.”
Mastering every feature of ChatGPT without considering its long-term implications for your career; following every tool update without asking if it aligns with your goals; doing what everyone else is doing without wondering if it fits your specific situation.
These are actions. They are not strategy.
So, what should we do?
The Essence of Strategy
Strategy is not a prediction of the future, nor is it a to-do list. Strategy is a framework for how we make decisions and take action under varying conditions.
Strategic thinking begins with a simple but difficult question: What do I actually want?
This purpose doesn’t need to be grandiose, like “founding a unicorn” or “becoming a thought leader.” It can be conservative: “not being made obsolete.” It can be pragmatic: “maintaining my quality of life while keeping room for growth.” It can be a mix: “protecting my downside while keeping an open mind for upside.” The important thing is that you can articulate it.
Only when you have a purpose can you judge which strategy fits you. A founder seeking a breakthrough needs a completely different strategy than a professional with a family seeking stability. There is no “best” strategy, only the one that is best for you.
But purpose alone isn’t enough. We still face that fundamental problem: uncertainty.
We don’t know if AI will take two years or seventy-five. We don’t know which jobs will disappear or where new opportunities will arise. We might assign probabilities—30% scenario A, 60% scenario B—but how do we act on that?
This is the core difference between strategy and prediction: Strategy is not a single bet. It is a method for acting amidst uncertainty, tailored to your goals, resources, and constraints.
Fortunately, there are mental models to help us build such a strategy.
Nassim Nicholas Taleb offers a powerful concept in Antifragile: The goal shouldn’t just be to “withstand” uncertainty (robustness), but to actually benefit from it. He calls this “Antifragility.”
How do we achieve this? One method Taleb suggests is the “Barbell Strategy”:
- Avoid the “middle ground” of moderate risk/moderate reward.
- Instead, combine extreme safety with extreme risk-taking.
- Ensure that no matter which future arrives, you are prepared.
We can adapt this into a two-bucket framework for the AI era:
First, Follow the Consensus (Downside Protection). This ensures you don’t get left behind. But “consensus” here doesn’t mean sticking to the status quo or pretending AI won’t change anything. Quite the opposite. The current consensus is that AI will bring major changes and we need to adapt.
Following the consensus means learning the AI tools and skills that the market currently values, understanding the new rules of the game, and maintaining your baseline competitiveness. This is your safety net. It ensures that even if you don’t become a superstar, you won’t be obsolete. You move with the herd so the herd doesn’t trample you.
Second, Differentiated Investment (Upside Capture). If following consensus ensures you don’t lose, differentiated investment is how you win. This means dedicating a portion of your energy to explore areas that the mainstream undervalues, that schools aren’t teaching, but that you believe might be critical.
This requires “First Principles” thinking. Instead of looking at what others are doing, ask: What will truly be scarce in a world of abundant intelligence? It might be judgment, taste, the ability to ask better questions, or the capacity to connect disparate fields.
The key is that these investments should be small enough that failure is acceptable, but the potential upside is non-linear. While everyone learns to prompt, you might be learning how to curate. While everyone chases efficiency, you might be cultivating depth.
This approach doesn’t require you to predict the future. It prepares you for many futures.
In concrete terms, Taleb’s Barbell Strategy suggests putting, say, 85–90% of your assets into extremely safe investments and 10–15% into highly speculative ones. You avoid the “danger zone” in the middle, where the risk is hidden but the returns are capped.
You don’t have to follow those exact percentages. The point is to make a conscious choice based on:
- Your Risk Tolerance: What is the cost of failure?
- Your Time Horizon: How much time do you have to experiment?
- Your Goal: What are you optimizing for?
There is no standard answer. There is only your answer.
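As a toy illustration (not financial advice), the barbell logic can be sketched in a few lines of Python. The asset names, weights, and scenario returns below are invented numbers chosen purely to show the shape of the payoff, not a model of any real market:

```python
# Toy sketch: a barbell allocation vs. an all-in "middle ground" allocation.
# All returns below are hypothetical and for illustration only.

def portfolio_return(weights, scenario):
    """Weighted one-period return of a portfolio under one scenario."""
    return sum(w * scenario[asset] for asset, w in weights.items())

# Two invented futures: a quiet year, and a year with a positive Black Swan
# that rewards the speculative bet and punishes the "moderate" middle.
scenarios = {
    "quiet_year": {"safe": 0.02, "moderate": 0.06, "speculative": -0.50},
    "black_swan": {"safe": 0.02, "moderate": -0.40, "speculative": 4.00},
}

barbell = {"safe": 0.90, "speculative": 0.10}  # extreme safety + small wild bets
middle = {"moderate": 1.00}                    # all-in on moderate risk/reward

for name, scenario in scenarios.items():
    b = portfolio_return(barbell, scenario)
    m = portfolio_return(middle, scenario)
    print(f"{name}: barbell {b:+.1%}, middle {m:+.1%}")
```

With these made-up numbers, the barbell loses only a few percent in the quiet year (the speculative sliver is small enough to write off) but captures most of the rare upside in the Black Swan year, while the middle-ground portfolio does the reverse. That asymmetry is the point of the two-bucket framework: the consensus bucket caps your downside, and the differentiated bucket leaves you exposed to non-linear gains.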
An Antifragile Strategy Example
If you choose to adopt an “antifragile” approach, your strategy should have three characteristics:
- Sufficient Protection: Even in the worst-case scenario, you are not wiped out. You maintain relevance in the mainstream.
- Capacity to Capture Opportunity: If a “positive Black Swan” occurs, you are positioned to grab it. These opportunities come from your unconventional bets—your side projects, your exploration of niche fields, your unique combinations of skills.
- Adaptability: Strategy is not a one-time decision. Every 6 to 12 months, you re-evaluate. Has the probability of AGI changed? Are your experiments yielding results? Are there new signals? You adjust your allocation accordingly.
This is just one way to think. You might have a completely different goal—singular mastery, rapid prototyping, or network building—and thus a different strategy.
Returning to the WEF panel: Whether Amodei is right (two years) or Hassabis is right (seventy-five years), an antifragile strategy works:
- If it’s two years: You have your consensus preparation; you won’t be caught off guard.
- If it’s seventy-five years: You have your differentiated investments; you have long-term potential.
- If it’s somewhere in between: You can adjust as the reality unfolds.
This isn’t prediction. This is preparation. It’s not a bet; it’s an option.
Strategy Is How You Live
At the end of that WEF session, the moderator asked both men: “What is the most important thing to watch in the coming year?” Amodei pointed to the continued scaling of AI systems. Hassabis pointed to breakthroughs in world models and continual learning.
Two different answers. Two different focal points.
The key isn’t which prediction will come true. The key is whether you have built a strategy that can navigate either outcome.
Prediction is cheap. Anyone can do it. Everyone can find reasons to support their guess. But a prediction made without responsibility has little meaning for your life.
Strategy is how you live. It is finding a way to act in the face of the unknown. It is not betting on one future, but preparing for many. It is not seeking the comfort of false certainty, but building the capacity to handle whatever comes.
Stay stable within the consensus. Find opportunity at the edges.
But how do you know where to place those edge bets? How do you think from first principles about what the future truly needs?
These are questions worth exploring deeper.
For myself, over the last few years, I have maintained my competitiveness in my professional field (my consensus bucket), while dedicating 10–15% of my time to exploring life sciences—a field I knew nothing about. I don’t know if it will be as critical as I suspect. But I know that if it booms and I am unprepared, I will regret it. And if it doesn’t? I have simply spent a fraction of my time broadening my horizons.
I haven’t predicted the future. I have simply bought myself an option.
What will your option be? What will your strategy be?
I cannot answer that for you. But I hope this helps you start asking the right questions.