Arne Vanhoyweghen published this blog post with Cathy Macharis on Medium.
Imagine you’re faced with a gamble or a tough choice. Your gut tells you, “Don’t do it!” But the experts in decision science warn you not to trust this voice, as it can set you on a path toward irrationality. Daniel Kahneman has long taught us that our gut instincts (our intuitive System 1 thinking) often mislead us. According to him, our intuition is prone to biases that lead to systematic errors, especially in uncertain situations.
Kahneman’s research paints a picture of human decision-making that is brilliantly flawed: we rely on rules of thumb and gut feelings that can sabotage our decisions. But what if one of the key assumptions behind that idea were mistaken? What if intuition isn’t always the enemy of rational thinking, and we’ve been using the wrong measuring stick for rationality all along?
Enter ergodicity economics, a new lens on decision-making that just might turn some of Kahneman’s wisdom on its head. In this blog post, we’ll explore Kahneman’s core ideas on intuition and bias, unpack what ergodicity means, and see how this perspective challenges the way we think about “rational” decisions.
By the end, you might question whether some of your so-called irrational choices were actually wise all along.

What seems irrational in the short term may be essential for survival over time.
Kahneman’s Core Idea: Intuition and Biases in Uncertain Decisions
The work of Daniel Kahneman and Amos Tversky showed that people don’t always make decisions by carefully weighing probabilities and outcomes. Instead, we often rely on intuition and mental shortcuts, i.e., heuristics. This behavior stands in contrast to classical decision theory, which assumes that a rational person under uncertainty would calculate the expected value of each option and choose the one with the highest payoff.
This assumption, rooted in expected value calculations, was the foundation of early theories of rational choice. But it soon became clear that real human behavior didn’t follow this pattern. To reconcile this, economists proposed that individuals instead optimize expected utility, a subjective measure of how desirable outcomes are, not just how large they are. Kahneman’s work made it clear that even expected utility doesn’t fully explain human behavior. In real-world situations, people routinely rely on “System 1”, the brain’s fast and intuitive processing, which, while efficient, can lead to predictable biases.
Some of the most well-known biases are:
- Loss aversion: Losses hurt more than equivalent gains feel good. For example, losing $100 feels worse than winning $100 feels good.
- Probability weighting: We misperceive the likelihood of events. Small probabilities are often overweighted (e.g., fearing plane crashes), while near certainties may be treated as guaranteed.
These patterns lead us to systematically deviate from the “rational” decisions prescribed by expected utility theory. In Kahneman’s view, such deviations stem from cognitive biases and emotional reactions that distort our judgments.
But what if our intuition, shaped by real-world experience, is picking up on something that expected utility theory fundamentally misses? This brings us to the concept of ergodicity.
Ergodicity 101: One Average Is Not the Other
Ergodicity is a concept from mathematics and physics with serious implications for everyday decisions. In simple terms, an ergodic process is one where the long-term average for an individual equals the average across many individuals at a given point in time.
Let’s break that down:
- Ergodic: If lots of people each do something once, the group’s average result is the same as what one person would get by doing that thing many times. For example, if 600 people roll a fair six-sided die once, the average outcome is about 3.5. If one person rolls the same die 600 times, their average will also be about 3.5. Dice rolls are ergodic, meaning that your average over time matches that of the group (see the short simulation after this list).
- Non-ergodic: In this case, the group average can be a misleading indicator of what happens to an individual over time. A dramatic (but useful) example is Russian roulette. If 20 people each play once, most survive, and a few win the prize — the group average doesn’t look bad. But if one person plays 20 times, their long-term outcome is almost certainly fatal. For the group, the risk is diluted; for the individual, it accumulates. In non-ergodic cases, the expected value doesn’t reflect what will happen to you over time.
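To make the dice case concrete, here is a minimal Python sketch, using only the standard library, with an arbitrary seed and sample size chosen purely for illustration:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Ensemble average: 600 people each roll a fair die once.
ensemble = [random.randint(1, 6) for _ in range(600)]
print(f"Ensemble average (600 people, 1 roll each): {sum(ensemble) / len(ensemble):.2f}")

# Time average: one person rolls the same die 600 times.
lifetime = [random.randint(1, 6) for _ in range(600)]
print(f"Time average (1 person, 600 rolls): {sum(lifetime) / len(lifetime):.2f}")
```

Both numbers land near 3.5, as ergodicity predicts. For a non-ergodic process like repeated Russian roulette, no such agreement holds: the group average hides what repetition does to the individual.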
Many real-life decisions are non-ergodic: your outcome over time matters more than the average across a group. Yet traditional economic thinking often treats all situations as if they were ergodic, assuming that maximizing expected value leads to the best outcome for the individual. Ergodicity economics instead focuses on maximizing your time average and asks:
What if we judged decisions based on how they affect an individual’s wealth, health, or well-being over time, not just on the ensemble average?
From this view, avoiding risks that could ruin you isn’t irrational; it’s essential for long-term survival. Take insurance, for example. Statistically, buying home insurance has a negative expected value: the insurance company makes money, which means you lose money on average. So why do so many people willingly buy it? Kahneman might say it’s due to loss aversion, which makes us overweight rare, scary events like house fires. Ergodicity economics offers an alternative explanation: the average outcome across individuals isn’t what matters. What matters is that a house fire could ruin you personally. In that case, insurance allows you to recover and keep going. So, even if insurance is always a net loss in terms of expected value, it can be a net gain in terms of time averages, because it lets you stay in the game and keep growing rather than risk ruin for small savings. Of course, not all insurance makes sense from the perspective of ergodicity economics. Insurance is only worthwhile when it meaningfully improves your long-term outcome, such as by protecting against ruin or major setbacks.
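A toy simulation makes this tangible. Every number below is hypothetical, chosen purely for illustration: wealth grows 5% per year, each year carries a 1% chance of a fire that destroys 90% of your wealth, and full insurance costs a 2% annual premium, deliberately more than the roughly 0.9% expected loss, so the policy has a negative expected value:

```python
import math
import random

random.seed(1)  # fixed seed for a reproducible illustration

def time_avg_log_growth(insured: bool, years: int = 100_000) -> float:
    """Average log-growth per year for a single individual."""
    total_log = 0.0
    for _ in range(years):
        factor = 1.05                  # hypothetical 5% baseline annual growth
        if insured:
            factor *= 0.98             # pay the 2% premium; fire damage is covered
        elif random.random() < 0.01:   # 1% chance of an uninsured fire...
            factor *= 0.10             # ...which wipes out 90% of wealth
        total_log += math.log(factor)
    return total_log / years

print(f"Uninsured time-average growth: {time_avg_log_growth(False):+.2%} per year")
print(f"Insured time-average growth:   {time_avg_log_growth(True):+.2%} per year")
```

In this toy setup, the insured path grows at roughly +2.9% per year versus +2.6% uninsured: the premium is a guaranteed small loss in expected-value terms, yet it raises the time-average growth rate by removing the rare event that destroys compounding.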
Expected Value vs. Time Average: A Numerical Example
Economists love using a good gamble as a thought experiment to make abstract ideas more concrete. So, consider the following scenario — a simple coin flip that highlights the difference between expected-value thinking and time-average reasoning.
You start with $100, and we flip a fair coin:
- If it lands heads, you gain 50% → you now have $150.
- If it lands tails, you lose 40% → you now have $60.
At first glance, the bet looks attractive. Using the expected value, the average outcome is straightforward: 0.5 × $150 + 0.5 × $60 = $105, i.e., a 5% gain on your $100.
From a classical economic perspective, this is a favorable bet. Therefore, if you hesitate, behavioral economics might say you’re falling into the trap of loss aversion, irrationally overvaluing the fear of losing $40 over the satisfaction of gaining $50. But ergodicity economics invites you to ask a different question.
What happens if you take this bet repeatedly over time?
When you examine how your outcomes evolve over time, you’ll quickly notice that the expected value is misleading because this gamble is non-ergodic. But why is this gamble non-ergodic, even though there is no catastrophic ruin, as in Russian roulette or house fires? The key is that the gamble is multiplicative: your outcomes compound. You’re not gaining or losing fixed amounts but percentages of your current wealth. Each result sets the stage for the next. This is path dependency in action: what happens next depends on what happened before, just as losing in Russian roulette determines whether you even get another round. And it’s precisely that path dependency that changes everything: it makes the time-average behavior very different from the ensemble average (the one-shot expected value).
As an example, consider the most likely outcome over two flips of a fair coin: one win and one loss, in either order:
- Win, then lose: $100 → $150 → $90
- Lose, then win: $100 → $60 → $90
Either way, you’re down 10%, even though the average outcome per round looked positive.
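Scaling this up, a short Python sketch (standard library only; the player counts, round count, and seed are arbitrary illustration choices) shows both averages side by side:

```python
import random

random.seed(7)  # fixed seed for a reproducible illustration

def play(wealth: float) -> float:
    """One round of the gamble: +50% on heads, -40% on tails."""
    return wealth * (1.5 if random.random() < 0.5 else 0.6)

# Ensemble average: 100,000 people each play once, starting from $100.
one_shot = [play(100.0) for _ in range(100_000)]
print(f"Ensemble average after one round: ${sum(one_shot) / len(one_shot):.2f}")

# Time average: one person plays 1,000 rounds in a row.
wealth = 100.0
for _ in range(1_000):
    wealth = play(wealth)
print(f"One player's wealth after 1,000 rounds: ${wealth:.2e}")
```

The crowd of one-shot players averages close to $105, matching the expected value, while the lone repeat player’s wealth collapses toward zero.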
So, while the expected value suggests a +5% gain, the time-average growth rate is negative: each win-and-lose pair multiplies your wealth by 1.5 × 0.6 = 0.9, a per-round factor of √0.9 ≈ 0.95, i.e., roughly a 5% loss per round. The bet looks good on paper but erodes your wealth in practice. This is a classic example of a non-ergodic process. From this perspective, refusing the bet isn’t irrational at all. It’s not an emotional overreaction or a failure of logic; it’s a smart, intuitive rejection of a strategy that would harm you in the long run. What looks like “bias” under one model turns out to be rational foresight under another.
Rethinking Intuition: A New Perspective on Rational Decision-Making
Viewing decisions through the lens of non-ergodicity provides a fresh perspective on what we often dismiss as bias. It suggests that our intuitive aversion to certain risks, and our apparent “biases”, might not simply be cognitive flaws to fix, but sensible adaptations to the time-sensitive reality we live in. In Kahneman’s framework, when you decline a positive-expected-value gamble or buy insurance for peace of mind, you might be branded as irrational or biased by loss aversion. But from the ergodicity economics standpoint, you’re recognizing, perhaps subconsciously, that your outcome over time is what matters, not the expected outcome. In other words, maybe people are more rational than Kahneman gave us credit for; we’re just being rational in a dynamic, real-world sense. When faced with an uncertain choice, our gut often asks, “What will this probably mean for me down the road?” rather than “What will happen to the average person?” If the long-term consequences look risky, even when the immediate odds look favorable, our hesitation might not be a mistake. It may reflect an intuitive survival strategy suited for a non-ergodic world, where losing once can permanently change your path.
That’s not to say Kahneman’s insights are wrong or irrelevant. Cognitive biases are real, and we do make errors. The point is nuance: some decisions that look irrational under the expected-value lens make perfect sense once we account for time dynamics and individual exposure to risk. Ergodicity economics helps bridge the gap between psychological insight and the real-world behavior of systems that evolve over time. Rather than always blaming our intuition for being myopic or foolish, it asks whether the context (the environment in which the decision is made) is non-ergodic; if it is, the decision criteria should change.