Thinking in Bets is a book about decision-making under uncertainty, written by a poker player.
My impression of Thinking in Bets is mixed. On the one hand, the book covers a fascinating topic, and it was interesting to see how a poker player makes decisions. On the other hand, I disliked the writing style; there was too much fluff for my taste.
Introduction: Why This Isn't a Poker Book
Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.
Life Is Poker, Not Chess
When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn't turn out well in the short run.
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. When we say "I should have known that would happen" or "I should have seen it coming", we are succumbing to hindsight bias. Those beliefs develop from an overly tight connection between outcomes and decisions. That is typical of how we evaluate our past decisions.
We link results with decisions even though it is easy to point out indisputable examples where the relationship between decisions and results isn't so perfectly correlated. No sober person thinks getting home safely after driving drunk reflects a good decision or good driving ability. Changing future decisions based on that lucky result is dangerous and unheard of (unless you are reasoning this out while drunk and obviously deluding yourself).
When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.
Poker [...] is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don't know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.
If we want to improve in any game – as well as in any aspect of our lives – we have to learn from the results of our decisions. The quality of our lives is the sum of decision quality plus luck.
Making better decisions starts with understanding this: uncertainty can work a lot of mischief.
Our lives are too short to collect enough data from our own experience to make it easy to dig down into decision quality from the small set of results we experience. If we buy a house, fix it up a little, and sell it three years later for 50% more than we paid, does that mean we are smart at buying and selling property, or at fixing up houses? It could, but it could also mean there was a big upward trend in the market and buying almost any piece of property would have made just as much money. Or maybe buying that same house and not fixing it up at all might have resulted in the same (or even better) profit.
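The house-flipping point can be made concrete with a quick simulation (the 5% skill edge and the market-noise level below are made-up numbers, not anything from the book): each result is decision quality plus luck, and with only a few deals the noise swamps the edge, so the unskilled flipper can easily come out ahead.

```python
import random

random.seed(0)

def outcome(skill_edge, luck_sd=0.30):
    """One result = decision quality (skill edge) + luck (random noise)."""
    return skill_edge + random.gauss(0, luck_sd)

# A skilled flipper earns an assumed 5% edge per deal; an unskilled one, 0%.
skilled = [outcome(0.05) for _ in range(3)]    # only three deals: tiny sample
unskilled = [outcome(0.00) for _ in range(3)]
print(sum(skilled) / 3, sum(unskilled) / 3)    # noisy; may rank either way

# Over many repetitions, the edge finally separates from the luck.
many_skilled = sum(outcome(0.05) for _ in range(10_000)) / 10_000
many_unskilled = sum(outcome(0.00) for _ in range(10_000)) / 10_000
print(many_skilled, many_unskilled)            # edge emerges only at scale
```

A lifetime of decisions looks more like the three-sample case than the ten-thousand-sample one, which is why digging decision quality out of our own results is so hard.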
We get only one try at any given decision [...] and that puts great pressure on us to feel we have to be certain before acting, a certainty that necessarily will overlook the influences of hidden information and luck.
We are discouraged from saying "I don't know" or "I'm not sure". We regard those expressions as vague, unhelpful, and even evasive. But getting comfortable with "I'm not sure" is a vital step to being a better decision-maker. We have to make peace with not knowing.
Embracing "I'm not sure" is difficult. We are trained in school that saying "I don't know" is a bad thing. Not knowing in school is considered a failure of learning. Write "I don't know" as an answer on a test and your answer will be marked wrong. Admitting that we don't know has an undeservedly bad reputation. Of course, we want to encourage acquiring knowledge, but the first step is understanding what we don't know.
What good poker players and good decision-makers have in common is their comfort with the world being an uncertain and unpredictable place. They understand that they can almost never know exactly how something will turn out. They embrace that uncertainty and, instead of focusing on being sure, they try to figure out how unsure they are, making their best guess at the chances that different outcomes will occur. The accuracy of those guesses will depend on how much information they have and how experienced they are at making such bets.
When we think in advance about the chances of alternative outcomes and make a decision based on those chances, it doesn't automatically make us wrong when things don't work out. It just means that one event in a set of possible futures occurred.
When we think probabilistically, we are less likely to use adverse results alone as proof that we made a decision error, because we recognize the possibility that the decision might have been good but luck and/or incomplete information (and a sample size of one) intervened.
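The "sample size of one" point is easy to see in a short sketch (the 60% win probability is an assumption for illustration): a genuinely good decision still loses often, so a single adverse result is weak evidence of a decision error.

```python
import random

random.seed(1)
WIN_PROB = 0.60  # assumption: the decision is genuinely good 60% of the time

# One try at the decision: a loss here proves nothing about its quality.
single_result = random.random() < WIN_PROB

# Repeating the same good decision many times reveals its true quality.
trials = 100_000
losses = sum(random.random() >= WIN_PROB for _ in range(trials))
print(losses / trials)  # roughly 0.40: the good decision fails 40% of the time
```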
For most of our decisions, there will be a lot of space between unequivocal "right" and "wrong".
[...] every decision has risks, regardless of whether we acknowledge them.
[...] whenever we choose an alternative [...], we are automatically rejecting every other possible choice. All those rejected alternatives are paths to possible futures where things could be better or worse than the path we chose. There is potential opportunity cost in any choice we forgo.
By treating decisions as bets, poker players explicitly recognize that they are deciding on alternative futures, each with benefits and risks. They also recognize there are no simple answers. Some things are unknown or unknowable.
The betting elements of decisions – choice, probability, risk, etc. – are more obvious in some situations than others. Investments are clearly bets. A decision about a stock (buy, don't buy, sell, hold, not to mention esoteric investment options) involves a choice about the best use of financial resources. Incomplete information and factors outside of our control make all our investment choices uncertain. We evaluate what we can, figure out what we think will maximize our investment money, and execute. Deciding not to invest or not to sell a stock, likewise, is a bet.
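The stock decision can be written down as an expected-value comparison (the scenario probabilities, payoffs, and the risk-free alternative below are invented for illustration, not investment advice): weight each possible future by its chance, then compare the bet against the alternative of not investing, which is itself a bet.

```python
# Hypothetical scenarios: (name, probability, return). Assumed numbers.
scenarios = [
    ("market rallies", 0.30, +0.25),
    ("market flat",    0.50, +0.02),
    ("market drops",   0.20, -0.30),
]

def expected_return(scenarios):
    """Probability-weighted average payoff across the imagined futures."""
    return sum(p * payoff for _, p, payoff in scenarios)

buy_ev = expected_return(scenarios)
cash_ev = 0.04  # assumed return of the alternative: not investing at all
print(buy_ev, cash_ev, "buy" if buy_ev > cash_ev else "hold cash")
```

Nothing here removes the uncertainty; it just makes explicit the choice between alternative futures that the passage describes.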
In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing.
Whenever we make a choice, we are betting on a potential future. We are betting that the future version of us that results from the decisions we make will be better off. At stake in a decision is that the return to us (measured in money, time, happiness, health, or whatever we value in that circumstance) will be greater than what we are giving up by betting against the other alternative future versions of us.
How can we be sure that we are choosing the alternative that is best for us? What if another alternative would bring us more happiness, satisfaction, or money? The answer, of course, is we can't be sure. Things outside our control (luck) can influence the result. The futures we imagine are merely possible. They haven't happened yet. We can only make our best guess, given what we know and don't know, at what the future will look like.
When we decide, we are betting whatever we value (happiness, success, satisfaction, money, time, reputation, etc.) on one of a set of possible and uncertain futures. That is where the risk is.
Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning. The way we process new information is driven by the beliefs we hold, strengthening them. Those strengthened beliefs then drive how we process further information, and so on.
When someone challenges us to bet on a belief, signaling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us.
We can train ourselves to view the world through the lens of "Wanna bet?". Once we start doing that, we are more likely to recognize that there is always a degree of uncertainty, that we are generally less sure than we thought we were, that practically nothing is black and white, 0% or 100%.
Incorporating percentages or ranges of alternatives into the expression of our beliefs means that our personal narrative no longer hinges on whether we were wrong or right but on how well we incorporate new information to adjust the estimate of how accurate our beliefs are. There is no sin in finding out there is evidence that contradicts what we believe. The only sin is in not using that evidence as objectively as possible to refine that belief going forward.
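Expressing a belief as a percentage and adjusting it with new information is, at bottom, Bayes' rule. A minimal sketch (all the numbers are assumptions for illustration) shows how contradicting evidence lowers confidence without flipping it to 0%:

```python
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: revise confidence in a belief after seeing evidence."""
    numer = prior * p_evidence_if_true
    denom = numer + (1 - prior) * p_evidence_if_false
    return numer / denom

# Start 70% sure of a belief; evidence arrives that is twice as likely
# to show up if the belief is false (0.40) as if it is true (0.20).
belief = 0.70
belief = update(belief, 0.20, 0.40)
print(round(belief, 2))  # -> 0.54: confidence drops, but not to zero
```

The estimate moves toward the evidence rather than snapping to "right" or "wrong", which is exactly the shift the passage is after.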
Admitting we are not sure is an invitation for help in refining our beliefs, and that will make our beliefs much more accurate over time as we are more likely to gather relevant information.
Bet to Learn: Fielding the Unfolding Future
The way our lives turn out is the result of two things: the influence of skill and the influence of luck.
Outcomes don't tell us what's our fault and what isn't, what we should take credit for and what we shouldn't. Unlike in chess, we can't simply work backward from the quality of the outcome to determine the quality of our beliefs or decisions. This makes learning from outcomes a pretty haphazard process. A negative outcome could be a signal to go in and examine our decision-making. That outcome could also be due to bad luck, unrelated to our decision, in which case treating that outcome as a signal to change future decisions would be a mistake. A good outcome could signal that we made a good decision. It could also mean that we got lucky, in which case we would be making a mistake to use that outcome as a signal to repeat that decision in the future.
When we field our outcomes as the future unfolds, we always run into this problem: the way things turn out could be the result of our decisions, luck, or some combination of the two. Just as we are almost never 100% wrong or right, outcomes are almost never 100% due to luck or skill.
The way we field outcomes is predictably patterned: we take credit for the good stuff and blame the bad stuff on luck so it won't be our fault. The result is that we don't learn from experience well. "Self-serving bias" is the term for this pattern of fielding outcomes.
Self-serving bias has immediate and obvious consequences for our ability to learn from experience. Blaming the bulk of our bad outcomes on luck means we miss opportunities to examine our decisions to see where we can do better. Taking credit for the good stuff means we will often reinforce decisions that shouldn't be reinforced and miss opportunities to see where we could have done better. To be sure, some of the bad stuff that happens is mainly due to luck. And some of the good stuff that happens is mainly due to skill. I just know that's not true all the time. 100% of our bad outcomes aren't because we got unlucky and 100% of our good outcomes aren't because we are so awesome. Yet that is how we process the future as it unfolds.
[...] self-serving bias arises from our drive to create a positive self-narrative. In that narrative, taking credit for something good is the same as saying we made the right decision. And being right feels good. Likewise, thinking that something bad was our fault means we made a wrong decision, and being wrong feels bad.
Unfortunately, learning from watching others is just as fraught with bias. Just as there is a pattern in the way we field our own outcomes, we field the outcomes of our peers predictably. We use the same black-and-white thinking as with our own outcomes, but now we flip the script. Where we blame our own bad outcomes on bad luck, when it comes to our peers, bad outcomes are clearly their fault. While our own good outcomes are due to our awesome decision-making, when it comes to other people, good outcomes are because they got lucky.
A lot of the way we feel about ourselves comes from how we think we compare with others.
If a competitor closes a big sale, we know about our tendency to discount their skill. But if we imagine that we had been the one who closed the sale, we are more likely to find the things to give them credit for, that they did well and that we can learn from. Likewise, when we close the big sale, let's spare a little of the self-congratulations and, instead, examine that great result the way we'd examine it if it happened to someone else. We'll be more likely to find the things we could have done even better and identify those factors that we had no control over. Perspective taking gets us closer to the truth because that truth generally lies in the middle of the way we field outcomes for ourselves and the way we field them for others. By taking someone else's perspective, we are more likely to land in that middle ground.
The Buddy System
Having the help of others provides many decision-making benefits, but one of the most obvious is that other people can spot our errors better than we can.
[...] while a group can function to be better than the sum of the individuals, it doesn't automatically turn out that way. Being in a group can improve our decision quality by exploring alternatives and recognizing where our thinking might be biased, but a group can also exacerbate our tendency to confirm what we already believe.
We don't win bets by being in love with our own ideas. We win bets by relentlessly striving to calibrate our beliefs and predictions about the future to more accurately represent the world. In the long run, the more objective person will win against the more biased person.
To get a more objective view of the world, we need an environment that exposes us to alternate hypotheses and different perspectives. That doesn't apply only to the world around us: to view ourselves in a more realistic way, we need other people to fill in our blind spots.
Diversity is the foundation of productive group decision-making, but we shouldn't underestimate how hard it is to maintain. We all tend to gravitate toward people who are near clones of us. After all, it feels good to hear our ideas echoed back to us.
Although the Internet and the breadth of multimedia news outlets provide us with limitless access to diverse opinions, they also give us an unprecedented opportunity to descend into a bubble, getting our information from sources we know will share our view of the world. We often don't even realize when we are in the echo chamber ourselves, because we're so in love with our own ideas that it all just sounds sensible and right.
Dissent to Win
[...] don't disparage or ignore an idea just because you don't like who or where it came from. When we have a negative opinion about the person delivering the message, we close our minds to what they are saying and miss a lot of learning opportunities because of it. Likewise, when we have a positive opinion of the messenger, we tend to accept the message without much vetting. Both are bad. Whether the situation involves facts, ideas, beliefs, opinions, or predictions, the substance of the information has merit (or lack of merit) separate from where it came from. If you're deciding the truth of whether the earth is round, it doesn't matter if the idea came from your best friend or George Washington or Benito Mussolini. The accuracy of the statement should be evaluated independent of its source.
[A] way to disentangle the message from the messenger is to imagine the message coming from a source we value much more or much less. If we hear an account from someone we like, imagine if someone we didn't like told us the same story, and vice versa.
If the outcome is known, it will bias the assessment of the decision quality to align with the outcome quality.
After the outcome, make it a habit when seeking advice to give the details without revealing the outcome.
Adventures in Mental Time Travel
[...] we must recognize that no strategy can turn us into perfectly rational actors. In addition, we can make the best possible decisions and still not get the result we want. Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them.
When we make in-the-moment decisions (and don't ponder the past or future), we are more likely to be irrational and impulsive. This tendency we all have to favor our present-self at the expense of our future-self is called temporal discounting. We are willing to take an irrationally large discount to get a reward now instead of waiting for a bigger reward later.
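Temporal discounting is commonly modeled with a hyperbolic discount curve. A small sketch (the discount rate k and the dollar amounts are assumptions, not figures from the book) shows how a later, larger reward can feel smaller than an immediate one:

```python
def hyperbolic_value(reward, delay_days, k=0.1):
    """Perceived present value of a delayed reward (k is an assumed rate)."""
    return reward / (1 + k * delay_days)

now = hyperbolic_value(100, 0)     # $100 today feels like its full $100
later = hyperbolic_value(150, 30)  # $150 in a month feels like only $37.50
print(now, later)  # present-self grabs the smaller reward now
```

Even though $150 beats $100, the discounted value of waiting falls far below the immediate payoff, which is the irrationally large discount the passage describes.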
One of our time-travel goals is to create moments like that, where we can interrupt an in-the-moment decision and take some time to consider the decision from the perspective of our past and future. We can then create a habit routine around these decision interrupts to encourage this perspective taking, asking ourselves a set of simple questions at the moment of the decision designed to get future-us and past-us involved. We can do this by imagining how future-us is likely to feel about the decision or by imagining how we might feel about the decision today if past-us had made it.
The way we field outcomes is path dependent. It doesn't so much matter where we end up as how we got there. What has happened in the recent past drives our emotional response much more than how we are doing overall.
[...] we shouldn't plan our future without doing advance work on the range of futures that could result from any given decision and the probabilities of those futures occurring.
We do reconnaissance because we are uncertain. We don't (and likely can't) know exactly how often things will turn out a certain way. It's not about approaching our future predictions from a point of perfection. It's about acknowledging that we're already making a prediction about the future every time we make a decision, so we're better off if we make that explicit. If we're worried about guessing, we're already guessing. We are already guessing that the decision we execute will result in the highest likelihood of a good outcome given the options we have available to us. By at least trying to assign probabilities, we will naturally move away from the default of 0% or 100%, away from being sure it will turn out one way and not another. Anything that moves us off those extremes is going to be a more reasonable assessment than not trying at all.
In addition to increasing decision quality, scouting various futures has numerous additional benefits. First, scenario planning reminds us that the future is inherently uncertain. By making that explicit in our decision-making process, we have a more realistic view of the world. Second, we are better prepared for how we are going to respond to different outcomes that might result from our initial decision. We can anticipate positive or negative developments and plan our strategy, rather than being reactive. [...] Third, anticipating the range of outcomes also keeps us from unproductive regret (or undeserved euphoria) when a particular future happens. Finally, by mapping out the potential futures and probabilities, we are less likely to fall prey to resulting or hindsight bias, in which we gloss over the futures that did not occur and behave as if the one that did occur must have been inevitable, because we have memorialized all the possible futures that could have happened.
When it comes to advance thinking, standing at the end and looking backward is much more effective than looking forward from the beginning.
The most common form of working backward from our goal to map out the future is known as backcasting. In backcasting, we imagine we've already achieved a positive outcome, holding up a newspaper with the headline "We Achieved Our Goal!" Then we think about how we got there.
We start a premortem by imagining we failed to reach our goal [...]. Then we imagine why. All those reasons why we didn't achieve our goal help us anticipate potential obstacles and improve our likelihood of succeeding.
Imagining both positive and negative futures helps us build a more realistic vision of the future, allowing us to plan and prepare for a wider variety of challenges, than backcasting alone. Once we recognize the things that can go wrong, we can protect against the bad outcomes, prepare plans of action, enable nimble responses to a wider range of future developments, and assimilate a negative reaction in advance so we aren't so surprised by it or reactive to it. In doing so, we are more likely to achieve our goals.
[...] we tend to assume that, once something happens, it was bound to happen. If we don't try to hold all the potential futures in mind before one of them happens, it becomes almost impossible to realistically evaluate decisions or probabilities after.