Why I Support Bariatric Surgery (Part 2)



Valuing Losses & Gains

Yesterday’s post was about the widespread misconceptions around the risk of having bariatric surgery compared to the risk of not having surgery.

I pointed out that, for a severely obese person with clinically significant end-organ damage, the risk of dying within one year without surgery is about 10 times the risk of dying from the surgery itself.

Having looked at the risks, today I wanted to discuss the ‘benefit’ side of the equation – after all, no one would consider even the safest surgery if there were no benefit to having it.

But before I go into the discussion of benefits, I thought it might be worthwhile to discuss how we (both experts and non-experts) tend to perceive risk and why we so easily manage to kid ourselves, even when we know the numbers.

The fundamentals of how human psychology tricks us into falling wildly off the mark when it comes to interpreting risk were described in a classic paper by Amos Tversky and Daniel Kahneman, published in Science in 1974 (Kahneman went on to receive the Nobel Prize in Economics in 2002).

As pointed out in this seminal paper, based on a remarkably solid body of empirical psychological research (consistently replicated ever since), we all tend to make ‘gut’ decisions according to the following three principles:

1) Representativeness (or as Dan Gardner calls it, “the rule of Typical Things”)

2) Availability or recall of instances (“Example Rule”)

3) Anchoring

We tend to use all three rules to assess risk or judge probability – even when we know the numbers and statistics.

I have previously described the Anchoring Rule, so I will not discuss it again – suffice it to say that our judgements are often clouded by (random?) numbers we have heard somewhere, irrespective of whether they are even remotely true.

If someone (anyone) were to simply say that 89.7% of patients struggle after surgery (a number I just made up), people would think that surgery is a risky business, even if I then told them that I had just made the number up and that it is probably a wild exaggeration (never mind that the actual figure is probably well below 20%). They would simply be obeying the Anchoring Rule.

The Rule of Typical Things is even trickier and leads us to believe more in stories that sound reasonable and ‘typical’ than in stories that sound ‘non-typical’. Interestingly, what we perceive as ‘typical’ may not be ‘typical’ at all – we just think it is because it makes a good story and sounds highly plausible.

Here is how the ‘Rule of Typical Things’ works:

Stephanie is a 45-year-old Canadian woman with severe Vitamin D deficiency.

Which of the following is most likely to be true:

a) Stephanie lives in Alberta

b) Stephanie has had bariatric surgery

c) Stephanie lives in Alberta and has had bariatric surgery

Pick one before you read on!

Let’s do the math:

Based on population distribution (Alberta has roughly a tenth of Canada’s population), and assuming that Vit D deficiency is distributed evenly across Canada, the chance that Stephanie lives in Alberta is about 1 in 10.

Assuming that well under 5,000 45-year-old Canadian women will have had bariatric surgery, and that perhaps 20% of bariatric surgery patients have severe Vit D deficiency, the chance that Stephanie has had bariatric surgery is about 1 in 250.

So the chance that Stephanie both lives in Alberta AND has had bariatric surgery turns out to be only about 1 in 2,500.
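For readers who like to see the arithmetic spelled out, here it is (treating the two pieces of information as roughly independent, as assumed above):

P(\text{Alberta and surgery}) = P(\text{Alberta}) \times P(\text{surgery}) \approx \tfrac{1}{10} \times \tfrac{1}{250} = \tfrac{1}{2500}

More generally, the probability of two events occurring together can never exceed the probability of either one on its own, i.e. P(A \text{ and } B) \le \min\{P(A), P(B)\}, which is why option (c) can never be the most likely answer.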

Yet, if you picked option (c), you would not be alone – this is what most people (around 80%) pick, simply because it tells a better story: the rule of ‘Typical Things’ at work.

In short – if we perceive a certain event (severe Vit D deficiency) as being ‘typically’ related to another event (bariatric surgery), we are far more likely to think that the likelihood of the two events occurring together is greater than the likelihood of either event occurring alone – despite the fact that probability theory tells us this is impossible.

But, no doubt, when it comes to our ‘intuitive’ assessment of the risk of bariatric surgery, the ‘Example’ Rule (availability) is perhaps the most pervasive and powerful.

According to this rule, our ‘gut’ tells us that something we have experienced, heard of, or can otherwise readily recall is much more likely than something we do not remember (or know of) – in other words, the easier it is to recall something, the more common we assume it to be.

It is, of course, human nature to pay more attention to, and better remember, outliers rather than typical instances. We do not remember every car that ever drove past us at a pedestrian crossing but will never forget the car that almost hit us – even if this happens only once, we will from then on always be extra careful when crossing the road and will likely warn all our friends and family to be extra careful because of this experience.

Translate this to the issue of bariatric surgery: we will always recall and remember the one patient who sadly died from surgery or suffered some other horrible complication (the rarer the event, the more vividly we remember it) but remain quite unaware of (or forget) the many instances where things went well – in fact, the better things go in general, the more ‘spectacular’ and ‘memorable’ we find the rare cases that go wrong.

The reverse is also true: many who enthusiastically decide to have surgery remember the ‘spectacular’ case of the person they recall meeting or reading about, who lost 50% of their initial weight and changed their life forever, but forget (or are unaware of) the not-so-spectacular ‘typical’ cases of those who lost only the usual 25% of their initial weight and just quietly went on with their lives.

Incidentally, Kahneman & Tversky also demonstrated the psychological phenomenon behind Prospect Theory, or “loss aversion”, which holds that people feel losses far more keenly than equivalent gains (note the asymmetry of the curves in the Figure). So if you lose a $100 bill, you will lose more satisfaction than you would have gained from finding a $100 bill. Similarly, a patient who develops a new complaint from bariatric surgery (e.g. dumping syndrome) will find this far more distressing (even if it can be easily controlled by avoiding certain foods) than they find relief in seeing their original problems (having to inject insulin and to sleep with a CPAP machine) getting better.
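For those who like to see the asymmetry in the Figure expressed as a formula: Tversky & Kahneman’s later work describes it with a value function of roughly the following form, where the parameter values are their commonly cited estimates and not anything specific to bariatric surgery:

v(x) = \begin{cases} x^{\alpha} & \text{for gains } (x \ge 0) \\ -\lambda\,(-x)^{\alpha} & \text{for losses } (x < 0) \end{cases} \qquad \alpha \approx 0.88,\ \lambda \approx 2.25

With these numbers, losing a $100 bill ‘feels’ roughly 2.25 times as bad as finding one feels good, which is exactly the asymmetry described above.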

Not unexpectedly, all of these psychological phenomena readily explain some of the comments that readers leave on this site: I am, of course, far more likely to hear from readers who had a remarkable (but atypical) negative or positive experience than from readers for whom everything simply took its unremarkable and unspectacular usual course – which is exactly why they do not leave comments (or even read this blog): everything went just the way it should, so there is nothing special to write home about!

Sadly, perhaps, these heuristics (experience-based techniques for problem solving), which we all apply in so many judgements every day, are remarkably resistant to both data and common sense.

All of this brings us back to the discussion of risk and benefit, which underlies any clinical decision making.

In my own work as a physician, these heuristics are at play every day. I am fully aware that my own judgements are subject to the rules of ‘Typical Things’, ‘Examples’, and ‘Anchoring’, and it often takes great effort to remind myself that ‘outliers’ (both good and bad), which will always happen, are thankfully (or sadly) just that – outliers. I will also always be more distressed by the ‘problems’ that a treatment may have caused my patients than I will celebrate their improvements from that treatment.

In real life, taking chances means placing your bets – and this is best done when you fully understand your odds of winning or losing. Great wins may justify large bets, even when the odds of losing are substantial. On the other hand, even a small chance of losing your shirt (no matter how big the jackpot) would make many decide to walk away from the table.

Hopefully my readers will forgive me for this brief excursion into the psychology of risk, before delving into the benefits of bariatric surgery tomorrow.

For anyone further interested in this topic, I highly recommend Dan Gardner’s immensely readable bestseller ‘Risk: Why We Fear The Things We Shouldn’t – and Put Ourselves In Greater Danger’.

AMS
London, UK

Tversky A & Kahneman D (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131. PMID: 17835457