Tell Me About It
One thing that has informed my worldview in recent years is the discovery that there exists a whole host of cognitive biases that cause us to misperceive everything we see, hear and think. In a previous post I linked to an article that listed no fewer than 58 of these - and I have a poster on my desk which lists 20. (Note that both originate from the same website.)
...
I can't remember whether I have blogged about it or not, but one of the things I am rarely willing to accept is anything that enumerates these sorts of arbitrary delineations. What is clear to me is that there are many biases - and I don't know how many! In this post I just want to explore one a little - or perhaps it is more than one - perhaps a number of different biases combine to produce the "illusion of explanatory depth".
Google it and you find plenty of references. One of the authors answering the 2017 Edge Annual Question chose it as "a concept that ought to be more widely known".
So, I will skip the detailed explanation (which may actually be me falling into the trap - how apt) and simply say that it is the mistake we all make when we assume that we know much more about something than we actually do.
The reason I have jumped on this particular bandwagon is an article that I read today which brought it back to the forefront of my mind.
The illusion is exemplified every day, in all sorts of contexts. Everyone is an instant expert on a whole range of subjects. Unfortunately, they are being fed "information" from sources that are, individually, extremely biased, and they do not have the time, energy or will to properly research any subject, because it is too easy to "single source". The single sources that we each use are, by and large, working with biases similar to our own - giving a "spin" to any information that is released.
So we have a biased and shallow (because our sources rarely provide much detail) understanding of any issue you care to mention. This seems particularly problematic when we deal with "how things work" in the real world. Rarely is it simple and straightforward. Much of the time we can "get by" with this sort of superficial knowledge because we are merely using it to navigate the world - and it is more than enough to do that.
Going back to my very early posts, our mental models are built in precisely this way. What is never understood - at least not explicitly - is that every model is, necessarily, a simplification - something that is only detailed enough for the precise purpose the model is built for.
Here in the UK the biggest demonstration of this illusion was during last year's referendum. Brexit is happening - Remain lost - but for me the biggest story is that no one seemed to look past the headlines - no one could explain what would really happen if and when they won the vote. There were plenty of vote winning soundbites - most of which carried very little meaning - some of which were entirely made up with no basis in reality.
Part of the reason is that it is all just too difficult. As far as we can discern, the outcomes are not only unknown but unknowable - at least for practical purposes. There is as much chance of correctly forecasting the "outcome" of Brexit as there is of forecasting the weather on April 17th 2021. (Actually, given the natural seasonality of weather, you might have more chance of getting the weather right.) However, for a large number of people, the In/Out decision was not an easy one; it was an unwavering one - nothing would persuade them to even consider the alternative.
The referendum is now in the past, even though the process that it started is only just getting going. However, it doesn't seem to me as though there have been many lessons learned in terms of the dangers of falling into the trap of the illusion of explanatory depth. If anything it is encouraged and amplified by the way in which we approach many of the big decisions of life.
The simplified mental models that we use to navigate our way through our days have proven to be fit for purpose. We do not need to know how an internal combustion engine works in order to drive a car from A to B. We "know" enough to do that successfully without greater detail in our mental models. Every now and again we will come up against something that disturbs the model in some way - and hence we learn.
Where those models are often not "fit for purpose" is when they are used to make forecasts beyond the immediate. Going back to the weather: we can tell what the immediate weather is going to be like by looking out of the window - but that model doesn't work when trying to find out what it will be like tomorrow. For most of us, the model then becomes watching the BBC or visiting the Met Office website or whatever. That model in turn fails when we want to know what the weather will be next month or next year. The model must be fitted to the ultimate aim.
Everyone would agree that uncontrolled immigration is not going to be a good thing in the long term. The fun starts when you consider what the controls would be - like an onion, the problem is very much multi-layered. There is no way that I intend to address the entirety of that issue in a short blog post. Just some of the issues are: if you stop immigration, what happens to those already in the country? Do you use some sort of points-based system - like Australia's, for instance? At what stage does an immigrant become a citizen? Should people stay in their town of birth - their county - their country - some other arbitrary area?
All of these points are worthy of consideration (and there are plenty more), and they are precisely the sort of thing that gets uncovered when you start to address the hole in understanding that the illusion of explanatory depth creates.
The solution is deceptively simple: you ask "tell me about it" - then you ask it again - and again. Quite quickly the lack of depth becomes apparent and - hopefully - a realisation dawns that what was thought to be "known" is in fact a rather superficial understanding. For me, the aim is not to understand everything but rather to reach the point where we recognise our ignorance (in keeping with the blog theme).
The process goes something like this:
Statement: I think that there should be more money spent on the NHS.
(Clearly this is a popular position that most people would instinctively support.)
Response: Where is the money to come from? What aspects of the NHS should get more money? (And probably a few more questions - I'll only explore a subset.)
Statement: We spend too much money on .... It could come from there.
Statement: It seems like A&E services are struggling.
Response: OK - if we take money from .... then they won't be able to fund this really important social project.
Response: Should we spend the money on providing more A&E centres, or on providing more services away from A&E to free up the A&E centres for urgent cases?
....
This dialogue can continue for a long time and will (almost certainly) keep branching out into more and more related areas. It shouldn't be long before each line of questioning hits the "I don't know" point or the "I didn't think of that" point - both of which are indicators that we have, indeed, reached the boundary of our own ignorance.
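As a purely illustrative aside, the probing process above can be sketched as a toy program. Here a nested dictionary stands in for someone's mental model (the claims and follow-up questions are invented for the example, not taken from any real survey), and an empty entry marks the point where they would have to say "I don't know":

```python
# Toy sketch of the "tell me about it" probe. The nested dict is a stand-in
# for a mental model; an empty list marks the boundary of ignorance.

def probe_depth(claim, model, depth=0):
    """Recursively ask 'tell me about it' and return how many levels of
    explanation are available before hitting 'I don't know'."""
    detail = model.get(claim)
    if not detail:  # no further detail available - boundary reached
        print("  " * depth + f"{claim!r} -> I don't know")
        return depth
    print("  " * depth + f"{claim!r} -> tell me more...")
    return max(probe_depth(sub, model, depth + 1) for sub in detail)

# A hypothetical (invented) mental model about NHS spending.
mental_model = {
    "spend more on the NHS": ["where does the money come from?",
                              "which services get it?"],
    "where does the money come from?": ["cut programme X"],
    "which services get it?": [],   # already at the boundary
    "cut programme X": [],          # ...and here too
}

depth = probe_depth("spend more on the NHS", mental_model)
print(f"explanatory depth before 'I don't know': {depth}")
```

The point of the sketch is only that the recursion bottoms out surprisingly quickly - most real mental models are only a level or two deep before the "I don't know" answers start.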
This is, of course, not unlike the process you would go through in understanding any system - system dynamics as a technique does this sort of thing. Any problem solver will tell you that the first thing you need to get right is an understanding of where your starting point is. Both are examples of approaches that implicitly recognise that we are not good at really understanding anything. To some extent our knowledge and understanding will always be limited - almost always more limited than we realise. These techniques begin to force us down the path to identifying where we reach that boundary.
The problem is, not everyone understands the need for this - it has been suggested that not everyone is wired up to see these sorts of systemic connections. So, for many, the limited, superficial understanding is never going to be questioned.
That is where the less scrupulous media, advertisers, marketeers and politicians see an advantage. As long as they put forward a "plausible" story (for we all love a narrative that fits our preconceived ideas - that's yet another cognitive bias), huge numbers of people will accept it - whether it is the truth, a partial truth, or nothing like the truth. The upsurge in awareness of "fake news" has more to do with the recognition that it exists, and with its availability, than with any real increase in its existence.
Throughout history, propaganda and 'tailored' news has been fed to the populace - usually when they had no way of knowing the veracity of what they were being told. Now there are always alternative sources - but (and here we come full circle with this post) for many, the tried and trusted sources will be believed regardless of what they say.
We must all be more aware of the ease with which we can be deceived. We must all be more aware of the limits of our knowledge. Whenever we think that we "know" something (and it becomes important for whatever reason), we should ask ourselves to explain it a bit further and see just how far we can go. It's not an easy thing to do - especially with deeply ingrained attitudes and opinions - but it is essential if we are to cope in an era where "fake news" can so easily cause us to make wrong decisions.
Categories: Philosophical, Systems Thinking, Complexity, Cognition, Worldview
