In early 2019, the WSJ asked a group of senior economists to predict interest rates over the next twelve months. The question was innocuous enough; economists predict interest rates all the time. With the US economy and job market in full boom, economic theory suggested that the Federal Reserve would likely raise rates further to curb inflation, which implied rates would rise. Armed with that insight, every economist went with some variant of the rising-rates prediction. And boy, did they turn out to be wrong. Here is how their predictions look today:
Pretty bad, right? Working backward, almost all of these economists would defend their positions: the looming threat of a trade war and other recessionary risks meant that the Fed (egged on by Trump) actually had to cut rates to keep the economy's momentum going. Hindsight is 20/20 and all that crap. But this post isn't about slating economists or experts in general. The point of the example is that experts are often wrong in non-scientific lines of inquiry, yet all of us routinely treat areas outside our own domains of knowledge as sacrosanct. To take a personal example: I'll happily debate the WACC used in my next DCF model. I (kinda) understand that. But what about the 1.5% inflation rate I have assumed for the next 10 years? Man, that's from the Bank of England - there's no point discussing that. And yet it is quite likely that movements in inflation will have just as material an impact on the valuation as the cost of capital.
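To see why the inflation assumption deserves the same scrutiny as the WACC, here is a toy 10-year DCF in Python. All the inputs (a cash flow of 100, 1.5% growth, 8% WACC) are made up for illustration; the comparison simply shifts each assumption by the same half a point and looks at how far the valuation moves:

```python
# A toy 10-year DCF to show that the inflation/growth assumption can move
# the valuation about as much as the discount rate. All numbers are made up.

def dcf(cf0: float, growth: float, wacc: float, years: int = 10) -> float:
    """Present value of cash flows growing at `growth`, discounted at `wacc`."""
    return sum(cf0 * (1 + growth) ** t / (1 + wacc) ** t
               for t in range(1, years + 1))

base = dcf(100, growth=0.015, wacc=0.08)     # both assumptions at base case
rate_up = dcf(100, growth=0.015, wacc=0.085)  # WACC up half a point
infl_up = dcf(100, growth=0.020, wacc=0.08)   # inflation/growth up half a point

# The two half-point shifts move the valuation by a similar amount,
# just in opposite directions.
print(round(base, 1), round(rate_up, 1), round(infl_up, 1))
```

The half-point WACC shift and the half-point inflation shift each move the value by roughly the same magnitude, which is the point: the "given" input deserves as much debate as the one we argue about.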
It turns out there is a theory to explain my (and everyone else's) behavior. Called the Gell-Mann Amnesia effect, it postulates that we routinely spot flaws in a publication's logic in our own areas of specialization while treating that same publication or individual as a guru on every other field. Here is the Wikipedia entry explaining the idea with an example:
“Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward — reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.”
We all exhibit the behavior described above in our daily lives. It is not just individuals, though; corporations are just as predisposed to fall into this trap - large corporations more so than small ones, though that is more by accident than by design. In large companies, there is always an 'expert' on hand for any topic, so individuals can outsource their thinking.
In a startup, things are lean. You will often have to Google things and figure them out for yourself, and this Googling behavior often produces some of the best insights. It is a famous cliche that at startups, early employees wear multiple hats. Hidden behind the cliche is a truth: at such places, green employees are often tasked with things they should have no business doing. And even though this tasking of 'amateurs' sometimes leads to disaster, more often than not it produces interesting and novel results. The example of Admiral Group illustrates this.
Consider Admiral, the UK's largest motor insurer, and how it got there in under 25 years. A few of the key insights it was built on remain anathema to industry experts. Its initial schtick was selling insurance over the phone - and yet any insurance industry expert in the early 1990s would have advised that insurance can only be sold in person via brokers or agents. Another realization was that you don't actually need a big team of actuaries if you reserve extremely conservatively. Actuarial science is, after all, the science of figuring out the optimal level of reserves - but if you reserve overly cautiously, you can run a very lean actuarial team and release the excess reserves later. Lastly, the industry was addicted to GLM models for pricing risk. The founders figured out that simple tables in Excel might work just as well if used appropriately. Moreover, by flipping the GLM into a spreadsheet exercise, they gave immense power to the pricing team. You could no longer use the GLM as a defense for any pricing change you made; everything flowed through the minds of the pricing analysts. And this allowed them to make many more changes, much more quickly.
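To make the spreadsheet point concrete, here is a minimal Python sketch of table-based pricing. The rating dimensions and factor values below are entirely hypothetical, not Admiral's actual tables; the underlying observation is that a GLM with a log link is multiplicative anyway, so its fitted coefficients can be exported as human-readable lookup tables that a pricing analyst can inspect and tweak directly:

```python
# Hypothetical table-based motor pricing. A log-link GLM prices risk as
# base * exp(b1*x1) * exp(b2*x2) * ..., i.e. a product of per-dimension
# factors - so each fitted coefficient can live as a row in a plain table.

BASE_PREMIUM = 400.0  # hypothetical base rate

# Hypothetical rating tables; in a spreadsheet these are just columns.
AGE_FACTOR = {"17-21": 2.10, "22-29": 1.40, "30-49": 1.00, "50+": 0.90}
VEHICLE_FACTOR = {"low": 0.85, "mid": 1.00, "high": 1.60}
REGION_FACTOR = {"rural": 0.95, "urban": 1.20}

def price(age_band: str, vehicle_group: str, region: str) -> float:
    """Multiply the base rate by one factor per rating dimension."""
    premium = BASE_PREMIUM
    premium *= AGE_FACTOR[age_band]
    premium *= VEHICLE_FACTOR[vehicle_group]
    premium *= REGION_FACTOR[region]
    return round(premium, 2)

print(price("22-29", "mid", "urban"))  # 400 * 1.40 * 1.00 * 1.20 = 672.0
```

Because every number sits in a plain table rather than inside a fitted model, an analyst can change one factor, see the effect immediately, and own the decision - which is exactly the power shift described above.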
The last century has produced an era of hyper-specialization in most fields. And while this has positives, this over-reliance on field experts is one of the big negatives. While there is humility in admitting something one doesn’t know or understand, it is just as important to try and figure things out for yourself. And if an expert’s prediction doesn’t make sense to you, ask them directly or call them out on it (nicely). And while some people will take offense to it, many who are interested in debate will likely appreciate the fact that someone is showing interest in their field of work.