I never understood why consistent opinions are considered a good thing; it doesn't make any sense. If I realize I'm wrong, why should I stick to the wrong opinion?
The hypothetical cleverest, most correct person possible would be right 100% of the time and therefore would never change their opinion because changing it would go from right to wrong.
Therefore a simplistic view is that the closer to that person you are, the better. Some people may even think they are that person.
Realising that you are not going to be correct 100% of the time is the first step. The second step is working out when you are wrong, so you know when to change your opinion. Nobody manages both of these steps all the time; some people rarely manage either.
> The hypothetical cleverest, most correct person possible would be right 100% of the time and therefore would never change their opinion because changing it would go from right to wrong.
A subtly different view: The hypothetical cleverest, most correct person possible would be right close to 100% of the time because they would change any wrong opinion almost immediately.
And with this view, the closer you are to that person, the better you really are.
That's a much more realistic cleverest person, but in terms of hypotheticals it is clearly better to be right 100% of the time than to almost immediately change your opinion the 1% of the time you start off wrong.
> The hypothetical cleverest, most correct person possible would be right 100% of the time and therefore would never change their opinion because changing it would go from right to wrong.
The problem with this is that most opinions are applied to circumstances in highly contextual ways. What's the best database? It's one thing to talk about it in the abstract and another entirely to talk about it for a specific project.
> Therefore a simplistic view is that the closer to that person you are, the better. Some people may even think they are that person.
When human knowledge, technology, and political circumstances mostly changed slowly, this was a good heuristic. Now, it's pretty rotten. It's better to take things up one level of indirection and instead of observing their opinions, note how they absorb new knowledge and circumstances instead. Observe if they are a good listener. Observe if they can alter their mental model in the light of new facts.
> Some people may even think they are that person.
Stay away from that person! Things change too fast nowadays. Look for the "beginner's mind."
> The problem with this is that most opinions are applied to circumstances in highly contextual ways. What's the best database? It's one thing to talk about it in the abstract and another entirely to talk about it for a specific project.
That doesn't prevent someone from being right 100% of the time. It may prevent them from saying "x is the best database", but not from saying "x is the best for y use, z is the best for... etc"
The problem is with the fact that no-one is capable of always being right, not that the hypothetical person couldn't always be right.
> That doesn't prevent someone from being right 100% of the time. It may prevent them from saying "x is the best database", but not from saying "x is the best for y use, z is the best for... etc"
PG has noted that Robert Morris was one of the smartest people he knew, because he always knew when he shouldn't give an opinion.
> The problem is with the fact that no-one is capable of always being right, not that the hypothetical person couldn't always be right.
No disagreement there. A corollary: a big problem lies with people who don't realize the above is the case and therefore never take steps to mitigate it.
People's opinions usually have a probabilistic nature, i.e. "From what I know, I think X is the most probable conclusion." The problem is that they know too little, because people don't have time to study everything in detail.
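That probabilistic framing can be made concrete with Bayes' rule: an opinion is a prior, and changing your mind is just updating on new evidence. A minimal sketch, with entirely made-up numbers for the prior and likelihoods:

```python
# Bayesian opinion updating: a sketch with hypothetical, made-up numbers.

# Prior belief that claim X is true, before seeing evidence E.
prior = 0.6

# Assumed likelihoods: how probable is observing E if X is true vs. false?
p_e_given_x = 0.9
p_e_given_not_x = 0.3

# Total probability of observing E (law of total probability).
p_e = p_e_given_x * prior + p_e_given_not_x * (1 - prior)

# Bayes' rule: P(X | E) = P(E | X) * P(X) / P(E)
posterior = p_e_given_x * prior / p_e

print(round(posterior, 3))  # belief rises from 0.60 to about 0.818
```

On this view, "changing your opinion" is not a flip-flop but the only coherent response to evidence: the posterior simply differs from the prior whenever the evidence favours one hypothesis over the other.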
I think it goes back to politics. It's considered a bad thing in that arena because people can't trust that you will keep your word on the platform they hired you on. This is especially true for divisive issues like abortion, healthcare, etc.
People like security and therefore stability; it's one of our basic needs. We like a stable environment, and that includes other people's opinions. Imagine if everyone changed their opinions all the time: it could be fun, but not everyone would like that :)
Also, people who often change opinions may look dishonest, opportunistic and unpredictable.
I'm not sure I buy the theory about it originating in political discourse.
It's a very basic concept. Changing your mind almost necessarily involves admitting you were wrong in the first instance — if you do not admit it outright, someone else will likely confront you on it.
People don't like admitting they were wrong. It makes one look bad, and provides one's enemies with ammunition.
Politics often has a lot to do with it. I don't mean Politics politics, but rather, office politics and social politics. Social signaling plays a huge role in how we act, and in what types of traits we attribute to the actions of others.
In a typical Big Corporate workplace, to be seen as quickly decisive is to be seen as a strong leader. To change one's mind is to project weakness and indecisiveness. Perhaps in complete contradiction to Bezos's opinion, the prevailing opinion among most corporate managers -- even C-level execs -- is that those who can quickly arrive at a conclusion are the smartest, and are most likely to be correct. In reality, we all know where this usually leads. But there's still a prevailing opinion that jumping to conclusions, and doing anything to support them, is the ideal way to act.
Compounding this problem is a well-known and well-documented cognitive bias, present to some extent in all human beings: confirmation bias. Essentially, the way our brain works is to draw up a quick hypothesis about a situation and then look for any information that supports it. We actually have to train our minds to look for contradictory or challenging evidence, because doing so is not hard-wired into us.
The confirmation bias happens at the subconscious level, and there's actually good evidence that it served as an adaptive, successful trait in human evolution. Back when we lived in the wild, we didn't have a lot of time to analyze threats, weather patterns, or other major situations. So being able to mentally "shorthand" a situation was a benefit. It still is in many instances, but it becomes a hurdle when we need to make longer-term, more complex decisions. In those cases, we need to bring conscious, analytical thought to the table.
Consistent opinions are a good thing as long as unbiased, informed data continues to support them. One must know, though, that such a position is not infallible; it's merely good enough, given the information available.
Getting to the new insight is only one reason why people change their opinions.
Often people change their argument instead because, at a later time, a different position benefits them more. Or because they don't think through what they are saying. Or because they simply like to win. In my experience the latter cases happen a lot more often than people genuinely reaching a better understanding.
When smart people change their opinions, the new opinion is often not the opposite of the previous one, but an upgrade of their previous position.
It's because if you change your mind frequently enough, people stop listening to you or giving your opinion any weight. I'm not saying this is right, or that it occurs as often as is feared, but it is the fear, and it does occur.
This is an ironic question: on the one hand, you say consistency doesn't make sense; on the other hand, the root meaning of "doesn't make sense" is precisely to be inconsistent.