
Thursday, April 23, 2015

Are My Opinions Best?

I imagine most of us have at some point in our lives been told "you just think your opinions are best!" It's meant to imply that we're closed-minded or arrogant. But on its face, it's a very weird objection, isn't it? Of course I think my opinions are best! If I thought someone else's opinions were best, I'd adopt those opinions. Consider the following statement: "I believe X, but Jan believes Y. I think Jan has the better opinion on the matter, actually, but I still believe X." That seems awfully strange.

Thinking that, though, made me wonder if there are scenarios where that isn't true. The example I came up with was my opinions versus Stephen Hawking's opinions on physics. I have certain beliefs about how the physical world works, as does Dr. Hawking, and I'm pretty confident his are better than mine. Does that mean I've just tacitly adopted his opinions on the matter? I don't think so, for a few reasons: First, I can't even coherently explain what his opinions are -- they're way beyond my understanding of physics. I don't think I can claim to hold an opinion whose substance I don't actually know or understand. Second, and related to the first, I still basically act in accordance with my own primitive understanding of how the physical world will behave. So in that sense, I'm still adhering more to my opinions than anyone else's. And finally, I'm using Dr. Hawking as a prominent stand-in for "really smart physicist," but presumably many smart physicists disagree with one another, and I don't necessarily share Dr. Hawking's view where it conflicts with that of other prominent physicists. Even where two well-credentialed physicists take mutually contradictory positions, I still think that both of them have "better" opinions on physics than mine -- but it wouldn't make sense to say I'm adopting mutually contradictory positions.

I don't know if this goes anywhere interesting (it feels like the sort of question epistemologists have resolved six ways to Sunday), but it was on the brain for the past few days so I thought I'd share. I would note that it seems to have greater relevance to the salience of "moral facts" (if such things exist) than it does to physical facts. We're okay with people accepting that the best physical explanations may simply be beyond the reasoning capacities of the average person, because we don't expect everyday people to do complex physics. But we do expect people to do moral reasoning accurately. So if we're in a world where there are moral facts, but average people are incapable of understanding what those facts are (knowing only that their primitive efforts at moral reasoning are obviously nothing close to reliable, and that while there exists a small class of persons who can do this sort of reasoning more effectively, they aren't in enough agreement to allow us to simply substitute their views for our own), that seems to leave us in a dangerous place.

1 comment:

    1. “Among all the opinions you have acquired in your lifetime, do you think you currently have any opinions that will prove to be mistaken?” Most people say yes – which would seem to demonstrate that most people acknowledge holding some views that are mistaken, and hence suboptimal, even if they cannot identify which ones they are.

    2. What makes an opinion “good”: accuracy or utility? There are many illustrations of the idea that true beliefs are not necessarily adaptive, and adaptive beliefs are not necessarily true.

    - Most/all people experience the same optical illusions. Magicians exploit known defects in human perception/cognition to fool us.

    - People compile lists of cognitive biases. The attitude that you can draw broad conclusions from easily gathered data (e.g., racism, sexism, ageism) is widespread, even if the conclusions are often faulty. Harvard psychologist Dan Gilbert observes that people make predictable errors in forecasting how they will feel in the future and which circumstances will influence their happiness.

    - In Night, Elie Wiesel suggests that people who find a purpose in life – no matter how delusional – prove to be more resilient than people who don’t.

    If you subscribe to the theory of natural selection, it would be hard to devise an explanation for these systematic errors other than that certain patterns of perception and thinking are adaptive even if often inaccurate.

    Perhaps the guy who is constantly on the lookout for things that trigger his hope or fear will get more false positives than the guy who doesn’t share this bias – but he is also likely to have more true positives. And the benefit of the true positives may offset the burdens of the false ones. For example, the guy who tends to see a predator hiding in the bush, even if often wrong, may live longer than the guy who is more accurate, i.e., the guy who has an equal chance of seeing a predator when it isn’t there as he has of not seeing one when there is one there.
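    To make that trade-off concrete, here is a minimal sketch in Python with made-up numbers (the predator frequency, detection rates, and costs are purely illustrative assumptions, not figures from the comment): when a missed predator is vastly more costly than a false alarm, the jumpier, less accurate detector can still come out ahead on expected cost.

```python
# Toy expected-cost comparison of a "jumpy" detector vs. a more accurate one.
# All numbers are hypothetical, chosen only to illustrate the offsetting argument.

def expected_cost(p_predator, hit_rate, false_alarm_rate, cost_miss, cost_false_alarm):
    """Expected cost per encounter for a detector with the given bias."""
    miss_cost = p_predator * (1 - hit_rate) * cost_miss            # predator present, not seen
    alarm_cost = (1 - p_predator) * false_alarm_rate * cost_false_alarm  # no predator, fled anyway
    return miss_cost + alarm_cost

P_PREDATOR = 0.01      # predators are rare
COST_MISS = 1000.0     # a miss is catastrophic
COST_FALSE_ALARM = 1.0 # a false alarm just wastes some energy

jumpy = expected_cost(P_PREDATOR, hit_rate=0.95, false_alarm_rate=0.30,
                      cost_miss=COST_MISS, cost_false_alarm=COST_FALSE_ALARM)
accurate = expected_cost(P_PREDATOR, hit_rate=0.60, false_alarm_rate=0.05,
                         cost_miss=COST_MISS, cost_false_alarm=COST_FALSE_ALARM)

print(f"jumpy: {jumpy:.2f}, accurate: {accurate:.2f}")
# With these numbers the jumpy detector's expected cost (~0.80) beats the
# more accurate one's (~4.05), despite its many more false positives.
```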
