I was recently involved in a discussion of post-modernism and relativism that started when a commenter on my blog tried to argue that astrology can be true, and continued when I posted on Twitter about it. I wrote:
The human condition is relative from human to human and culture to culture. But there are scientific truths outside and independent of us.
I thought my meaning was clear. What might be moral in one culture may not be in another, and in many cases that’s OK. Cultures are different. But in the objective reality of the Universe, such relativism may fall apart. Physical laws have an objective reality; we may interpret them, but they continue to do what they do whether our interpretation is correct or not.
This led to a discussion of the meaning of things, and that I think is the important issue.
A follower on Twitter said:
Gravity may well exist. But if we can’t describe it, it’s hardly objective. And we can’t possible know it’s [sic] meaning.
I agree totally with Phil. Gravity has no ‘meaning’ and exists whether humans are here or not. Yet the discussion seems to be a nice example of something discussed in the new book ‘Don’t Be Such A Scientist’ by Randy Olson.
Most people see the world quite differently than scientists. They have emotional responses to things that we see as simply factual.
Many people deal with a complex world by using a few simple rules of thumb (or as scientists would say, heuristics). They do not have to think deeply about a new pattern. They see where it fits in their rules. And if it does not fit, they generally just ignore it.
Scientists have been trained to use a much more complex set of heuristics, because we have seen so many examples of how simple rules of thumb lead researchers astray. One such rule is to stick to facts, not to metaphysical meanings, which often have little real impact on objective facts.
So I am not surprised that the follower on Twitter discussed meaning. People look for meaning in all sorts of things; they try to find patterns even when there are not any.
While this pattern-seeking is now a rather indiscriminate approach (it can lead to decisions that are actually harmful to survival), in simpler times on the savannah, finding patterns would be very useful for identifying the lion in the brush.
Researchers have trained themselves to think against some of these basic instincts. Humans look for meanings and patterns in everything; scientists have trained themselves to find the facts, not the meanings or misleading patterns.
Thus, a girl dying after being injected with a vaccine is, to many people, a sign that the vaccine caused the death. Scientists know that such post hoc reasoning is often fallacious, and wait for the evidence. When it is found that she had a severe underlying illness, only scientists are really reassured.
Most people have already made the new rule that the vaccine is bad. Simply telling them that this is not so will have little effect. The only way I have found to get them to really ‘think’ and perhaps alter their rule is to use stories that come at the fallacy in ways that they can easily understand.
I have used this story a couple of times:
A woman puts on a red sweater, gets on a plane, and then dies in a plane crash. Since she crashed so soon after putting on the red sweater, authorities are suspending sales of red sweaters until they can determine whether the sweaters were responsible for the accident. While it is an unlikely cause, they want to make sure that there is no connection.
Now, most people would say this is just ridiculous, because there is no way that red sweaters could cause a crash. When the investigation shows the plane’s tail broke off, everyone gets very comfortable, because that explanation fits their rule of thumb quite nicely.
Why weren’t they fooled by the initial post hoc argument? Because many have enough direct knowledge of the entire system to know that the original conclusion is wrong.
Not so with most science, such as vaccine development. The process is a mystery, and one that they have little control over.
Telling them the facts will not change this very basic rule of thumb. To get them to re-evaluate their conclusion, they need to be provided with a similar ‘story’ but one that illustrates the new rule by contradicting their old rule.
So after giving a nice story, I can then describe how a vaccine undergoes strong safety checks, how hundreds if not thousands of people have received it, and how its side effects are no different from those of a placebo.
As with an airplane, we may be a little reluctant to give up control, but the results are so beneficial that this discomfort is worth it.
After all of this, a sudden death actually caused by the vaccine would be as odd as a red sweater causing a plane crash.
Not everyone gets it (some heuristics are too strong, particularly ones dealing with various forms of woo), but I have actually gotten a couple of people to be a little more skeptical. At least when they see a report, like the one (I’ll wait to actually see the results published; I also wonder if they looked at people who got no vaccine but got the flu last year, as a control), they come and check with me.
I can see why Aesop’s Fables are so useful. They provide easy ways to gather rules of thumb that can be important in a complex world. Maybe we need some new fables for a complex world.