Tuesday, September 10, 2013

Bad study of ideological bias

Mother Jones cites a new study:
But in another sense, it really doesn't matter at all. These days, even relatively simple public policy issues can only be properly analyzed using statistical techniques that are beyond the understanding of virtually all of us. So the fact that ideology destroys our personal ability to do math hardly matters. In practice, nearly all of us have to rely on the word of experts when it comes to this kind of stuff, and there's never any shortage of experts to crunch the numbers and produce whatever results our respective tribes demand.

We believe what we want to believe, and neither facts nor evidence ever changes that much. Welcome to planet Earth.
The study claims:
The public’s limited knowledge is aggravated by psychological dynamics. Popular risk perceptions, it is thought, tend to originate in a rapid, heuristic-driven form of information processing — what decision scientists refer to as “System 1” reasoning (Stanovich & West 2000; Kahneman 2003). Overreliance on System 1 heuristics are the root of myriad cognitive biases. By fixing attention on emotionally gripping instances of harm, or by inducing selective attention to evidence that confirms rather than disappoints moral predispositions, System 1 information processing induces members of the public variously to overestimate some risks and underestimate others relative to the best available evidence, the proper evaluation of which requires exercise of more deliberate and reflective “System 2” forms of information processing.
No, these distinctions have never been established, and the authors are the innumerate ones here. The study devises a trick question, worded in a confusing way. 59% get it wrong, and probably most of those who get it right are just making a lucky guess. Then the question is modified to serve a political purpose.

All the study shows is that if the question is so confusing that no one understands it, then people will sometimes use their prior knowledge to make an educated guess.

Here are some of the confusing things about the skin cream question. The question fails to say that the two groups were randomly selected, or were otherwise similarly situated. It suggests that a large number of patients dropped out of the study, but gives no clue how to account for them. It asks whether "the new cream is likely to make the skin condition better or worse," without making clear that the cream is to be compared to not using the cream. As a result, the question cannot be answered correctly; it is an exercise in guessing which assumptions to make. Unless the study authors understand why 59% give the supposedly incorrect answer, the whole study is worthless.
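For concreteness, here is the arithmetic the study treats as the "correct" answer, sketched in Python. The 2x2 counts below (223 improved / 75 worsened with the cream; 107 improved / 21 worsened without it) are the figures commonly attributed to the study's version of the question and are used here only as an illustration; the point is that answering requires comparing rates, not raw counts, and still rests on the unstated assumption that the two groups are comparable.

```python
def improvement_rate(improved, worsened):
    """Fraction of a group whose skin condition improved."""
    return improved / (improved + worsened)

# Counts commonly attributed to the study's skin-cream table
# (illustrative -- the argument does not depend on the exact numbers).
treated = improvement_rate(223, 75)    # patients who used the cream
control = improvement_rate(107, 21)    # patients who did not

# The intuitive comparison looks at raw counts: 223 > 107, so the
# cream seems to help. The intended comparison is between rates:
print(f"treated rate: {treated:.3f}, control rate: {control:.3f}")
print("cream looks", "better" if treated > control else "worse")
# -> treated rate: 0.748, control rate: 0.836
# -> cream looks worse
```

Note that the "correct" conclusion flips depending on whether one compares counts or rates, and either reading is defensible if the respondent does not assume the two groups were similarly situated to begin with.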

There are many academics, particularly on the Left, who are always trying to concoct arguments that the public would change its political views if it only understood the facts. Maybe so, but the people who write these papers do not understand the facts themselves.
