Thomas Gilovich, Robert Vallone and Amos Tversky (I’ll call them G, V and T) made quite a splash back in 1985 with a claimed debunking of the hot hand “myth”. According to the authors, a player who has just made his foul shot is thereby rendered neither more nor less likely to make the next one. Statistician Andrew Gelman explains, and so does a WSJ article.
Now a new paper by Joshua Miller and Adam Sanjurjo (call them M and S) claims that G, V and T drew the wrong conclusions.
Economist Steven Landsburg compares the issue to this Google interview question:
In a country in which people only want boys, every family continues to have children until they have a boy. If they have a girl, they have another child. If they have a boy, they stop. What is the proportion of boys to girls in the country?

The answer is that each birth has a 50% probability of being a boy, regardless of previous births, parental intent, or anything else. (Technically it is closer to 51% boys, but the puzzle assumes 50%.) So the country will have about the same number of boys and girls.
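A quick simulation makes this concrete. It is just a sketch; the family count, the trial setup, and the seed are my own choices, not part of the puzzle:

```python
import random

def simulate_country(num_families, rng):
    """Each family keeps having children until its first boy, then stops."""
    boys = girls = 0
    for _ in range(num_families):
        while True:
            if rng.random() < 0.5:  # boy with probability 1/2
                boys += 1
                break               # the family stops after a boy
            girls += 1              # girl: the family tries again
    return boys, girls

rng = random.Random(0)
boys, girls = simulate_country(1_000_000, rng)
print(boys / girls)  # hovers around 1.0: roughly equal numbers of boys and girls
```

Every family ends up with exactly one boy, and the number of girls per family averages out to one as well, which is why the totals match.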
Landsburg points out that if the country has only 5 families, and if they are all allowed to complete their plans, and if you adopt a peculiar weighting of the boys and girls when computing averages, then you can get a different number. In particular, if kids from larger families are weighted less, then you expect fewer girls, because the larger families are the ones full of girls. You can read his posts for the details. He seems to feel very strongly that there is some merit to this set of strange assumptions.
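You can see the effect of that weighting with a small variation on the same simulation. Again only a sketch; the choice of 5 families and the number of trials are mine:

```python
import random

def girl_fraction(num_families, rng):
    """Fraction of girls in one small country where each family stops at its first boy."""
    boys = girls = 0
    for _ in range(num_families):
        while rng.random() >= 0.5:  # girl: keep trying for a boy
            girls += 1
        boys += 1                   # boy: this family stops
    return girls / (girls + boys)

rng = random.Random(0)
trials = 200_000
average = sum(girl_fraction(5, rng) for _ in range(trials)) / trials
print(average)  # noticeably below 0.50 for a 5-family country
```

Averaging the fraction country by country is what down-weights children from the big, girl-heavy families; the total counts of boys and girls across all the simulated countries still come out essentially equal.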
Your first reaction will be to wonder why anyone would interpret the puzzle as using such a stupid weighting. The intent of the puzzle seems to be to ask whether a country like China could end up with an excess of boys because of how parents decide when to stop having kids. The answer is no.
It appears that the 1985 hot hand analysis did indeed use such a stupid weighting to claim to prove that there was no such thing as a hot hand. The paper was widely acclaimed as a great work, and it is written up in textbooks. It helped build the reputation of the research program that got Tversky's longtime collaborator, Kahneman, one of those fancy Sweden prizes. One of the biggest-selling science books of the last 5 years, if not the biggest, is a long story by Kahneman in which this is held up as an example of a human cognitive bias: we think we see hot hands, and the math supposedly proves that we don't.
For 30 years, no one noticed that the crazy weighting made the analysis wrong. Other studies show that basketball players do have hot hands.
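The coin-flip version of the bias M and S describe is easy to reproduce: for each short sequence of fair flips, compute the proportion of heads among flips that immediately follow a head, then average that statistic across sequences, which is roughly analogous to the per-player averaging in the 1985 analysis. Sequences of length 4 are the textbook illustration from M and S, not the actual shooting data, and the code below is only a sketch:

```python
import random

def hit_rate_after_hit(flips):
    """Proportion of hits (1s) among flips that immediately follow a hit.
    Returns None if no flip follows a hit, i.e. the statistic is undefined."""
    followers = [flips[i + 1] for i in range(len(flips) - 1) if flips[i] == 1]
    return sum(followers) / len(followers) if followers else None

rng = random.Random(0)
rates = []
for _ in range(200_000):
    flips = [rng.randint(0, 1) for _ in range(4)]  # 4 fair flips: no hot hand by construction
    rate = hit_rate_after_hit(flips)
    if rate is not None:
        rates.append(rate)          # collect the per-sequence statistic

print(sum(rates) / len(rates))  # roughly 0.40 rather than 0.50: the estimator is biased downward
```

So a shooter whose makes and misses really are independent coin flips will look, under this statistic, as if he shoots worse after a hit, which means a genuine hot hand can be hidden by the bias.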
I have criticized Kahneman's book several times on this blog, on other grounds. A lot of these decision-theory arguments about cognitive bias seem dubious to me. The whole field must have very poor standards if such a simple error could survive for so long without anyone noticing.