Wednesday, November 29, 2023

How Statistics Can Mislead

If you think you can draw causal conclusions from study data, consider Simpson's paradox:
Another example comes from a real-life medical study[17] comparing the success rates of two treatments for kidney stones.[18] The table below shows the success rates (the term success rate here actually means the success proportion) and numbers of treatments for treatments involving both small and large kidney stones, where Treatment A includes open surgical procedures and Treatment B includes closed surgical procedures. ...

The paradoxical conclusion is that treatment A is more effective when used on small stones, and also when used on large stones, yet treatment B appears to be more effective when considering both sizes at the same time. In this example, the "lurking" variable (or confounding variable) causing the paradox is the size of the stones, which was not previously known to researchers to be important until its effects were included.
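A small Python sketch can make the reversal concrete. The counts below are the figures commonly quoted for this study (the table itself is elided from the excerpt above), so treat them as illustrative rather than as the post's own data:

```python
# Sketch of the Simpson's paradox reversal in the kidney-stone example.
# (successes, total) per treatment and stone size -- assumed figures,
# as the table is not reproduced in the excerpt above.
data = {
    "Treatment A": {"small": (81, 87), "large": (192, 263)},
    "Treatment B": {"small": (234, 270), "large": (55, 80)},
}

for treatment, groups in data.items():
    # Success proportion within each stone-size group
    for size, (s, n) in groups.items():
        print(f"{treatment}, {size} stones: {s / n:.1%} ({s}/{n})")
    # Success proportion aggregated over both stone sizes
    s_all = sum(s for s, _ in groups.values())
    n_all = sum(n for _, n in groups.values())
    print(f"{treatment}, combined: {s_all / n_all:.1%} ({s_all}/{n_all})")

# Treatment A wins within each stone size (~93% vs ~87% on small,
# ~73% vs ~69% on large), yet Treatment B wins on the combined totals
# (~83% vs ~78%), because the harder large-stone cases went
# disproportionately to Treatment A -- stone size is the lurking variable.
```

Running the sketch shows the per-group and combined proportions side by side, which is all it takes to see why ignoring the confounder flips the apparent ranking of the treatments.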

You can get an explanation in the above Wikipedia article, or here.

2 comments:

CFT said...

Now AI is going to enter the fray, and statistical assumptions, gaffes, and outright deceptions are going to be buried under the authoritarian premise that an AI is unbiased and incapable of drawing incorrect conclusions.

This is called 'lying by complexity'. Under this fallacy you can disguise any untruth by couching it inside something too complicated for the layman to really unravel.

I have contempt for all flavors of argument from authority. With AI, our scientists are creating a curtain of bullshit so convoluted that nothing behind it will be visible.

CFT said...

Roger,
Timcast on YouTube has an excellent posting on an AI that was used at Sports Illustrated to publish articles. They even used AI to create a fake account and face for a fake person to serve as a mouthpiece, making the illusion look more realistic. This is actually my primary concern: AI will allow effortless deception that humanity will never be able to keep up with or get to the bottom of quickly enough to be practical. I believe we are definitely headed into the 'Dark Age of un-enlightenment', where realistic deception generated by AI is going to be the primary means by which power is seized and controlled.

Skynet is going to bullshit us to death before even a single missile is fired. Sad.