Tuesday, December 3, 2013

Excerpt from "Policy: Twenty tips for interpreting scientific claims"

This article by Sutherland, Spiegelhalter, and Burgman in the journal Nature is a great primer on the limitations of science and on how to interpret scientific claims.

-------------------

Policy: Twenty tips for interpreting scientific claims
William J. Sutherland, David Spiegelhalter & Mark Burgman
Nature, 20 November 2013



And there is rarely, if ever, a beautifully designed double-blind, randomized, replicated, controlled experiment with a large sample size and unambiguous conclusion that tackles the exact policy issue.

In this context, we suggest that the immediate priority is to improve policy-makers' understanding of the imperfect nature of science. The essential skills are to be able to intelligently interrogate experts and advisers, and to understand the quality, limitations and biases of evidence. We term these interpretive scientific skills. These skills are more accessible than those required to understand the fundamental science itself, and can form part of the broad skill set of most politicians.

Politicians with a healthy scepticism of scientific advocates might simply prefer to arm themselves with this critical set of knowledge.

We are not so naive as to believe that improved policy decisions will automatically follow. We are fully aware that scientific judgement itself is value-laden, and that bias and context are integral to how data are collected and interpreted. What we offer is a simple list of ideas that could help decision-makers to parse how evidence can contribute to a decision, and potentially to avoid undue influence by those with vested interests. The harder part - the social acceptability of different policies - remains in the hands of politicians and the broader political process.


The 20 SSB Tips (Sutherland-Spiegelhalter-Burgman)
1. Differences and chance cause variation.
2. No measurement is exact.
3. Bias is rife.
4. Bigger is usually better for sample size. 
5. Correlation does not imply causation.
6. Regression to the mean can mislead (see the first sketch after this list).
7. Extrapolating beyond the data is risky.
8. Beware the base-rate fallacy (see the worked example after this list).
9. Controls are important.
10. Randomization avoids bias.
11. Seek replication, not pseudoreplication.
12. Scientists are human.
13. Significance is significant.
14. Separate no effect from non-significance.
15. Effect size matters.
16. Study relevance limits generalizations.
17. Feelings influence risk perception. 
18. Dependencies change the risks.
19. Data can be dredged or cherry picked (see the multiple-testing sketch after this list).
20. Extreme measurements may mislead.
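
To make tip 6 (and tip 20) concrete, here is a minimal simulation sketch in Python, using invented numbers: units selected because their first noisy measurement was extreme tend to score closer to the overall mean when measured again, even though nothing about them changed.

    import numpy as np

    rng = np.random.default_rng(0)
    true_level = rng.normal(50, 10, size=10_000)          # each unit's underlying level
    first = true_level + rng.normal(0, 10, size=10_000)   # first noisy measurement
    second = true_level + rng.normal(0, 10, size=10_000)  # independent re-measurement

    top = first > np.percentile(first, 95)                # keep only the extreme first scores
    print(f"selected units, first measurement:  {first[top].mean():.1f}")   # well above 50
    print(f"selected units, second measurement: {second[top].mean():.1f}")  # drifts back toward 50

The selected group looks worse on re-measurement not because of any intervention, but because part of its initial advantage was noise.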
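Tip 8 can be illustrated with a short worked calculation; the prevalence, sensitivity, and specificity below are made-up values chosen only to show the effect: when a condition is rare, most positive results from even a fairly accurate test are false alarms.

    # Made-up screening numbers, for illustration only.
    prevalence  = 0.01   # 1% of people actually have the condition
    sensitivity = 0.99   # P(test positive | condition)
    specificity = 0.95   # P(test negative | no condition)

    true_positives  = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)

    # Bayes' rule: chance that a positive result reflects a real case
    ppv = true_positives / (true_positives + false_positives)
    print(f"P(condition | positive test) = {ppv:.2f}")   # about 0.17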
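Tip 19 (and the caution behind tips 13 and 14) can be seen in another small simulation with arbitrary settings, assuming NumPy and SciPy are available: if you run enough comparisons on data where no real effect exists, some of them will clear the conventional p < 0.05 bar by chance alone.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_comparisons, n_per_group = 100, 30

    significant = 0
    for _ in range(n_comparisons):
        a = rng.normal(0, 1, n_per_group)   # both groups drawn from the same distribution,
        b = rng.normal(0, 1, n_per_group)   # so every "effect" here is pure noise
        _, p = stats.ttest_ind(a, b)
        significant += p < 0.05

    print(f"{significant} of {n_comparisons} null comparisons were 'significant' at p < 0.05")

Reporting only the handful of comparisons that happened to come out "significant" is exactly the dredging and cherry picking the tip warns against.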

