Datu Dr Andrew Kiyu 24/6/2013

INTERPRETATION OF EPIDEMIOLOGIC EVIDENCE: Truth, Chance and Biases
Dr Andrew Kiyu (MBBS, MPH, DrPH, AM, FACE), Consultant Epidemiologist, Sarawak Health Department
Email address: [email protected]
Sibu Clinical Research Seminar, RH Hotel, Sibu. Date: 24 June 2013

Importance of topic
• Epidemiology success stories have led to improved policy and practice.
• However, observational epidemiology has methodologic limitations that affect the ability to infer causation.
• The major challenge: dealing with the data deluge and uncovering true causal relationships from the millions and millions of observations that are background noise.
• Increased consumer awareness and education.

• Source: NCI's Epidemiology and Genomics Research Program sponsored a workshop entitled "Trends in 21st Century Epidemiology: From Scientific Discoveries to Population Health Impact" on 12–13 December 2012, at http://epi.grants.cancer.gov/workshops/century-trends/

"Interpretation of Epidemiologic Evidence"
Producer of Epidemiologic Evidence → Personal level / Population level → Consumer of Epidemiologic Evidence

Osteoporosis as a Health Issue
• The US Surgeon General estimates that one out of every two women over the age of 50 will have an osteoporosis-related fracture in their lifetime.
• In addition, 20% of those affected by osteoporosis are men, with 6% of white males over the age of 50 suffering a hip fracture.
• It is estimated that the national direct care costs for osteoporotic fractures are US$12.2 to 17.9 billion per year in 2002 dollars, with costs rising.
• This cost is comparable to the Medicare expense for coronary heart disease ($11.6 billion) (Thom et al 2006).
• John A Sunyecz. The use of calcium and vitamin D in the management of osteoporosis. Therapeutics and Clinical Risk Management 2008;4(4):827–836, at http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2621390/pdf/TCRM-4-827.pdf

The Use of Calcium and Vitamin D in the Management of Osteoporosis (2004, 2008)
• Therapeutics and Clinical Risk Management 2008;4(4):827–836, at http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2621390/pdf/TCRM-4-827.pdf
• http://www.surgeongeneral.gov/library/reports/bonehealth/OsteoBrochure1mar05.pdf

Anlene advertisement
• http://adsoftheworld.com/files/images/anlene_spine.jpg


Calcium Supplements Increase Heart Attack Risk by 86 Percent (Heart 2012)
• New research published in the journal Heart has confirmed the findings of two controversial studies on calcium supplementation and heart attack risk published in the British Medical Journal last year, which found a 24–27% increased risk of heart attack for those who took 500 mg of elemental calcium a day.[1][2]
• The results of this newest review, involving 24,000 people between the ages of 35 and 64, were even more alarming: participants who took a regular calcium supplement increased their risk of having a heart attack by 86% versus those who took no calcium supplements at all.
• Conclusions: Increasing calcium intake from diet might not confer significant cardiovascular benefits, while calcium supplements, which might raise MI risk, should be taken with caution.
• Heart 2012;98:920–925. doi:10.1136/heartjnl-2011-301345, at http://heart.bmj.com/content/98/12/920.full.pdf+html

• Flow Galindez. Fight Osteoporosis with two glasses of Anlene everyday. Sunday, April 3, 2011, at http://angsawariko.blogspot.com/2011/04/fight-osteoporosis-with-two-glass-of.html

How to Read a Paper (Greenhalgh T, Taylor R. BMJ 1997) — NOT AN ACADEMIC EXERCISE OR A 'HOW TO' SESSION

1. The Medline database. BMJ 1997;315:180–183 (19 July). http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2127107
2. Getting your bearings (deciding what the paper is about). BMJ 1997;315:243–246 (26 July). http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2127173
3. Assessing the methodological quality of published papers. BMJ 1997;315:305–308 (2 August). http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2127212
4. Statistics for the non-statistician. I: Different types of data need different statistical tests. BMJ 1997;315:364–366 (9 August). http://www.pubmedcentral.nih.gov/picrender.fcgi?artid=2127256&blobtype=pdf
5. Statistics for the non-statistician. II: "Significant" relations and their pitfalls. BMJ 1997;315:422–425 (16 August). http://www.pubmedcentral.nih.gov/picrender.fcgi?artid=2127270&blobtype=pdf
6. Papers that report drug trials. BMJ 1997;315:480–483 (23 August). http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2127321
7. Papers that report diagnostic or screening tests. BMJ 1997;315:540–543 (30 August). http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2127365
8. Papers that tell you what things cost (economic analyses). BMJ 1997;315:596–599 (6 September). http://www.pubmedcentral.nih.gov/picrender.fcgi?artid=2127419&blobtype=pdf
9. Papers that summarise other papers (systematic reviews and meta-analyses). BMJ 1997;315:672–675 (13 September). http://www.pubmedcentral.nih.gov/picrender.fcgi?artid=2127461&blobtype=pdf
10. Papers that go beyond numbers (qualitative research). BMJ 1997;315:740–743 (20 September). http://www.pubmedcentral.nih.gov/picrender.fcgi?artid=2127518&blobtype=pdf

http://eprints.mdx.ac.uk/2981/1/Developing_a_framework_for_critiquing_health_research.pdf

• Natalie Wolchover (Life's Little Mysteries Staff Writer). What If Humans Had Eagle Vision? 24 February 2012, at http://www.lifeslittlemysteries.com/2184-humans-eagle-vision.html


[Roadmap diagram] Truth → Scientific method / paradigm → Epidemiologic research and methods → Study design, analysis → Researcher decisions (whether to submit for publication; which parts to publish) → Journal decision to accept → Published evidence vs unpublished evidence (publication bias) → Producers of Epidemiologic Evidence → Users (Policymakers, Legal, Scientists, Health & medicine, Public) → Interpretation → How scientific decisions are arrived at. Junk science and pseudoscience enter alongside the evidence stream.

Evidence
• Evidence is and includes everything that is used to reveal and determine the truth, and therefore is presumed to be true and related to a case.
• Scientific evidence is evidence which serves to either support or counter a scientific theory or hypothesis. Such evidence is expected to be empirical evidence and in accordance with the scientific method.
• Empirical evidence is a source of knowledge acquired by means of observation or experimentation.
• Refs: http://en.wikipedia.org/wiki/Evidence ; http://en.wikipedia.org/wiki/Scientific_evidence ; http://en.wikipedia.org/wiki/Empirical_evidence

Evidence (research findings)
• Research findings are defined (for the purpose of this paper) as any relationship reaching formal statistical significance.
• However, "negative" research is also very useful (Ioannidis 2005).
• John P. A. Ioannidis. Why most published research findings are false. PLoS Medicine, August 2005; Vol 2, Issue 8, p0696–0701.

Scientific evidence
• By convention a P value greater than 5% (P>0.05) is called "not significant."
• Randomised controlled clinical trials that do not show a significant difference between the treatments being compared are often called "negative."
• This term wrongly implies that the study has shown that there is no difference, whereas usually all that has been shown is an absence of evidence of a difference.
• These are quite different statements.
• Source: Douglas G Altman, J Martin Bland. Statistics notes: Absence of evidence is not evidence of absence. BMJ 1995;311:485, at http://www.bmj.com/content/311/7003/485

Absence of evidence is not evidence of absence
• As Bradford Hill (1965) said nearly 50 years ago: "All scientific work is incomplete—whether it be observational or experimental. All scientific work is liable to be upset or modified by advancing knowledge. That does not confer upon us a freedom to ignore the knowledge we already have, or to postpone the action that it appears to demand at a given time."
• Bradford Hill A. 1965. The environment and disease: association or causation? Proc R Soc Med 58:295–300.
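Altman and Bland's point can be made concrete with a confidence interval. The sketch below uses illustrative numbers of my own (not from any cited trial) and computes an approximate 95% CI for the risk difference in a small "negative" trial: the interval includes zero, so the result is "not significant", yet it also includes a large benefit. That is absence of evidence of a difference, not evidence of absence.

```python
import math

def risk_difference_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk difference between two trial arms with a 95% CI
    (normal approximation; adequate for a sketch, not tiny samples)."""
    p_a, p_b = events_a / n_a, events_b / n_b
    diff = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return diff, diff - z * se, diff + z * se

# Hypothetical small "negative" trial: 12/60 events on treatment, 18/60 on control.
diff, lo, hi = risk_difference_ci(12, 60, 18, 60)
print(f"risk difference {diff:+.2f}, 95% CI ({lo:.2f} to {hi:.2f})")
# The interval spans zero (hence "not significant") but also spans a benefit
# of more than 20 percentage points: the trial has not shown "no difference".
```

A trial this size simply cannot distinguish "no effect" from "clinically important effect", which is exactly the distinction the slide is drawing.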



Why Most Published Research Findings Are False (John P. A. Ioannidis, 2005)
• The probability that a research claim is true may depend on:
  1. study power and bias,
  2. the number of other studies on the same question, and, most importantly,
  3. the ratio of true-to-no-relationships among the relationships probed in each scientific field.
• Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2(8): e124.

PPV of Research Findings for Various Combinations of Power (1−β), Ratio of True to Not-True Relationships (R), and Bias (u)

1−β | R | u | Practical Example | PPV
0.80 | 1:1 | 0.10 | Adequately powered RCT with little bias and 1:1 pre-study odds | 0.85
0.95 | 2:1 | 0.30 | Confirmatory meta-analysis of good-quality RCTs | 0.85
0.80 | 1:3 | 0.40 | Meta-analysis of small inconclusive studies | 0.41
0.20 | 1:5 | 0.20 | Underpowered, but well-performed phase I/II RCT | 0.23
0.20 | 1:5 | 0.80 | Underpowered, poorly performed phase I/II RCT | 0.17
0.80 | 1:10 | 0.30 | Adequately powered exploratory epidemiological study | 0.20
0.20 | 1:10 | 0.30 | Underpowered exploratory epidemiological study | 0.12
0.20 | 1:1,000 | 0.80 | Discovery-oriented exploratory research with massive testing | 0.0010
0.20 | 1:1,000 | 0.20 | As in previous example, but with more limited bias (more standardized) | 0.0015

R = the ratio of the number of "true relationships" to "no relationships" among those tested in the field.
u = the proportion of probed analyses that would not have been "research findings," but nevertheless end up presented and reported as such, because of bias.
The estimated PPVs (positive predictive values) are derived assuming α = 0.05 for a single study. RCT, randomized controlled trial.
Source: Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2(8): e124.
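The PPVs in the table come from a closed-form expression in Ioannidis (2005). A minimal sketch of that calculation, assuming α = 0.05 as in the table:

```python
def ppv(power, R, u, alpha=0.05):
    """Positive predictive value of a claimed research finding,
    after Ioannidis (2005), PLoS Med 2(8): e124.

    power = 1 - beta; R = pre-study odds that a probed relationship
    is true; u = proportion of analyses biased into "findings"."""
    beta = 1 - power
    numerator = power * R + u * beta * R
    denominator = R + alpha - beta * R + u - u * alpha + u * beta * R
    return numerator / denominator

# First row of the table: adequately powered RCT, 1:1 pre-study odds, little bias.
print(round(ppv(0.80, 1.0, 0.10), 2))       # 0.85
# Discovery-oriented exploratory research with massive testing (R = 1:1,000).
print(round(ppv(0.20, 1 / 1000, 0.80), 4))  # 0.001
```

Plugging in each row's power, R, and u reproduces the PPV column, which makes the table's message easy to explore: when pre-study odds are very low, even unbiased, well-powered studies yield mostly false positives.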

Why Most Discovered True Associations Are Inflated (John P. A. Ioannidis, 2008)

True OR | % of exposed individuals in the control group | Sample n per group (equal no. of participants in each group) | Observed OR in significant studies: median (IQR) | Median fold inflation
1.10 | 30 | 1000 | 1.23 (1.23–1.29) | 1.11
1.10 | 30 | 250 | 1.51 (1.49–1.55) | 1.37
1.25 | 30 | 1000 | 1.29 (1.26–1.39) | 1.03
1.25 | 30 | 250 | 1.60 (1.50–1.67) | 1.28
1.25 | 30 | 50 | 2.73 (2.60–3.16) | 2.18
IQR = interquartile range.

Recommendations:
1. Be cautious about effect sizes (and even about the mere presence of any effect in new discoveries)
2. Consider rational down-adjustment of effect sizes
3. Consider analytical methods that correct for anticipated inflation
4. Ignore effect sizes arising from discovery research
5. Conduct large studies in discovery phase
6. Use strict protocols for analyses
7. Adopt complete and transparent reporting of all results
8. Use methodologically rigorous, unbiased replication (potentially ad infinitum)
9. Be fair with interpretation

Source: John P. A. Ioannidis. Why Most Discovered True Associations Are Inflated. Epidemiology 2008;19:640–648.
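The fold inflation in the table is the "winner's curse": conditioning on statistical significance selects overestimates. The simulation below is my own minimal sketch of that mechanism (not Ioannidis's code), using the table's fourth row as the scenario: true OR 1.25, 30% exposure among controls, 250 subjects per group. The median OR among "significant" replicates comes out well above the true 1.25.

```python
import math
import random

random.seed(1)

TRUE_OR = 1.25
P_CONTROL = 0.30                      # exposure prevalence among controls
N = 250                               # subjects per group
odds_case = TRUE_OR * P_CONTROL / (1 - P_CONTROL)
P_CASE = odds_case / (1 + odds_case)  # implied exposure prevalence among cases

def one_study():
    """Simulate one case-control study; return (observed OR, significant?)."""
    a = sum(random.random() < P_CASE for _ in range(N))     # exposed cases
    c = sum(random.random() < P_CONTROL for _ in range(N))  # exposed controls
    b, d = N - a, N - c                                     # unexposed counts
    log_or = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)           # Woolf's SE of log OR
    return math.exp(log_or), abs(log_or / se) > 1.96        # two-sided 5% test

results = [one_study() for _ in range(4000)]
sig_ors = sorted(or_ for or_, sig in results if sig)
median_sig = sig_ors[len(sig_ors) // 2]
print(f"significant in {len(sig_ors)}/4000 replicates; "
      f"median OR among them: {median_sig:.2f} (true OR {TRUE_OR})")
```

Because the study is underpowered, only replicates that happen to overshoot the true effect cross the significance threshold, so the ones that get reported are inflated, which is the pattern the table quantifies.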




Truth, Reality, Scientific Fact
• "Truth" involves both the quality of "faithfulness, fidelity, loyalty, sincerity, veracity", and that of "agreement with fact or reality".
• In philosophy, reality is the state of things as they actually exist, rather than as they may appear or might be imagined.
• A scientific fact is an objective and verifiable observation, in contrast with a hypothesis or theory, which is intended to explain or interpret facts.
• Refs: http://en.wikipedia.org/wiki/Truth ; http://en.wikipedia.org/wiki/Reality ; http://en.wikipedia.org/wiki/Fact

Truth in Science
• "Truth in science can be defined as the working hypothesis best suited to open the way to the next better one." —Konrad Lorenz (1903–1989), Austrian zoologist, ethologist, and ornithologist; 1973 Nobel Prize in Physiology or Medicine for discoveries in individual and social behavior patterns.
• Scientific truth is a moving target.
• A major problem is that it is impossible to know with 100% certainty what the truth is in any research question.
• In this regard, the pure "gold" standard is unattainable (Ioannidis 2005).
• Ref: Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2(8): e124.

Truth and Time
• Lind's discovery (1753) that lime (vitamin C) prevents scurvy was not adopted by the British Navy for a full 40 years (1795).
• Percival Pott's discovery (in 1775) about how to prevent scrotal cancer was not adopted in England for nearly a century, though quickly adopted in Denmark.
• The classic papers on lung cancer and tobacco smoke, published in the Journal of the American Medical Association by Wynder and Graham and Doll and Hill (in 1950), were almost rejected by the editor because of the lack of existing knowledge supporting the association.
• Despite numerous studies yielding similar findings, eminent statisticians (R.A. Fisher, Berkson) remained highly sceptical for many years.
• Sources:
  – Victor J. Schoenbach with Wayne D. Rosamond. Understanding the Fundamentals of Epidemiology: an evolving text. p.25. Department of Epidemiology, School of Public Health, University of North Carolina at Chapel Hill. Fall 2000 revision. www.epidemiolog.net
  – Hill G, Millar W, Connelly J. "The great debate" 1: Smoking, lung cancer, and cancer epidemiology. Canadian Bulletin of Medical History 2003;20(2):367–386. www.cbmh.ca/index.php/cbmh/article/view/538/535

SCIENTIFIC METHOD

What is Science?
• "Science is an attempt to understand the world, get a grip on things, get hold of ourselves, steer a safe course.
• Microbiology and meteorology now explain what only a few centuries ago was considered sufficient cause to burn women to death."
• ― Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark. New York: Random House. 1995.


Scientific Method
• The Oxford English Dictionary defines the scientific method as: "a method or procedure that has characterized natural science since the 17th century, consisting in systematic observation, measurement, experiment, and hypothesis formulation, testing, and modification."
• The chief characteristic which distinguishes the scientific method from other methods of acquiring knowledge is that scientists seek to let reality speak for itself: they support a theory when the theory's predictions are confirmed and challenge a theory when its predictions prove false.
• http://en.wikipedia.org/wiki/Scientific_method

Karl Popper
• Popper is known for his attempt to repudiate the classical observationalist / inductivist form of scientific method in favour of empirical falsification.
• Sir Karl Popper, 1902–1994 (aged 92). http://en.wikipedia.org/wiki/Karl_Popper

Science as Falsification (Karl Popper 1963)
• "One can sum up all this by saying that the criterion of the scientific status of a theory is its falsifiability, or refutability, or testability" (based on the 7 reasons that Popper gave during his speech).
• "These considerations led me in the winter of 1919–20 to conclusions which I may now reformulate as follows.
  1. It is easy to obtain confirmations, or verifications, for nearly every theory — if we look for confirmations.
  2. Confirmations should count only if they are the result of risky predictions;
  3. … 4, 5, 6, 7"
• Karl R. Popper. Science as falsification. Excerpt originally published in Conjectures and Refutations (1963). At http://www.stephenjaygould.org/ctrl/popper_falsification.html

Induction: "all swans are white". Falsification: "Not all swans are white".
• http://www.blackswanconsultinggroup.com/wp-content/themes/BlackSwan/images/backgrounds/BlackWaterWhiteSwans.jpg
• http://himg2.huanqiu.com/attachment2010/2012/0606/20120606111915885.jpg


Popperian Epidemiology
• Platt (1964) has suggested that it is easier to follow Popper's criterion of falsifiability if one makes multiple hypotheses to explain a phenomenon.
• As Lakatos (1968) emphasizes, the importance of Popper's philosophy lies in its use of refutability to choose which among competing hypotheses should be pursued, rather than to reject hastily any single theory.
• Platt, J. R.: Strong inference. Science 146:347, 1964.
• Lakatos, I.: Criticism and the methodology of scientific research programmes. Proc. Aristotelian Soc. 69:149, 1968.
• Ref: Buck, C. Popper's philosophy for epidemiologists. Int. J. Epid. 1975;4:159–168.

EPIDEMIOLOGIC RESEARCH

Expectations from Epidemiologic Research
Goal for epidemiologic research: the quantification of the causal relation between exposure and disease.

Lowest expectation:
• A goal so constrained, modest, and technical in nature that even our successes have no practical value.
• E.g., we could define the goal of epidemiology as the mechanical process of gathering and analyzing data and generating statistical results, such as odds ratios or regression coefficients, divorced from potential inferences and applications.

Highest expectation:
• To generate knowledge that contributes directly to improvements in the health of human populations.
• Such research would yield new knowledge, and that new knowledge would have beneficial applications to advancing public health.

• The ideal study yields a quantitative measure of association that reflects the causal influence of exposure on disease.
• Methodologic problems and errors cause a deviation between the study results and this ideal measure.
Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press, Oxford, 2003. p.7, 9.

How To Define Effectiveness Of Epidemiologic Research
• To evaluate the quality or strength of epidemiologic evidence, we first need to clarify what information we can expect epidemiology to provide.
• The effectiveness of epidemiologic research must be defined in relation to attainable, specific benchmarks in order to make judgments about how close the evidence comes to reaching the desired state of perfection.
• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press, Oxford, 2003. p.7.

EPIDEMIOLOGIC METHODS


• Gary Taubes. Epidemiology faces its limits. Science, New Series, Vol 269, No. 5221 (Jul. 14, 1995): 164–165, 167–169.

Epidemiology Faces its Limits
• In the once fertile garden of epidemiology, all is not well (according to some commentators).
• The low-hanging fruit has been plucked, and the epidemiological ladder is not long enough to bring the remainder within reach.
• Possibly the most famous expression of this dissatisfaction is a report by a journalist writing in Science in 1995 called "Epidemiology Faces Its Limits".

• Gary Taubes cites a number of contrary findings, where exposures have been found to be harmful and then safe (or vice versa) in different studies, or harmful in different ways, or harmful when studied using one study design but not when using another.

• Taubes interviews a number of eminent epidemiologists and reaches a simple diagnosis: epidemiology has spotted the big effects already, and is now scrabbling around trying to identify small ones.
• These are much harder to distinguish from biases or chance effects.
• Indeed, he hypothesizes that epidemiological methods may be unable to tell the difference at all, in some cases.
• In this sense, Taubes suggests, epidemiology is facing its limits.

• Ref: Gary Taubes. Epidemiology faces its limits. Science, New Series, Vol 269, No. 5221 (Jul. 14, 1995): 164–165, 167–169.
• Alex Broadbent (April 17, 2012). Taubes' Tautology. http://philosepi.wordpress.com/category/aims-of-epidemiology/

RESEARCHER'S DECISIONS


Aggressive Discoverer Versus Reflective Replicator

Aspect | Aggressive discoverer | Reflective replicator
What matters is … | Discovery | Replication
Databases are … | Private goldmines not to be shared | Public commodity
A good epidemiologist … | Can think of more exploratory analyses | Is robust about design and analysis plan
One should report … | What is interesting | Everything
Publication mode | Publish each association as a separate paper | Publish everything as a single paper
After reporting … | Push your findings forward | Be critical/cautious

Source: John P. A. Ioannidis. Why Most Discovered True Associations Are Inflated. Epidemiology 2008;19:640–648.

JOURNAL'S DECISION TO ACCEPT PAPER FOR PUBLICATION

Measures of importance of a study (journal editors' point of view)
• At high-impact journals we (the Editors) see it as our job to select important articles. This means the conclusions reported should be more rather than less likely to be true.
• Better measures of importance are that a study should address a substantial clinical or public-health question, in as rigorous a way as possible, and the findings should be likely to have an effect on how other researchers think about the question.

Minimizing Mistakes and Embracing Uncertainty
• Research progress depends on dissemination of results, and journal articles are the most effective tool we currently have to share them.
• The answer, therefore, cannot be that we wait until conclusions are proven beyond a doubt before we publish them.
• Publication of preliminary findings, negative studies, confirmations, and refutations is an essential part of the process of getting closer to the truth.
• Source: The PLoS Medicine Editors (2005) Minimizing Mistakes and Embracing Uncertainty. PLoS Med 2(8): e272. doi:10.1371/journal.pmed.0020272


Minimizing Mistakes and Embracing Uncertainty (continued)
• Too often editors and reviewers reward only the cleanest results and the most straightforward conclusions.
• At PLoS Medicine, we seek to create a publication environment that is comfortable with uncertainty.
• We encourage authors to discuss biases, study limitations, and potential confounding factors.
• We acknowledge that most studies published should be viewed as hypothesis-generating, rather than conclusive.
• And we publish high-quality negative and confirmatory studies.
• Source: The PLoS Medicine Editors (2005) Minimizing Mistakes and Embracing Uncertainty. PLoS Med 2(8): e272. doi:10.1371/journal.pmed.0020272

USERS OF EPIDEMIOLOGIC EVIDENCE


Growing interest in understanding and using epidemiology
By media, courtroom, public, and policy-makers, with two extreme points of view compared to more experienced epidemiologists:
• Some are so impressed with our findings that observed associations between agents and disease are taken as direct reflections of causal effects, with little need for scrutiny or caution.
• Others are more impressed with the lengthy list of potential sources of error, the ubiquitous potential confounders, and the seemingly unending controversy and flow of criticism among epidemiologists, and believe that all our observations are hopelessly flawed and cannot be trusted as indicators of causal relations.
• More experienced epidemiologists appreciate that the truth lies somewhere between the extremes.
Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press, Oxford, 2003. p.v.

How the Public View Epidemiologic Evidence
• Cartoon: http://adai.files.wordpress.com/2006/12/borgman042797_600x385.jpg%3Fw%3D508%26h%3D397
• Cartoon used by: (1) Kenneth Rothman when he delivered the 12th Annual Saward-Berg Lecture on the Public Perception of Epidemiology (Friday, March 9, 2012, at the University of Rochester School of Nursing Auditorium), and (2) Cristine Russell. Living Can Be Hazardous to Your Health: How the News Media Cover Cancer Risks. Monogr Natl Cancer Inst 1999;25:167–70.

Difference In Expectations From Epidemiologic Evidence Between Researchers And Public
There may be tension between the:
• Researchers: cautious, wishing to ask narrow, modest questions of the data (for which it may be well-suited).
• Public: wishing to ask the broadest possible questions of ultimate societal interest (for which the data are often deficient).
• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press, Oxford, 2003. p.13.


How the Public View Epidemiologic Evidence (cartoon)
• "The scientific community is divided. Some say this stuff is dangerous, so…" - New Yorker cartoon by Mischa Richter, published March 21, 1988. http://imgc.allpostersimages.com/images/P-473-488-90/60/6063/TMED100Z/posters/mischa-richter-the-scientific-community-is-divided-some-say-this-stuff-is-dangerous-so-new-yorker-cartoon.jpg
• Cartoon used by: (1) Kenneth Rothman, 12th Annual Saward-Berg Lecture on the Public Perception of Epidemiology (March 9, 2012), and (2) Cristine Russell. Living Can Be Hazardous to Your Health: How the News Media Cover Cancer Risks. Monogr Natl Cancer Inst 1999;25:167–70.

Junk Science
• Definition of junk science: "faulty scientific data and analysis used to further a special agenda" (junkscience.com).
• David Michaels and Celeste Monforton. Manufacturing Uncertainty: Contested Science and the Protection of the Public's Health and Environment. Am J Public Health. 2005;95:S39–S48. doi:10.2105/AJPH.2004.043059
• Junkscience.com, at http://junksciencearchive.com/define.html

Junk Science
• Opponents of public health and environmental regulations often try to "manufacture uncertainty" by questioning the validity of the scientific evidence on which the regulations are based.
• This strategy is most identified with the tobacco industry, but has also been used by producers of other hazardous products.
• Its proponents use the label "junk science" to ridicule research that threatens powerful interests.

Manufacturing Uncertainty
• This strategy of manufacturing uncertainty is antithetical to the public health principle that decisions be made using the best evidence available.
• The public health system must ensure that scientific evidence is evaluated in a manner that assures the public's health and environment will be adequately protected.
• David Michaels and Celeste Monforton. Manufacturing Uncertainty: Contested Science and the Protection of the Public's Health and Environment. Am J Public Health. 2005;95:S39–S48. doi:10.2105/AJPH.2004.043059

"Absence of evidence is not evidence of absence" (as used by pseudo-science)
• Appeal to ignorance: the claim that whatever has not been proved false must be true, and vice versa.
• (E.g., "There is no compelling evidence that UFOs are not visiting the Earth; therefore UFOs exist, and there is intelligent life elsewhere in the universe.")
• This impatience with ambiguity can be criticized in the phrase: absence of evidence is not evidence of absence.
• ― Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark. New York: Random House. 1995. Ch. 12: The Fine Art of Baloney Detection, p. 221.

USE OF SCIENTIFIC EVIDENCE


Layers of application of epidemiologic evidence
• Study population: using the data to estimate the quantitative effect of exposure on the occurrence of disease in the study population.
• Beyond the study population, but with similar characteristics: use of that evidence to estimate the effect of exposure on disease for a broader population outside the study, but otherwise socially and demographically similar to the study population, perhaps to help formulate policy.
• The public at large: whether a change in clinical practice or individual behavior is warranted. This may be beyond the epidemiologic data from the study.

Sufficiency of Epidemiologic Data: Feasibility and Costs of Actually Modifying Behaviour

• As the goal is expanded, moving from a characterization of the risks and benefits of alternative courses of action to the question of what should be done and how to achieve the desired end, the sufficiency of even impeccable epidemiologic information diminishes, and considerations outside of epidemiology often become increasingly prominent.
• Beyond the scope of epidemiology come the feasibility and costs of actually modifying exposure through behaviour change, clinical guidelines, or regulation.

• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003. P.13

What is Sufficient Epidemiologic Evidence for a Policy Decision?

• Policy decisions or individual decisions can be viewed as an integrated assessment of the risks and benefits among alternative courses of action.
• Thus "sufficient epidemiologic evidence" is specific to the situation, depending on the weight of other factors promoting or discouraging the different policy options.
• For a given level of epidemiologic evidence, extraneous considerations define whether the balance tips in favor of one action or another.

• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003. P.24-25

INTERPRETATION OF SCIENTIFIC EVIDENCE


Goal of interpreting Epidemiologic Evidence

• In order to make informed judgments about the results, we assess how much confidence one can have in a given set of findings, by assessing the meaning and persuasiveness of the epidemiologic evidence.

• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003. P.v

How to Measure Causal Relations Between Exposure and Disease

• Rothman (1986) emphasised the estimation of causal effects as the focus of epidemiology: he tried to dislodge epidemiologists from testing statistical hypotheses as a goal, and to persuade them to focus on quantifying measures of association.
• With measurement as the goal, the assessment of epidemiologic evidence focuses on the aspects of study design, conduct, and analysis that may introduce distortion or enhance the accuracy of measurement.

• Sources:
• Kenneth J Rothman. Modern Epidemiology. Boston: Little, Brown and Co., 1986. (Now in its third edition (2008), by Kenneth J. Rothman, Sander Greenland, and Timothy L. Lash.)
• DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003. P.7, P.10

Alternative Explanations in Assessment of Causality

The alternative explanations of an observed association:
1. Chance¹
2. Bias: information bias¹ (recall bias, interviewer bias, reporting bias, surveillance bias)
3. Bias: selection bias (including publication bias)¹
4. Bias: misclassification bias¹ (erroneous classification of the exposure or disease status of an individual into a category to which it should not be assigned)
5. Constructs² (how effectively the operational definitions approximate the constructs of ultimate interest)
6. Bias: analytic bias¹
7. Confounding¹

References:
1. Kenneth J. Rothman and Sander Greenland. Causation and Causal Inference in Epidemiology. Am J Public Health. 2005;95:S144–S150. doi:10.2105/AJPH.2004.059204
2. DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003

Hill's Criteria of Causation (1965)

• Hill asked: "What aspects of that association should we especially consider before deciding that the most likely interpretation of it is causation?"
• He suggested that the following aspects of an association be considered in attempting to distinguish causal from noncausal associations:
1. strength,
2. consistency,
3. specificity,
4. temporality,
5. biological gradient,
6. plausibility,
7. coherence,
8. experimental evidence, and
9. analogy.

• Source: Hill AB. The environment and disease: association or causation? Proc R Soc Med. 1965;58:295–300; cited in WHO Regional Office for Europe (2000), Evaluation and Use of Epidemiological Evidence for Environmental Health Risk Assessment, p. 17-18.
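Item 7 above, confounding, can be made concrete with a small numeric sketch. The counts below are invented for illustration (they are not from any study cited here): within each stratum of a confounder the exposed and unexposed have identical risks, yet the crude (pooled) comparison suggests a strong association, because the confounder is linked to both exposure and outcome.

```python
def risk_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    """Risk ratio = risk among exposed / risk among unexposed."""
    return (cases_exp / n_exp) / (cases_unexp / n_unexp)

# Hypothetical confounder with two strata (e.g. an age group):
# Stratum 1: mostly exposed, low baseline risk (10% in both groups)
s1 = dict(cases_exp=9, n_exp=90, cases_unexp=1, n_unexp=10)
# Stratum 2: mostly unexposed, high baseline risk (30% in both groups)
s2 = dict(cases_exp=3, n_exp=10, cases_unexp=27, n_unexp=90)

rr1 = risk_ratio(**s1)   # 1.0: no association within stratum 1
rr2 = risk_ratio(**s2)   # 1.0: no association within stratum 2

# Crude analysis: collapse the strata and ignore the confounder.
crude = risk_ratio(
    s1["cases_exp"] + s2["cases_exp"], s1["n_exp"] + s2["n_exp"],
    s1["cases_unexp"] + s2["cases_unexp"], s1["n_unexp"] + s2["n_unexp"],
)
print(rr1, rr2, round(crude, 2))  # crude RR is about 0.43 despite null stratum RRs
```

The crude analysis would wrongly suggest the exposure is protective; stratifying on (or adjusting for) the confounder removes the spurious association.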


RECOMMENDED STEPS BY SAVITZ (2003)

Interpretation of Results: Key Questions

1. How good are the data?
2. Could chance or bias explain the results?
3. How do the results compare with those from other studies?
4. What theories or mechanisms might account for the findings?
5. What new hypotheses are suggested?
6. What are the next research steps?
7. What are the clinical and policy implications?

• Source: Victor J. Schoenbach and Wayne D. Rosamond. Understanding the Fundamentals of Epidemiology: an evolving text. Department of Epidemiology, School of Public Health, University of North Carolina at Chapel Hill. Fall 2000 Edition. P.481

Estimation of Measures of Effect

• The starting point for evaluating the validity of results is to calculate and present estimates of the effect measure or measures of primary interest.

• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003. P.34
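Key question 2, whether chance alone could explain the results, can be illustrated with a short simulation. This is a sketch with arbitrary parameters (not taken from any cited source): many independent "null studies" compare two groups drawn from the same population, and even with no true effect roughly 5% of comparisons cross the conventional p < 0.05 threshold by chance alone.

```python
import math
import random

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided p-value for a two-proportion z-test (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    if se == 0:
        return 1.0  # no variation observed; no evidence against the null
    z = abs(p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

random.seed(1)
TRUE_RISK = 0.10   # identical disease risk in "exposed" and "unexposed"
N = 500            # subjects per group
REPS = 2000        # number of independent null studies

false_positives = 0
for _ in range(REPS):
    x1 = sum(random.random() < TRUE_RISK for _ in range(N))
    x2 = sum(random.random() < TRUE_RISK for _ in range(N))
    if two_proportion_p(x1, N, x2, N) < 0.05:
        false_positives += 1

print(f"'Significant' null studies: {false_positives / REPS:.1%}")
```

This is why an isolated "significant" association, especially one found after many comparisons, is weak evidence on its own.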

Estimation of Measures of Effect

• The measure of interest is quantitative, not qualitative.
• Thus, the object of evaluation is not a statement of a conclusion, e.g., that exposure does or does not cause disease, or that an association is or is not present.
• Instead, the product of the study is a measurement of effect and a quantification of the uncertainty in that estimate, e.g., we estimate that the risk of disease is 2.2 times greater among exposed than unexposed persons (with a 95% confidence interval of 1.3 to 3.7).

• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003. P.34

Summary Judgement of Experts

• The ideal comprehensive, quantitative, objective assessment of evidence is, of course, unattainable in practice, but serves as a standard to which interpreters of epidemiologic evidence should aspire.

• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003. P.30
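The kind of estimate Savitz describes, a risk ratio with its 95% confidence interval, can be computed directly from a 2×2 table. The counts below are invented for illustration (chosen only so the result resembles the example in the text), and the interval uses the standard log-transform method:

```python
import math

def risk_ratio_ci(a, n1, c, n0, z=1.96):
    """Risk ratio and 95% CI via the usual log-transform method.

    a = cases among n1 exposed; c = cases among n0 unexposed.
    """
    rr = (a / n1) / (c / n0)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Invented counts: 44 of 200 exposed and 20 of 200 unexposed develop disease.
rr, lo, hi = risk_ratio_ci(44, 200, 20, 200)
print(f"RR = {rr:.1f}, 95% CI {lo:.2f} to {hi:.2f}")  # RR = 2.2, 95% CI 1.35 to 3.59
```

Reporting the interval, rather than only "significant / not significant", is exactly the quantitative product Savitz argues a study should deliver.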


Summary Judgement of Experts

• An easier, and perhaps more commonly used, approach to assessing evidence is to rely on a summary judgment of experts, either individually or as a committee.
• Some examples of judgement by experts:
1. peer review of manuscripts submitted for publication,
2. informal assessment by colleagues, and
3. consensus conferences.

Advantages of Relying on Authoritative Individuals or Groups

• The advantages of relying on authoritative individuals or groups are the:
1. speed with which a summary of the evidence can be generated,
2. ease of explaining the process (at least at a superficial level) to outsiders, and
3. credibility that authorities have among both experts and non-experts.

• Source: DAVID A. SAVITZ. Interpreting Epidemiologic Evidence: Strategies for Study Design and Analysis. Oxford University Press. Oxford. 2003. P.30

Summary

• For CPD:
– Read the 10 BMJ articles on "How to Read a Paper" (free download).
– Read David Savitz's book (available for download, free).
• For major decision-making or advice:
– Work as an expert team; don't make recommendations alone.
• For personal use:
– Don't try anything that needs drastic changes to lifestyle or finance until the evidence is more firm.
• For outbreak investigation, remember:
– The answer lies in the differences (Karl Popper and the white swans), not in the similarities.
