Research integrity (and ethics in science)

Sabine Van Doorslaer

Research integrity issues (and possible solutions) are discussed widely in the press all over the world. Most university governors have taken research integrity as a central theme in their policy.

LOCAL EXAMPLE: "Flanders 2040 – Challenges for the universities of the future", Herman Van Goethem. Flemish rectors addressed research integrity in their opening speeches of the academic year 2018-2019.

The themes of open access and quality measurement of research are addressed: https://www.uantwerpen.be/images/uantwerpen/container1957/files/auha_rede_pagesEN_dig_180921(1).pdf

Integrity and trust: absolutely vital for a university Luc Sels

https://www.kuleuven.be/communicatie/congresbureau/opening-academiejaar/english/speeches/speech-by-rector-luc-sels

Some descriptions of science already point to its biggest dangers

Wikipedia definition: "Science is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe."

"The history of the sciences is a great fugue in which the voices of the peoples emerge one after the other." – J. W. Goethe, in "Wilhelm Meisters Wanderjahre"

“Science does not purvey absolute truth, science is a mechanism. It’s a way of trying to improve your knowledge of nature, it’s a system for testing your thoughts against the universe and seeing whether they match.” Isaac Asimov (writer and biochemist)

"Science alone of all the subjects contains within itself the lesson of the danger of belief in the infallibility of the greatest teachers of the preceding generation." Richard Feynman (physicist and Nobel Prize winner), talk given in 1966 for the National Science Teachers Association, US

ALLEA CODE OF CONDUCT FOR RESEARCH INTEGRITY

Scientists have a moral responsibility

Research practices are based on very fundamental principles

The European Code of Conduct for Research Integrity, ALLEA, 2017 [1] A code of conduct… ?

When are the fundamental principles violated?

Violations of Scientific Integrity

(1) FFP (fabrication, falsification, plagiarism)

The European Code of Conduct for Research Integrity, ALLEA, 2017 [1]

Violations of Scientific Integrity (2): the grey zone – sloppy science

Violations of Scientific Integrity – Examples of fabrication and falsification

This case drew attention to the possibility of scientific fraud for the first time.

Dr. John Darsee [2,3]
- Brilliant student and medical researcher (cardiology)
- 1979: becomes a research fellow in the Cardiac Research Laboratory at Harvard University
- He performs so well that his direct boss gives him a tenured position in 1981
- BUT: in that year, doubts start to arise about his research results
- Further investigations reveal that Darsee invented and falsified data
- 1983: several of his scientific papers are withdrawn from the literature and Darsee receives a disciplinary penalty (essentially losing the right to practise as a medical doctor)

- Darsee names stress, overwork and ambition as reasons for the fabrication and falsification [3]
- Co-authors knew little about the scientific work [2]
- Darsee's boss (E. Braunwald) was criticized because he had too little interaction with his senior researcher and even created a stressful atmosphere in the lab, urging everyone to get things done quickly [2,3]
- Even years after the discovery of the fraud, Darsee's work is still being cited [3]

Was it the first case? Of course not, another example

Robert A. Millikan [4]

Nobel Prize in 1923 for the measurement of the charge of the electron via an oil-drop experiment. In his "seminal" paper of 1913 [5], Millikan writes:

Discovery of his lab books: he performed 140 experiments and kept only those that agreed with his hypothesis ("publish this surely/beautiful!!" versus "error high, will not use")


There was a warning sign in his first paper [6], where he described a first attempt with a set-up that was conceptually less well designed

He actually says here that he is in the habit of discarding observations that do not match his expectations [7]


Felix Ehrenhaft (Austrian physicist)
- in competition with Millikan (they criticized each other's experiments)
- had done similar experiments and found a large spread in the results -> his results seemed less reliable than those of Millikan [4,7,8]

Millikan was so convinced of being right that he did some "cherry picking" in his 1913 paper -> the scientific community marvels at the results and he receives the Nobel Prize [4]. Moreover, Millikan's scientific intuition was correct.

Is science harmed here?


It was a case of falsification, and yet a lot of discussion is still ongoing. Some argue that Millikan was merely discarding "failed" experiments; others see it as fraud.

Modern (educational) physics textbooks largely ignore this fraud discussion and the Millikan-Ehrenhaft controversy [8]


There was more: Harvey Fletcher did his PhD with Millikan, was essential in conceiving the idea of using an oil-drop method, and designed the set-up. Even though he was a co-author on later papers, Millikan was the sole author of the first paper (securing him the Nobel Prize).

Harvey Fletcher [9]:

Loyalty can play an important role in sustaining fraud. It is not always easy to judge: personal conflicts may play a role in alleged research-misconduct cases.

1821 – the time of Faraday's apprenticeship with Davy

Faraday makes a current-carrying wire rotate around a magnet and publishes it in the Quarterly Journal of Science

William Hyde Wollaston accuses Faraday (the apprentice) of stealing his ideas (Wollaston had done some unsuccessful experiments at the Royal Institution when visiting Davy)

If proven right, the accusation would have ended Faraday's career (ungentlemanly behaviour was not tolerated)

BACKGROUND OF THE CONFLICT: William Hyde Wollaston (aristocrat). How appropriate was it for a lab assistant to presume to carry out experiments in a field already "owned" by other, more important figures? Could the assistant's work even be considered to belong to him rather than to the master?

I. Rhys Morus, 'Michael Faraday and the electrical century', Icon Books Ltd., London (2017)

Dr. Diederik Stapel [10]
• Until 2011: full professor of social psychology at Tilburg University
• Won several scientific prizes
• Attracted a lot of media attention ("pop-star researcher"), e.g. with claims that meat eaters are less social and that people become more rude when they think of meat
• 2011: three young researchers from his group report their suspicion of fraud to the head of department
• Stapel turns out to have frequently fabricated data. He gave his students Excel files with supposed survey data, but he had fabricated the data himself. Many papers were subsequently retracted.
• The New York Times called him "The biggest con man in academic science"

• This had implications for his co-workers, who had drawn scientific conclusions on the basis of the data he provided -> all their work was retracted
• He gives ambition as a reason, and the fact that some of the experimental data were not clear enough, while it was obvious to him what the true answer should be

Dr. Diederik Stapel [11]

Examples of fabrication and falsification – closer to home

https://retractionwatch.com

This researcher worked both at the Leiden University Medical Centre and at UAntwerpen

After the discovery of the fraud, she was fired by both institutes

(2013)

Again, work stress and disappointing results are given by the researcher as reasons for the fraud.

EOS survey (2013) – Science fraud: the hard figures (R. Verbeke)

https://studylib.net/doc/8650857/science-fraud--the-hard-figures

Nature survey (2016) – Reproducibility of science [12]

Questions answered by 1576 researchers


Trust in published research is very low in the scientific community.

Why scientific misconduct?

Sociology of science according to Robert Merton [13]

Merton described the 4 norms of science: “universalism, communism, disinterestedness, organized skepticism”

Extract from [13]:

-> personal ambition can play an important role (see Stapel)

Why scientific misconduct?

There are many roads that lead to misconduct – Comment in Nature in 2018 [14]

(figure, © Nature: among the roads shown is work load)

Something to discuss during the break...

[15]

True or false??

Why scientific misconduct?

The neoliberal research evaluation system [4,16-19]

The impact factor was introduced as a tool for librarians to make an optimal choice, given a limited budget, of which journals to buy.

Science is captured and evaluated in numbers:

h index, number of papers, impact factors of papers, number of citations, number of finished PhDs, amount of research money, number of projects, …
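Part of the appeal of such indicators is how trivially they can be computed. As a minimal sketch (with invented citation counts, not data from the lecture):

```python
def h_index(citations):
    """h = the largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Hypothetical researcher with 5 papers:
h_index([10, 8, 5, 4, 3])  # -> 4 (four papers have at least 4 citations)
```

That a whole research oeuvre can be collapsed into one such number is exactly the problem critiqued in [19].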

Web of Science: example of Kurt Wüthrich (Nobel Prize winner)

An array of bibliometric tools !!

Why scientific misconduct?


These metrics are used in all evaluations: to assign (governmental) research money (grants), and in promotions and hiring of researchers, …

From [17]: science is a competitive business

Where and how much you publish has a direct influence on what you can realize in your research. Bad metrics can end your scientific career!

PUBLISH OR PERISH -> this is bound to lead to misconduct


The dangers of relying on bibliometrics instead of content

Impact factor 15.27 – cited 271 times

Diabetes Care, 17, 152 (1994)

The trapezoidal rule, known to secondary-school children !!
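The rule the paper "rediscovered" takes only a few lines of code. The sketch below uses an invented glucose-tolerance-style curve (not data from the paper):

```python
def trapezoid_area(x, y):
    """Approximate the area under sampled points (x[i], y[i]) by trapezoids."""
    return sum((x[i + 1] - x[i]) * (y[i] + y[i + 1]) / 2
               for i in range(len(x) - 1))

# Invented glucose curve: times in minutes, concentrations in arbitrary units
times = [0, 30, 60, 90, 120]
glucose = [5.0, 8.5, 7.2, 6.1, 5.4]
area = trapezoid_area(times, glucose)  # -> 810.0
```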


Relying on bibliometrics means that administration rules. It can lead to tremendous frustration.

Eric Boodman, 2018

https://www.statnews.com/2018/10/26/typeface-glitch-dooms-va-ptsd-funding/

What is, according to you, a correct way to prepare a scientific publication?

Scientific publishing – how it should be (and often is)
• Authors A, B and C perform research on a certain problem; they check each other's data and discuss and interpret the data together.
• Together they write up their findings in a paper (and they discuss extensively how they will bring the message across).
• The data are open to the 3 authors and none of them hides any information or data.
• The authors refer to the relevant papers of others (and themselves).
• A, B and C will of course be co-authors.
• The manuscript is sent to a scientific journal.
• A, B and C do not try to influence the editor at any moment.
• The editor sends the manuscript to reviewers D and E. Both read and evaluate the paper and decide whether it can be published (possibly asking for revisions). Their decision is purely based on scientific arguments.
• D and E mention all possible conflicts of interest to the editor.
• D and E are honest in their evaluation; friendship or competition with the authors, or their own interests, should not interfere with their evaluations.
• At revision, the process repeats itself.

Bad influence of the neoliberal evaluation system on the publication behaviour

What sometimes happens … (a couple of possibilities)

"Guest authorship"

Some authors are added without a significant contribution, often without even reading the paper, because

- they lead the group (they are research managers rather than researchers)
- of strategic considerations: important for a next career step
- they pay for it [20]

The fight for the order in the author list (discipline-dependent)

Chemistry and physics:
First author -> has done most of the lab work and/or theoretical work; shared first-authorship is possible.
Corresponding/last author -> the person who led the research, the senior researcher.
-> sometimes people cheat: changing the order or leaving people out!

Recent views [21]

Some other common practices:

• Deals with colleagues to cite each other's papers even when it is not necessary

• Self-citations that are not necessary

• Salami slicing and self-plagiarism [22]

• Leaving out data (think of Millikan)

• Data falsification/fabrication (think of Stapel)

(© https://www.enago.com/academy/perspective-how-to-share-opinion-on-research-articles/)

• Reviewers blocking a paper because they have a similar study running and want to be first

• Reviewers/editors demanding that their own papers, or papers of the journal, are cited …

All scientific journals have a code of conduct for authors and reviewers.

Publishers exploit this evaluation system (universities have to pay to read the work that their own researchers wrote, edited and reviewed): papers behind a pay-wall

Predatory journals: journals asking the researcher to pay in order to get a paper accepted. Often no proper reviewing, just fast money (for the publisher, of course).


POSSIBLE SOLUTION: funding agencies and governments demand open-access publications [23]

Unfortunately this also catalyzes the growth of predatory journals

See also the OPEN SCIENCE LECTURE of F. Verbruggen

Bias thinking – it happens to us all

Standard example of bias thinking

Cognitive psychology [24]

What can lead to bias thinking?

- Popularity: the tendency to do (or believe) what many others do or believe
- Belief bias: evaluating the logic of an argument based on the believability of its conclusion
- Clustering illusion: the human tendency to see patterns everywhere, even when they are not there
- Confirmation bias: the tendency to search for or interpret information such that it confirms one's own hypotheses
- Framing effect: the conclusion depends on how the information is presented

These mistakes are systematic, predictable, widespread and hard to get rid of

This is ascribed to evolutionary development: our natural survival reflexes require fast thinking -> fast conclusions lead to errors

Examples of bias-thinking in scientific experiments

- 1989: cold fusion in an electrochemical cell: clear signals that something was wrong in the interpretation were ignored (e.g. the observation that the spectrum of the "gamma radiation" was incorrect was attributed to a wrong calibration of the detector)
- 2010: a bacterium can grow by using arsenic instead of phosphorus [25]: the scientists were so convinced that they had a breakthrough, both an important step in understanding the origin of life and a key point in the discussion about extraterrestrial life, that they started to be sloppy

Possible solutions:
• Think slowly (leave time for alternative ideas)
• Instead of looking for arguments to support your ideas/hypotheses, try to find arguments that contradict them
• Fix the evaluation criteria beforehand
• Have independent interpretations made by different people
• Be aware that you may be subject to bias thinking

Bias thinking is reinforced by our publication culture and urge to report positive results

"To advance my career I need to get published as frequently as possible in the highest-profile publications as possible. That means I must produce articles that are more likely to get published. These are ones that report positive results ("I have discovered …", not "I have disproved …"), original results (never "We confirm previous findings that …"), and clean results ("We show that …", not "It is not clear how to interpret these results"). But most of what happens in the lab doesn't look like that — instead, it's mush. How do I get from mush to beautiful results? I could be patient, or get lucky — or I could take the easiest way, making often unconscious decisions about which data I select and how I analyze them, so that a clean story emerges. But in that case, I am sure to be biased in my reasoning." Brian Nosek (Prof. in Psychology, Univ. of Virginia) (http://nautil.us/issue/54/the-unspoken/the-trouble-with-scientists-rp)

Bias thinking is reinforced by our publication culture and urge to report positive results

We think it is hard to publish reproduced results, but is this really so?

Nature survey (2016) – Reproducibility of science [12]. Responses of 1576 researchers.

Wrong use of statistics can reinforce bias thinking. Correct use of statistics can detect bias thinking or fraud.

Important: the right choice of a statistical test !!

Analysis of the statistical significance can indicate scientific misconduct (table taken from [26])


p-hacking [27-29]

p-value = a measure of whether a result is statistically significant. If p < 0.05, the result is conventionally called significant.
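How p-hacking inflates false positives can be illustrated with a small simulation (my own sketch, not from the lecture; all numbers are invented). Data are drawn with no real effect, and the "p-hacker" peeks at a z-test after every batch of measurements, stopping as soon as p < 0.05 (optional stopping):

```python
import math
import random

def z_test_p(sample):
    """Two-sided p-value for H0: mean = 0, with known sigma = 1 (z-test)."""
    z = sum(sample) / math.sqrt(len(sample))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def experiment(peek):
    """One experiment where H0 is true; returns True if it ends 'significant'."""
    data = []
    for _ in range(10):                        # up to 10 batches of 10 points
        data += [random.gauss(0, 1) for _ in range(10)]
        if peek and z_test_p(data) < 0.05:     # optional stopping = p-hacking
            return True
    return z_test_p(data) < 0.05

random.seed(1)
runs = 2000
honest = sum(experiment(peek=False) for _ in range(runs)) / runs
hacked = sum(experiment(peek=True) for _ in range(runs)) / runs
# honest stays near the nominal 5 %; hacked comes out several times larger
```

No data are fabricated anywhere in this simulation; merely deciding *when* to stop based on the p-value is enough to inflate the false-positive rate well above the nominal 5 %.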

The presented data consist of p-values reported in papers from 3 psychology journals (figure taken from [28]); the same pattern is found in other science domains [27,29].

Effect of funders

The effect can be positive: funders can give incentives towards research integrity (e.g. open science [30])

BUT: Science is not always independent

Implicit demands of funders (companies, governments, army, …) can be very bad

© James MacLeod, Evansville

(this is the border between research integrity and research ethics)

Difference between research integrity and research ethics?

Example of what a commission will consider as a violation of research integrity, but not (so much) of research ethics:

The choice of the trial participants and the outline of the research protocol are ethically responsible, but the researcher has manipulated the data.

Example of what a commission will consider as a violation of research ethics, but not of research integrity:

The researcher makes no scientific errors in the collection and analysis of the data, no data falsification or fabrication occurs, and the work is original, but the research involves gathering data from participants who did not give their consent for the use of these data.

-> THE BORDER BETWEEN THE TWO IS VERY THIN AND USUALLY SEMANTIC

Science and the media

Just like in Faraday’s times, scientists are urged to communicate to the general public

We should think very carefully about how we communicate to the media -> too often we simplify to the point where the message becomes incorrect


Additional problem: the message is sometimes changed by the media

An example of what happened to colleagues in another ITN network here at UAntwerp during communication at the start of the project

What do the scientists say? The aim of the research that just started is to improve the current state of the MRI technique (B-Q-MINDED, H2020 MSCA ITN)

What does the media say?

New technique gives better results (VRT nieuws, Flemish television)


[31]

-> Very rarely are the mistakes corrected in the newspaper

-> we worry about fake news in politics, but don't bother to correct the fake news in our own fields

What if you become the victim of a violation of scientific integrity, or you notice scientific misconduct?

Look up the commission in your own institute that deals with scientific misconduct and contact them with your question. Very often it works through a contact person who is bound to secrecy, so you can also ask for advice if you are unsure, without risking wrongly accusing a colleague.

A formal complaint about scientific misconduct has to be filed at the university where the accused researcher works/worked. If that is not your own university, your home institute can help to file the complaint (and may even do it for you).

If the complaint is about published data, the publisher/editor should be made aware of potential problems with a paper. All (bona fide) journals have clear procedures for this.

Many countries/regions have a commission for second advice that can be addressed if you feel the local commission has not handled your case appropriately (e.g. LOWI (the Netherlands), VCWI (Flanders), …)

Is scientific integrity an issue on a European policy level?

Science Europe

Around 2013, awareness of this issue kicked in

H2020 demands that all fellows and grant holders work with scientific integrity

TAKE-HOME MESSAGE

It is fantastic to work as a scientist, but be aware of all the ethical and integrity pitfalls when conducting science.

Food for thought:

The film 'On being a scientist', developed at Leiden University

https://www.youtube.com/watch?v=tCgZSjoxF7c

Dilemma games – developed at the Erasmus University Rotterdam (see this afternoon): https://www.eur.nl/en/about-eur/policy-and-finance/integrity-code/dilemma-game

References

1. ALLEA, The European Code of Conduct for Research Integrity: https://ec.europa.eu/research/participants/data/ref/h2020/other/hi/h2020-ethics_code-of-conduct_en.pdf
2. Barbara J. Culliton, "Coping with Fraud: The Darsee Case", Science, 220, 31-35 (1983)
3. Carol A. Kochan, John M. Budd, "The persistence of fraud in the literature – The Darsee case", J. Am. Soc. Inform. Sci., 43, 488-493 (1992)
4. David Koepsell, "Scientific Integrity and Research Ethics – An approach from the ethos of science", Springer International Publishing (2017)
5. R. A. Millikan, "On the elementary electrical charge and the Avogadro constant", Phys. Rev., 2, 109-143 (1913)
6. R. A. Millikan, "A new modification of the cloud method of determining the elementary electrical charge and the most probable value of that charge", Phil. Mag., 19, 209 (1910)
7. G. Holton, "Presuppositions, and the Millikan-Ehrenhaft Dispute", Hist. Stud. Phys. Sci., 9, 161-224 (1978)
8. M. A. Rodriguez, M. Niaz, "The oil drop experiment: An illustration of scientific research methodology and its implications for physics textbooks", Instruct. Sci., 32, 357-386 (2004)
9. H. Fletcher, "My work with Millikan on the oil-drop experiment", Physics Today, 43-47 (1982)
10. Frank van Kolfschooten, "Ontspoorde wetenschap. Over fraude, plagiaat en academische mores" [Derailed science. On fraud, plagiarism and academic mores], De Kring (2012)
11. D. M. Markowitz, J. T. Hancock, "Linguistic traces of a scientific fraud: the case of Diederik Stapel", PLoS ONE, 9(8), e105937 (2014)
12. M. Baker, "Is there a reproducibility crisis?", Nature, 533, 452 (2016)
13. Peter Godfrey-Smith, "Theory and reality – an introduction to philosophy of science", University of Chicago Press, chapter 8 (p. 122)
14. C. K. Gunsalus, A. D. Robinson, "Nine pitfalls of research misconduct", Nature, 557, 297-299 (2018)
15. F. C. Fang, J. W. Bennett, A. Casadevall, "Males are overrepresented among life science researchers committing scientific misconduct", mBio, 4, e00640-12 (2013)
16. A. Molinié, G. Bodenhausen, "Bibliometrics as weapons of mass citation", Chimia, 64, 78 (2010)
17. A. Molinié, G. Bodenhausen, "Sisyphus desperately seeking publisher", J. Biosci., 43, 9 (2018)
18. P. Verhaeghe, "Identiteit" [Identity], De Bezige Bij (2012)
19. G. Kreiner, "The slavery of the h-index – measuring the unmeasurable", Front. Hum. Neurosci., 10, 556 (2016)
20. M. Hvistendahl, "China's publication bazaar", Science, 342, 1035 (2013)
21. G. L. Kiser, "No more first authors, no more last authors", Nature, 561, 435 (2018)
22. S. P. J. M. Horbach, W. Halffman, "The extent and causes of academic text recycling or 'self-plagiarism'", Research Policy (2017), http://dx.doi.org/10.1016/j.respol.2017.09.004
23. R. Schimmer, K. K. Geschuhn, A. Vogler, "Disrupting the subscription journals' business model for the necessary large-scale transformation to open access" (2015), doi:10.17617/1.3
24. M. Battersby, S. Bailin, "Critical thinking and cognitive biases", OSSA Conference Archive, 16 (2013)
25. F. Wolfe-Simon, J. Switzer Blum, T. R. Kulp, G. W. Gordon, S. E. Hoeft, J. Pett-Ridge, J. F. Stolz, S. M. Webb, P. C. W. Davies, A. D. Anbar, R. S. Oremland, "A bacterium that can grow by using arsenic instead of phosphorus", Science, 332, 1163 (2011)
26. M. S. Thiese, S. Walker, J. Lindsey, "Truths, lies, and statistics", J. Thorac. Dis., 9, 4117 (2017)
27. M. L. Head, L. Holman, R. Lanfear, A. T. Kahn, M. D. Jennions, "The extent and consequences of p-hacking in science", PLoS Biology, 13, e1002106 (2015), doi:10.1371/journal.pbio.1002106
28. E. J. Masicampo, D. R. Lalande, "A peculiar prevalence of p values just below .05", Quarterly Journal of Experimental Psychology, 65, 2271 (2012)
29. L. Holman, M. L. Head, R. Lanfear, M. D. Jennions, "Evidence of experimental bias in the life sciences: why we need blind data recording", PLoS Biology, 13, e1002190 (2015), doi:10.1371/journal.pbio.1002190
30. "Open Science and its role in universities: A roadmap for cultural change", League of European Research Universities (2018), https://www.leru.org/files/LERU-AP24-Open-Science-full-paper.pdf
31. F. Gonon, J.-P. Konsman, D. Cohen, T. Boraud, "Why most biomedical findings echoed by newspapers turn out to be false: The case of attention deficit hyperactivity disorder", PLoS ONE, 7, e44275 (2012)