Do “Brain-Training” Programs Work?
Psychological Science in the Public Interest, 2016, Vol. 17(3), 103–186
© The Author(s) 2016
Reprints and permissions: sagepub.com/journalsPermissions.nav
DOI: 10.1177/1529100616661983
pspi.sagepub.com

Daniel J. Simons (1), Walter R. Boot (2), Neil Charness (2,3), Susan E. Gathercole (4,5), Christopher F. Chabris (6,7), David Z. Hambrick (8), and Elizabeth A. L. Stine-Morrow (9,10)

(1) Department of Psychology, University of Illinois at Urbana-Champaign; (2) Department of Psychology, Florida State University; (3) Institute for Successful Longevity, Florida State University; (4) Medical Research Council Cognition and Brain Sciences Unit, Cambridge, UK; (5) School of Clinical Medicine, University of Cambridge; (6) Department of Psychology, Union College; (7) Geisinger Health System, Danville, PA; (8) Department of Psychology, Michigan State University; (9) Department of Educational Psychology, University of Illinois at Urbana-Champaign; (10) Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign

Summary

In 2014, two groups of scientists published open letters on the efficacy of brain-training interventions, or “brain games,” for improving cognition. The first letter, a consensus statement from an international group of more than 70 scientists, claimed that brain games do not provide a scientifically grounded way to improve cognitive functioning or to stave off cognitive decline. Several months later, an international group of 133 scientists and practitioners countered that the literature is replete with demonstrations of the benefits of brain training for a wide variety of cognitive and everyday activities. How could two teams of scientists examine the same literature and come to conflicting “consensus” views about the effectiveness of brain training? In part, the disagreement might result from different standards used when evaluating the evidence. To date, the field has lacked a comprehensive review of the brain-training literature, one that examines both the quantity and the quality of the evidence according to a well-defined set of best practices. This article provides such a review, focusing exclusively on the use of cognitive tasks or games as a means to enhance performance on other tasks. We specify and justify a set of best practices for such brain-training interventions and then use those standards to evaluate all of the published peer-reviewed intervention studies cited on the websites of leading brain-training companies listed on Cognitive Training Data (www.cognitivetrainingdata.org), the site hosting the open letter from brain-training proponents. These citations presumably represent the evidence that best supports the claims of effectiveness. Based on this examination, we find extensive evidence that brain-training interventions improve performance on the trained tasks, less evidence that such interventions improve performance on closely related tasks, and little evidence that training enhances performance on distantly related tasks or that training improves everyday cognitive performance. We also find that many of the published intervention studies had major shortcomings in design or analysis that preclude definitive conclusions about the efficacy of training, and that none of the cited studies conformed to all of the best practices we identify as essential to drawing clear conclusions about the benefits of brain training for everyday activities.
We conclude with detailed recommendations for scientists, funding agencies, and policymakers that, if adopted, would lead to better evidence regarding the efficacy of brain-training interventions.

Keywords: brain training, cognitive training, learning, transfer, cognitive, skill

Corresponding Author: Daniel J. Simons, Department of Psychology, University of Illinois, 603 E. Daniel St., Champaign, IL 61820; E-mail: [email protected]

Introduction

Spend a few minutes listening to public radio, surfing the Internet, or reading magazines, and you will be bombarded with advertisements touting the power of brain training to improve your life. Lumosity converts basic cognitive tasks into games and has noted in an introductory video that “every game targets an ability important to you, like memory, attention, problem-solving, and more” (“Learn How Lumosity Works” video previously hosted at www.lumosity.com: “Cutting Edge Science Personalized for You,” 2015). Posit Science teamed up with the AARP (formerly the American Association of Retired Persons) to offer a version of its BrainHQ software as part of a “Staying Sharp” membership (http://www.aarp.org/ws/miv/staying-sharp/). Cogmed markets its working-memory training program to schools and therapists, claiming that it “will help you academically, socially, and professionally” by “allowing you to focus and resist distractions better” (“How Is Cogmed Different,” 2015). And CogniFit has promised to “add useful cognitive training programs for your daily life” (“Improve Your Brain While Having Fun,” 2015).

Such statements are standard fare in the marketing materials of brain-training companies, and most back their claims by appealing to the expertise of their founders and/or by citing supporting published research. The aforementioned video emphasizes that Lumosity is “based on neuroscience research from top universities around the world,” and elsewhere on the website the company provides a bibliography of 46 papers, posters, and conference presentations from its Human Cognition Project (www.lumosity.com/hcp/research/bibliography). Posit Science’s website notes that BrainHQ was “built and tested by an international team of top neuroscientists and other brain experts” and has claimed real benefits shown in “more than 70 published papers” (“Brain Training That Works,” 2015), stating that “no other program has this level of proof.” Cogmed, too, notes that its program was “developed by leading neuroscientists” and claims that “no other brain-training product or attention-training method can show this degree of research validation” (“Frequently Asked Questions,” 2015). CogniFit has promised “fun addictive games designed by neuroscientists” (“Improve Your Brain While Having Fun,” 2015).

But does the published research support the claim that such brain-training interventions, or “brain games,” improve real-world performance on tasks that matter in our academic, personal, or professional lives? In October 2014, the Stanford Center on Longevity and the Max Planck Institute for Human Development issued an open letter, signed by an international group of more than 70 psychologists and neuroscientists, that “[objected] to the claim that brain games offer consumers a scientifically grounded avenue to reduce or reverse cognitive decline when there is no compelling scientific evidence to date that they do” (“A Consensus on the Brain Training Industry From the Scientific Community,” 2014).

Then, in December 2014, a group of 133 scientists and therapists countered with their own open letter on a website called Cognitive Training Data (www.cognitivetrainingdata.org/), claiming that “a substantial and growing body of evidence shows that certain cognitive-training regimens can significantly improve cognitive function, including in ways that generalize to everyday life.” Like the Stanford/Max Planck letter, the response letter concurred that “claims promoting brain games are frequently exaggerated, and are often misleading,” but it argued that the literature is replete with “dozens of randomized controlled trials published in peer-reviewed journals that document specific benefits of defined types of cognitive training.” The signatories argued that

    many of these studies show improvements that encompass a broad array of cognitive and everyday activities, show gains that persist for a reasonable amount of time, document positive changes in real-life indices of cognitive health, and employ control strategies designed to account for “placebo” effects. (para. 7)

In January 2016, the U.S. Federal Trade Commission (FTC; 2016a) announced that it had charged Lumos Labs with “deceptive advertising” regarding some of the claims the company had made about Lumosity’s efficacy and simultaneously announced that the company had agreed to settle the government’s $50 million judgment against it by paying a $2 million fine (reduced because of financial hardship) and agreeing to change some of its sales and marketing practices. “Lumosity preyed on consumers’ fears about age-related cognitive decline, suggesting their games could stave off memory loss, dementia, and even Alzheimer’s disease. But Lumosity simply did not have the science to back up its ads,” an FTC official noted (Federal Trade Commission, 2016a). Speaking to NBC Nightly News, a staff lawyer for the FTC added, “There just isn’t evidence that any of that [using Lumosity] will translate into any benefits in a real-world setting” (“Lumosity to Pay $2M,” 2016). The government and Lumos Labs agreed that any future claims of Lumosity’s efficacy would have to be backed by “competent and reliable scientific evidence” (Federal Trade Commission, 2016a). The settlement specified that with respect to “performance at school, at work, and in athletics; delaying age-related decline; and reducing cognitive impairment,” this standard would require tests that are “randomized, adequately controlled, and blinded to the maximum extent practicable.” For other claims, the FTC required evidence