
IMPROVING WEB-PUSH RESPONDENT COMMUNICATION

IN THE AMERICAN COMMUNITY SURVEY

By

JONATHAN PATRICK SCHREINER

A dissertation submitted in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

WASHINGTON STATE UNIVERSITY

Department of Sociology

DECEMBER 2019

© Copyright by JONATHAN PATRICK SCHREINER, 2019 All Rights Reserved


To the Faculty of Washington State University:

The members of the Committee appointed to examine the dissertation of JONATHAN PATRICK SCHREINER find it satisfactory and recommend that it be accepted.

Don A. Dillman, Ph.D., Chair

Erik Johnson, Ph.D.

Carmen Lugo-Lugo, Ph.D.

ACKNOWLEDGEMENT

This dissertation would have been impossible without the support of my mentor and chair, Don Dillman. Don's contribution to the field cannot be quantified.

His life's work forms the foundation of my dissertation and my work at the U.S. Census Bureau.

Far greater than Don's academic contribution to this dissertation is the contribution he made through his unwavering support of me and my scholarship. I cannot put into words the gratitude I have for Don for sticking with me this past decade. Through three dissertation topics, two careers, and multiple moves across the state and across the country, Don's support was constant. He saw in me something I did not see in myself, and I can never thank him enough.

I would also like to thank my committee, Erik Johnson and Carmen Lugo-Lugo, for their support through multiple projects and courses on the way to this dissertation. This dissertation would also not be possible without contributions from my colleagues at the U.S. Census Bureau.

No researcher exists on an island. This work was supported by an amazing team of dedicated collaborators, researchers, and public servants. I specifically would like to thank Broderick

Oliver, Sarah Heimel, Eli Poehler, Dottie Barth, and Dave Tuttle for their contributions to this dissertation and their continued work to improve Census Bureau survey communications.

Lastly, I would like to thank my family. This dissertation is dedicated to my parents,

Roger and Grace, for being the best role models a kid could ever have, and to Amanda Miller, my best friend, for her support throughout graduate school and this dissertation.

IMPROVING WEB-PUSH RESPONDENT COMMUNICATION

IN THE AMERICAN COMMUNITY SURVEY

Abstract

by Jonathan Patrick Schreiner, Ph.D.

Washington State University

December 2019

Chair: Don A. Dillman

The purpose of this dissertation is to identify ways to improve the effectiveness of the communication procedures used by the U.S. Bureau of the Census to elicit self-responses to the

American Community Survey (ACS). The ACS, which seeks responses from 3.5 million randomly sampled housing unit addresses each year, serves as the main source of information for comparing the characteristics of people across cities, states, and regions of the United States.

Although overall response rates to the ACS have not declined dramatically in recent years, as has been the case for many surveys, it now takes more effort at greater cost to get a sampled housing unit address to self-respond. In this dissertation, I note the shortcomings in the current process used by the Census Bureau to innovate the ACS methodology to gain self-response. I then review existing theories of human response behavior and utilize them to conduct a content analysis of five ACS mail communications. Based upon this analysis, and recommendations derived from relevant literatures, I propose a revised sequence of communications for future testing against current communication efforts. It is hypothesized that use of this comprehensive redesign will improve the speed by which responses are obtained and lower costs by reducing the number of in-person follow-up

interviews as well as increase participation from reluctant respondents, reducing nonresponse bias. These recommendations will apply directly to future ACS testing, but will also be applicable to government and non-government surveys that utilize a similar multi-mail-contact web-push methodology.

TABLE OF CONTENTS

Page

ACKNOWLEDGEMENT ...... iii

ABSTRACT ...... iv

LIST OF TABLES ...... ix

LIST OF FIGURES ...... x

Chapter I. Improving Communications to Households asked to answer the American Community Survey ...... 1

Purpose...... 1

What is the American Community Survey and why is it Important? ...... 2

Challenges facing the American Community Survey ...... 4

Limitations of previous ACS communications research ...... 7

Improved messaging as a means of improving ACS response rates ...... 11

Why is Sociology Relevant to Improving ACS Methodology? ...... 15

Conclusion and Overview of remaining chapters ...... 17

Chapter II. Understanding survey response ...... 20

How research on messaging can help reduce survey errors ...... 20

Addressing Survey Response through Messaging...... 22

Survey Mode ...... 23

Sponsorship ...... 25

Burden of the response Task (task-burden or respondent burden) ...... 27

Incentives ...... 30

Structuring Request to Respond ...... 33

Communication Content ...... 36

Attributes of Potential Respondents ...... 38

Conclusion: Why focus on messaging? ...... 40

Chapter III. The current ACS mail contact methodology ...... 42

Current ACS Methodology ...... 42

Mailing 1 ...... 44

Mailing 2 ...... 49

Mailing 3 ...... 52

Mailing 4 ...... 58

Mailing 5 ...... 59

Overview of ACS research and innovations that impact messaging ...... 63

Early Years of ACS Testing ...... 64

Recent research, innovation and potential problems ...... 72

Conclusion: Problems with recent innovation to ACS mail communications ...... 93

Chapter IV. Developing recommendations for the ACS messaging ...... 97

Understanding types of respondents from Census Research ...... 97

Identifying useful messaging concepts from general theories of behavior...... 107

What is communication? ...... 108

Theories on human action and behavior ...... 111

Theories about why people respond to surveys ...... 117

Leverage-Saliency Theory...... 117

Social Exchange Theory ...... 121

Applying Results from Census Bureau Sponsored Messaging Research ...... 128


Expert Review of ACS messaging ...... 138

Building tentative messaging recommendations from theory and research ...... 141

Establish credibility and trust in the first mailing ...... 142

Clearly connect the ACS to the Census Bureau as the survey sponsor ...... 144

Simply state confidentiality and data security ...... 145

Personalize the survey request ...... 146

Stage messaging across mailings...... 148

Frame ACS participation as a community benefit ...... 149

Leverage audience-based insights across mailings ...... 150

Leverage behavior insights to communicate personal benefits ...... 152

Reduce burden by presenting clear, non-complicated instructions and deadlines 154

Highlight that participation is mandatory in a respectful tone ...... 155

Keep it simple: Plain language, simplistic but eye-catching design ...... 155

Conclusion ...... 156

Chapter V. Analysis of ACS mail communication materials ...... 158

Methodology ...... 158

Developing the ACS codebook and coding ACS materials ...... 160

Additional Analysis ...... 168

Results of the content analysis ...... 170

Messaging may be overwhelming ...... 170

Messaging is repetitious...... 177

Messaging lacks strategic purpose ...... 189

Missed opportunities ...... 199


Sponsor Information is Not Communicated Consistently ...... 203

Graphic and format inconsistencies ...... 218

Conclusion ...... 223

Chapter VI. Proposing a new implementation strategy for the American Community Survey 227

The First Mailing ...... 230

The Second Mailing ...... 244

Third Mailing ...... 256

Fourth Mailing ...... 270

Fifth mailing ...... 272

Sixth mailing ...... 277

Summary and Conclusion ...... 280

REFERENCES ...... 293


LIST OF TABLES

Table 1. How factors of survey response intersect with messaging ...... 40

Table 2. September 2018 ACS Mail Contact Strategy (English) ...... 62

Table 3. Percent of participants who stated a message would increase their likelihood of completing the decennial census ...... 129

Table 4. Results of the messages in the Benchmark Messaging Survey ...... 131

Table 5. Results of ACS Messaging Refinement Survey ...... 133

Table 6. Percentage more likely to complete the ACS in the Daily Tracking Survey ...... 135

Table 7. Comparing five messages on privacy and confidentiality ...... 137

Table 8. Content analysis codebook ...... 163

Table 9. IRR results from coding ACS mail communication materials ...... 167

Table 10. Flesch reading ease score and Flesch-Kincaid grade level ...... 169

Table 11. Number of messages in each mailing ...... 171

Table 12. Frequency of messages in Mailing 1 ...... 173

Table 13. Number of messages and codes in each mailing item ...... 174

Table 14. Flesch reading ease scores of ACS mail materials...... 176

Table 15. Repetition across letters and postcard ...... 179

Table 16. Word count and repetition rate of legal obligation, confidentiality, and data security statements ...... 180

Table 17. Count of the assigned codes by messaging category ...... 204

Table 18 . Application of sponsorship in ACS mail materials ...... 208

Table 19. Addresses used on ACS mail materials ...... 215

Table 20. Mail strategy comparison: ACS production vs proposed redesign ...... 230

LIST OF FIGURES

Figure 1. Current ACS production mailout strategy...... 44

Figure 2. Mailing 1: Outgoing envelope ...... 45

Figure 3. Mailing 1: Instruction card (front) ...... 46

Figure 4. Mailing 1: Instruction card (back) ...... 46

Figure 5. Mailing 1: Invitation letter ...... 47

Figure 6. Mailing 1: Multilingual brochure ...... 48

Figure 7. Mailing 1: Frequently Asked Questions (FAQ) brochure ...... 49

Figure 8. Mailing 2: Bi-fold pressure seal mailer (outside, without fold lines) ...... 50

Figure 9. Mailing 2: Bi-fold pressure seal mailer (inside, without fold lines) ...... 51

Figure 10. Mailing 3: Outgoing envelope ...... 52

Figure 11. Mailing 3: Instruction card (front and back) ...... 53

Figure 12. Mailing 3: Letter ...... 54

Figure 13. Mailing 3: Paper survey form (front) ...... 55

Figure 14. Mailing 3: Paper survey form (back) ...... 56

Figure 15. Mailing 3: Frequently Asked Questions (FAQ) brochure ...... 57

Figure 16. Mailing 3: Pre-paid return envelope ...... 58

Figure 17. Mailing 4: Reminder postcard...... 59

Figure 18. Mailing 5: Bi-fold pressure seal mailer (outside) ...... 60

Figure 19. Mailing 5: Bi-fold pressure seal mailer (inside) ...... 61

Figure 20. Initial ACS mail contact methodology ...... 65

Figure 21. ACS mail contact methodology update with additional post card reminder ...... 69

Figure 22. ACS mail contact methodology adding the internet response option ...... 71

Figure 23. Impact of dropping the ACS prenotice: Combining two mailings into a new initial mail package ...... 75

Figure 24. ACS mail contact methodology after dropping CATI ...... 76

Figure 25. Example of a sealed tri-fold PSM ...... 78

Figure 26. ACS cognitive testing envelope - plain ...... 82

Figure 27. ACS Cognitive Testing envelope - graphic ...... 82

Figure 28. ACS cognitive testing envelope back - simple color ...... 83

Figure 29. ACS cognitive testing envelope - full graphics ...... 83

Figure 30. ACS cognitive testing envelope back - colorless text ...... 84

Figure 31. ACS cognitive testing postcard - graphic ...... 84

Figure 32. ACS cognitive testing postcard - plain ...... 85

Figure 33. ACS cognitive testing postcard - formatted ...... 85

Figure 34. ACS “Why We Ask” brochure (front) ...... 87

Figure 35. ACS “Why We Ask” Brochure (back) ...... 88

Figure 36. ACS data slide (unfolded) ...... 90

Figure 37. Example of ACS messaging content ...... 161

Figure 38. Example of coded messages ...... 166

Figure 39. Images of the five mail pieces in Mailing 1 ...... 172

Figure 40. Messages repeated verbatim or paraphrased in the ACS letters and postcard ...... 178

Figure 41. Space (in yellow) used to communicate legal obligation and data security messages in Mailing 1 mail items ...... 182

Figure 42. FAQ Brochure ...... 184

Figure 43. Repetitious content in the multilingual brochure (highlighted in yellow) ...... 185

Figure 44. Repetition in the Frequently Asked Questions (FAQ) brochure ...... 186

Figure 45. Repetition in the first and third mailing instruction cards ...... 187

Figure 46. Security bar and user ID found on ACS PSMs ...... 196

Figure 47. Address label ...... 196

Figure 48. Census Bureau references (highlighted) in the initial Mailing letter ...... 205

Figure 49. Letterhead used in ACS mailing letters ...... 206

Figure 50. Placement of Census Bureau logos on envelopes ...... 209

Figure 51. ACS production envelope callout box ...... 210

Figure 52. ACS outgoing mailing first-class mail postage area ...... 210

Figure 53. Placement of Census Bureau logos and form ID on an ACS letter ...... 212

Figure 54. Census Bureau references (highlighted in yellow) on the front of the paper survey 213

Figure 55. Census Bureau references (highlighted in yellow) on the back of the paper survey 214

Figure 56. Examples of locations of the Census Bureau from internet searches ...... 217

Figure 57. Excerpt of ACS multilingual brochure of full-color graphics ...... 219

Figure 58. ACS FAQ brochure (close-up of graphics)...... 220

Figure 59. Excerpt of FAQ brochure inside American flag graphic ...... 220

Figure 60. Excerpt of the top of the ACS instruction card graphic ...... 221

Figure 61. Excerpt of mailing 3 instruction card to highlight use of icons ...... 221

Figure 62. Excerpt of the survey form to show telephone icon ...... 222

Figure 63. Security bar from ACS PSMs ...... 222

Figure 64. Proposed redesigned ACS mailout methodology ...... 229

Figure 65. Mailing 1: Redesigned PSM (outside) ...... 232

Figure 66. Sight pathway commonly used to read envelopes ...... 233

Figure 67. Current ACS PSM outside and letter with form ID circled in red ...... 235

Figure 68. Mailing 1: Redesigned PSM (inside letter) ...... 238

Figure 69. Mailing 2: Redesigned outgoing envelope (front and back)...... 245

Figure 70. Mailing 2: Redesigned envelope (back) ...... 247

Figure 71. Mailing 2: Redesigned language card (front) ...... 248

Figure 72. Mailing 2: Redesigned backside of the Identification Card ...... 249

Figure 73. Mailing 2: Redesigned Letter (front) ...... 251

Figure 74. Mailing 2: Redesigned Letter (back) ...... 253

Figure 75. Mailing 3: Redesigned outgoing envelope (front) ...... 257

Figure 76. Mailing 3: Redesigned outgoing envelope (back) ...... 259

Figure 77. Mailing 3: Redesigned paper survey form (front) ...... 260

Figure 78. Outdated telephone icon used in current ACS production ...... 262

Figure 79. Mailing 3 Redesigned paper survey form (back) ...... 263

Figure 80. Mailing 3: Redesigned pre-paid return envelope ...... 264

Figure 81. Mailing 3: Redesigned letter ...... 265

Figure 82. Mailing 3: Redesigned letter (back) ...... 268

Figure 83. Mailing 4: Redesigned postcard (front) ...... 271

Figure 84. Mailing 4: Redesigned postcard (back) ...... 272

Figure 85. Mailing 5: New graphic letter ...... 273

Figure 86. Mailing 5: New graphic mailing envelope (front and back) ...... 275

Figure 87. Mailing 5: New graphic letter (back) ...... 276

Figure 88. Mailing 6: Redesigned PSM (outside) ...... 278

Figure 89. Mailing 6: Redesigned PSM (inside) ...... 279


Chapter I. Improving Communications to Households asked to answer the American Community Survey

Purpose

The purpose of this dissertation is to identify ways to improve the effectiveness of the communications used by the U.S. Bureau of the Census to elicit responses to the American

Community Survey (ACS). Although overall response rates to the ACS have not declined dramatically in recent years, as has been the case for many surveys (see Czajka and Beyler

2016), it now takes more effort at greater cost to get a sampled housing unit address to self-respond, and there are growing concerns about gaining survey participation from an increasingly distrustful population.

To improve the communications strategy for the ACS, I evaluate the current methodology for requesting responses and propose a revised set of communication materials based on survey response theories, other literature relevant to survey communications, and a content analysis of current ACS communications presented here. It is hypothesized that use of this comprehensive redesign will improve the speed by which responses are obtained and lower data collection costs by reducing the number of mailings sent to households and costly in-person follow-up interviews. Also, by targeting messaging towards groups with historically low response rates and distrust in surveys, these materials are designed to reduce nonresponse bias, improving overall data quality of the ACS. These recommendations are specifically designed for testing against current ACS communication efforts, but the recommendations developed here will also be applicable to government and non-government surveys that utilize a similar multi-mail-contact web-push methodology.

What is the American Community Survey and why is it Important?

The U.S. Census Bureau's American Community Survey (ACS) may be the most important sample survey in the United States. Prior to becoming an annual survey in 2005, the ACS was known as the "long form" in the decennial census, sent to 1 in 6 housing unit addresses as an alternative to the much shorter standard census form. This "long form" asked questions on housing, education, income, and many other topics not asked in the standard census form. This information was critical for setting national policy and measuring the country's success in improving people's lives. However, because this survey was only conducted every 10 years, the long-form census was not suited for many national policy purposes. The data took up to two years to publish, so up to twelve years could pass before data from the previous census were replaced (U.S.

Census Bureau 2014; U.S. Census Bureau 2017; Hotchkiss and Phelan 2017).

After the 2000 census, a Congressional decision was made that a new survey, the ACS, would replace the long-form census. The ACS was designed to continually produce fresh, up to date estimates for making policy decisions. Rather than being sent to 1 in 6 housing unit addresses every 10 years, the ACS is now sent to 3.5 million housing unit addresses (about 1 in every 35) each year. Response to the ACS remains part of the constitutionally required decennial census, which means that people residing in sampled addresses are required by federal law to respond.1 Because of the number of housing unit addresses surveyed each year, it is possible to estimate with reasonable precision the characteristics of individuals and households in states, cities, metro regions, and large counties. When aggregated across

1 The American Community Survey is authorized by 13 U.S.C. § 141 and 13 U.S.C. § 193.

multiple years, the Census Bureau can produce data on lower-level geographies down to the census tract (U.S. Census Bureau 2014).2

Since the first year ACS data was published in 2006, the ACS has been the nation's primary source of information on the U.S. population, housing, and workforce.3 Today, data from this survey is used in the allocation of over $675 billion in federal and state funding and by community leaders and local governments to plan roads, locate hospitals, prepare for disasters, and much more (Hotchkiss and Phelan 2017). Businesses from Wall Street to Main

Streets throughout the U.S. use ACS data to determine local needs for workforce development.

This survey's data also serve as the main source for benchmarking, verification, and weighting of nearly every survey in the country. Because of these uses, ACS results constitute the most important and widely used data product produced by the U.S. federal statistical system (U.S.

Census Bureau 2017; Hotchkiss and Phelan 2017).

Due to the large-scale operation required to field this survey, the ACS is also expensive, with a budget of over $210 million for 2019 (U.S. Census Bureau 2019). To collect these critical data, the ACS is conducted year-round. Each month, approximately 295,000 housing unit addresses are sampled from the U.S. Census Bureau's Master Address File (MAF), a regularly updated file that contains an accurate inventory of all living quarters in the United States,

Puerto Rico, and other U.S. territories (U.S. Census Bureau 2014).4 Use of this sample frame is

2 The Census Bureau formerly produced one-, three-, and five-year data products from the ACS. Due to budget limitations, the Census Bureau now produces only one- and five-year data products and custom-ordered tables. 3 The ACS survey went to the field in 2005, producing the first usable one-year estimate in 2006. 4 The MAF contains addresses for residential housing as well as group quarters facilities, such as nursing homes, military bases, and prisons. The overall process also includes rural Alaska and Puerto Rico in separate efforts to ensure representation. This dissertation focuses on residential housing in the continental United States, Hawaii, and most of Alaska. It excludes an analysis of the operations used to recruit participants from group quarters, rural Alaska, and Puerto Rico.

necessary to gain acceptable coverage of the U.S. population. Other lists, such as email or telephone lists used to identify potential respondents, would exclude a significant number of households and result in representation problems (Dillman, Christian, and Smyth 2014).
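
As a simple arithmetic check (my own illustration, not a figure reported in the sources cited above), the monthly sample is consistent with the annual total cited earlier:

\[ 295{,}000 \text{ addresses per month} \times 12 \text{ months} \approx 3.54 \text{ million addresses per year,} \]

which matches the approximately 3.5 million housing unit addresses sampled annually.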

To request a self-response, the Census Bureau currently sends sampled residential addresses a series of up to five mailings containing a variety of recruitment materials, including letters, postcards, brochures, and instruction cards. The first and second mailings request that households respond by internet, the third mailing provides a paper survey to complete and return, and the fourth and fifth mailings remind respondents to reply. If these attempts to induce a self-response fail, the ACS methodology switches to in-person follow-up for a sample of non-respondents.5 This methodology produces an overall weighted response rate above 93% (U.S. Census Bureau 2019b). This response rate has remained relatively stable over the past decade due largely to the extensive and expensive follow-up operation. The total self-response rate, i.e., the rate of self-response over the internet or by postal mail, was approximately 64% in 2017 (Baumgardner 2018).6

Challenges facing the American Community Survey

Response rates to federal surveys are on the decline (Czajka and Beyler 2016). To maintain a high response rate, the ACS deploys an expensive data collection methodology, and costs of

5 About 1 in 3 non-respondents remaining after the mail contact period are placed in the nonresponse follow-up (NRFU) sample. 6 This rate includes all sampled housing unit addresses that self-respond. A small portion of this response occurs during the nonresponse follow-up period, i.e., someone who receives a visit from a Census Bureau enumerator but decides to respond on the internet. The Census Bureau does not calculate the response rate for self-response due solely to the self-response mailing materials. Because a small portion of self-response occurs during the follow-up operation, the self-response rate induced by the mail communication materials alone is lower than the overall self-response rate.

data collection operations continue to increase as government budgets become tighter (U.S.

Census Bureau 2017). A high response rate is a valuable measure for assessing data quality and is critically important to the ACS as the nation's premier demographic data source. To accomplish this, the ACS operation must elicit responses from potential respondents across demographic characteristics of race, income, location, occupation, and education. Because of cost and procedural limitations, the ACS sends a single set of English-language recruitment materials to most addresses throughout the US, with some locations receiving alternate or additional materials in Spanish.7 The U.S. population is highly heterogeneous, making it difficult to develop a single set of recruitment materials that is effective for all households. But this is the reality facing the Census Bureau. It is critical that these materials communicate in a way that convinces all types of recipients to respond.

Collecting answers to the ACS is a monumental challenge. Changes in cultural norms and feelings toward survey response have made certain once-critical modes of data collection nearly obsolete. In 2018, the ACS discontinued its use of Computer Assisted Telephone

Interviewing (CATI), whereby ACS non-respondents were called and asked to answer the survey over the phone (U.S. Census Bureau 2018). This data collection mode had been used successfully by the ACS, and for decades more by other surveys, to increase overall response rates

(Dillman 1978; Dillman 2000; Dillman, Smyth, and Christian 2009; Lavrakas, et al. 2017b). The

CATI operation also boosted survey self-response by the internet and mail, as the phone call would act as an additional reminder to respond (Griffin, Fischer, and Morgan 2001; Nichols,

7 This dissertation will focus only on the English Language recruitment materials. Further research and expertise are required to evaluate Spanish and other language supporting materials.

Horwitz, and Tancreto 2013). The rise of cellphone-only households and new cultural norms regarding answering a phone call from an unknown number have diminished the returns of telephone data collection operations (Dutwin and Lavrakas 2016; Lavrakas, et al. 2017a;

Blumberg and Luke 2016; Blumberg and Luke 2019; Kennedy and Hartig 2019; Olson, et al.

2019). For the ACS, CATI became increasingly costly and inefficient and contributed only modestly to the number of responses, with response by CATI decreasing by over 50 percent between 2012 and 2016 (Mills 2016b; U.S. Census Bureau 2018). There also is evidence that the

U.S. population is increasingly distrustful of requests for personal data and distrustful of technological systems that collect and store data (Macro 2009; Conrey, ZuWallack, and Locke

2012; Reingold 2014a; Pew 2019). Lastly, the ACS is a long survey estimated to take on average

40 minutes to complete (U.S. Census Bureau 2014). Some concern exists that the burden potential respondents associate with completing the ACS is a factor in declining response rates (Holzberg, et al. 2018).

These and many other challenges put the Census Bureau in a position of needing to undertake constant innovation in order to maintain data quality in the ACS, achieve acceptable response rates, and maintain or reduce costs. This dissertation is aimed at achieving these objectives by directly addressing some of the challenges facing the ACS. By evaluating and revising mail communication materials, it may be possible to improve participation from a heterogeneous population: messaging can be revised to increase trust in the ACS survey request, clearly communicate the benefits of responding, and reduce the burden people feel toward participating.

Limitations of previous ACS communications research

The Census Bureau is deeply concerned with maintaining and improving ACS response rates and reducing costs of the ACS program (U.S. Census Bureau 2017). One approach to reducing costs is to get households to self-respond to the ACS by the internet early, because they are then removed from receiving additional mailings, which reduces costs to the ACS program. Pushing respondents to the internet, rather than other modes of response, is also important because internet self-responses are cheaper to process than a mail-returned paper survey. Some people either desire or need to respond by paper due to lack of access to the internet in their home. While paper responses are more costly than internet responses, they are still far less costly than in-person or phone-assisted interviews. Providing this second mode in a way that decreases in-person or phone interviews while maintaining a high rate of internet response is also critical to reduce costs to the ACS program. If the yield from web and mail self-response can be improved earlier in the mail contact period, costs for the ACS can likely be reduced by significant amounts.

The process used to elicit self-response to the ACS by mail communications has undergone many changes. Most significantly, in 2013 the option to respond by internet was added to the ACS, and the mail contact materials were updated to offer the internet response mode in the first two mailings and withhold the paper survey form until the third mailing

(Matthews, et al. 2012; Tancreto, et al. 2012; U.S. Census Bureau 2014). This “web-push” methodology, a form of which was originally developed and tested by the Washington State

University Social and Economic Sciences Research Center and adapted through Census Bureau tests for the ACS, is still relatively new in the field of survey methodology (Dillman 2017a). The

aspects of this methodology that produce higher response rates at lower cost are far from settled science, and procedures to optimize this methodology are being developed and refined on a continuous basis.

The Census Bureau has taken a leading role in the innovation of web-push methodologies through consistent experimentation (see Matthews, et al. 2012; Tancreto, et al.

2012; Longsine and Risley 2019). Before a change in the ACS is authorized, experimental survey design methodologies are applied to conduct large mail-out field tests. To facilitate the testing of new methodologies with the ACS, a process called “Methods Panel Testing” was implemented in 2015.8 Each month, the approximately 295,000 housing unit addresses are divided into 24 equal groups called Methods Panels. These Methods Panels can then be assigned different treatments to test a variety of methodological or content changes. This way, the Census Bureau is using in-sample households to conduct ongoing research, rather than devoting additional cases to conduct research. This process allows for additional opportunities to test and innovate the ACS methodology. Using the Methods Panels, each field-test can include well over 100,000 sampled households testing multiple new ideas against the current

ACS production materials. Each test requires months of planning, design, and pre-testing prior to fielding, as well as months of response rate and cost analysis.
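
To illustrate the scale of a single Methods Panel (a back-of-the-envelope calculation of my own, not an official figure), dividing the monthly sample evenly across panels gives:

\[ \frac{295{,}000 \text{ addresses per month}}{24 \text{ panels}} \approx 12{,}300 \text{ addresses per panel per month,} \]

so a test arm that combines even a few panels over several months of data collection quickly exceeds 100,000 sampled addresses.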

For example, based on research conducted for the decennial census in the 1990s

(summarized in Dillman 2000), multiple experiments were conducted to test how best to communicate that responding to the ACS is required by law (Dillman 1996; Oliver, et al. 2017;

8 My position at the Census Bureau is in the Survey Methods and Measures area of the American Community Survey Office in the Methods Panel Coordination Branch. My job is to coordinate the methods panel testing and contribute to the methodological design and experimentation of new ideas for the ACS and the Census Bureau.

Barth, et al. 2016; Oliver, Risley, and Roberts 2016). Each test isolated a change to the mandatory messages included in the ACS mail recruitment materials, such as adding a mandatory message to the outgoing envelope or using bold text to draw attention to a mandatory message in a letter. Methods Panels were randomly assigned different versions of mandatory messaging, with most receiving the current ACS production materials. In this type of testing, inferences can be drawn from the difference in response rates to determine if changes were successful at increasing response rates and reducing overall costs for the ACS program.
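
The underlying logic can be sketched as a standard two-sample comparison of proportions (an illustration of the general approach, not the Census Bureau's specific analysis plan). The effect of a messaging change is estimated as the difference between the treatment and control response rates,

\[ \hat{\Delta} = \hat{p}_T - \hat{p}_C, \qquad SE(\hat{\Delta}) = \sqrt{\frac{\hat{p}_T(1-\hat{p}_T)}{n_T} + \frac{\hat{p}_C(1-\hat{p}_C)}{n_C}}, \]

where \(\hat{p}_T\) and \(\hat{p}_C\) are the observed response rates and \(n_T\) and \(n_C\) are the numbers of sampled addresses in each group. With tens of thousands of addresses per group, differences of a percentage point or less can be statistically distinguishable.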

The results of these tests showed that deemphasizing the mandatory messaging in the

ACS led to decreases in response, while emphasizing the mandatory messages (with bolded font, for example) increased response rates (Oliver, et al. 2017; Barth, et al. 2016; Oliver, Risley, and Roberts 2016). However, it took multiple tests spanning three years to investigate this one change to the ACS program, and findings from these tests had yet to be implemented at the time of this dissertation in 2019. While the Methods Panel facilitates testing within the ACS sample, reducing costs, the process is still slow. Moreover, when this process is used to isolate single factors that may impact survey response, and when multiple tests are required on a single issue, innovation can be extremely slow. In addition, these tests focused narrowly on how the required-by-law message was communicated in ACS mail recruitment letters; they did not investigate how changes to mandatory messaging interacted with other messaging spread across the multiple letters or in supporting materials such as brochures, instruction cards, and the cover of the survey form. As a result, the Census Bureau bases changes to the ACS program on testing that largely focuses on small alterations to existing communication materials. The

testing process does not seem to look, and isn't designed to look, holistically at the mail communication strategy as a whole.

It is possible that the changes to the mandatory messaging could be further optimized if a more holistic approach to experimentation was used. All messaging across the mail communication could have been considered, and the test could have been inspired by survey communication literature rather than small alterations to existing ACS communication materials. This example highlights a concern shared by survey methodologists, as Dillman,

Smyth and Christian (2014) stated, "research has been successful in identifying specific design features that tend to have larger versus smaller effects [on response rates], but generally leaves unanswered how all elements of a design fit together." To provide a case that a thorough review of ACS mail communication materials is warranted, and that a redesign of those materials may increase response rates in early stages of implementation and lower costs, this dissertation will highlight examples from ACS testing that may have resulted in less-than-optimal ACS mail communication materials. For example, the analysis in this project, which evaluated the ACS materials more holistically, found that bolding the "required by law" statements makes sense (as shown in current Census Bureau testing), but that other legal obligation statements are repetitive and may overstate concerns about security, which may lead to decreases in response rates. The current process of incremental change did not, and could not, identify and address this second issue, which the holistic evaluation of messaging used in this dissertation highlighted.

The main problem may be that the current communication materials used to recruit ACS participants are the product of a decade of research prior to the implementation of the

Methods Panel process. After a decade of incremental changes to the ACS based on separate

Methods Panel and non-Methods Panel testing, it is possible that the communications across mailings may not be mutually supportive and effective in the current climate of increased distrust. The Census Bureau has innovated individual elements of the ACS survey methodology, but it has not fully accounted for how all design and messaging elements fit together or whether any problematic gaps in messaging have been created. Basing changes on previous materials may be logical in the short term, but over time it is possible that communication materials become out of date and no longer follow the best practices of survey methodologists, who may be innovating at a faster pace or in new directions. Because the Census Bureau has never conducted a thorough, independent review of ACS communication materials, concern exists that the current ACS communications may not incorporate new and best practices for survey communication.

Improved messaging as a means of improving ACS response rates

To make a meaningful impact on the ACS, this dissertation focuses on one critical aspect of the

ACS methodology: evaluating and proposing changes to the messaging contained in mail recruitment materials. For purposes of this dissertation, messaging is defined as the content of any mail communication piece that communicates information to a potential respondent. Many elements of a communication can send a message, including words, phrases, sentences, graphics, symbols, color, and the overall design of mailing pieces (Dillman and Redline 2004;

Christian and Dillman 2004). Survey methodologists have shown that messaging in survey recruitment materials can impact whether people respond, how they respond, and when they

respond, but research on messaging remains an understudied aspect of survey methodology

(Dillman, Smyth, and Christian 2014; Dillman 2019a; Dillman 2019b).

My goal in this dissertation is to encourage the Census Bureau to design and test entirely new mail communication materials that follow a strategic plan for the messaging content using concepts developed in this dissertation. The challenges facing the ACS, such as increased costs, increased distrust in survey requests and the government, and concerns about respondent burden, are unrelenting. More drastic changes to ACS messaging may be needed than have been suggested by the current process of incremental change.

To address the main challenge of reducing (or at least maintaining) current costs while increasing response rates in the early stages of implementation, I propose a comprehensive evaluation of the messaging contained in the ACS mail communications, and a complete redesign of ACS mail messaging. One of the challenges of working on government surveys, and particularly the ACS, is that the process required to innovate is inherently slow and methodical

(Dillman 1996). The ACS is a highly visible survey because it is expensive and funded by federal tax dollars. The ACS is often vilified on the floors of Congress, receiving a "small but continuous stream of complaints to members of Congress" each year (National

Academies of Sciences, Engineering, and Medicine 2016). The ACS is also regularly defended, inasmuch as it provides vital information for policy makers, businesses and local leaders that is used to allocate federal funds and enforce certain laws.

The large sample sizes and monthly data collection allow the ACS to produce yearly and multi-year data products used for a variety of critical governmental and business needs. To conduct such a survey, the ACS requires a complex survey operation with over 100 full-time

staff at the Census Bureau Headquarters, and hundreds more working across the U.S. at the Census Bureau National Processing Center (NPC), six regional offices, and two telephone call centers. Because the ACS is expensive to conduct, it is also expensive to change. Due to the high-profile and critical uses of the ACS, proposing changes to any aspect of the ACS operation is not a simple task. All ideas hypothesized to improve response rates that involve a change to any aspect of the ACS operation must be thoroughly researched, conceptualized, and vetted prior to implementation.

This dissertation research is a first step towards significantly changing the ACS recruitment mailings in a holistic way. Projects focused on a single factor involve a multi-year process of background research, materials development, cognitive testing and pretesting, redesign, time for public comment and expert review, experimental field testing, additional months of response rate and cost analyses, and the drafting, review, and publication of final reports and briefings prior to implementing changes to any aspect of the survey operation.

Once a project to improve the ACS is conceived, as is the goal of this dissertation, developing new materials and cognitive and field testing the ideas in this dissertation will cost well over a million dollars, take multiple years of pretesting and development, and will likely include multiple rounds of nation-wide field testing, with each round including over 100,000 sampled housing unit addresses.

The ACS is in many ways a unique survey. It is the only nation-wide sample survey of its kind mandated by law as part of the Census Bureau's mission. However, the findings of this dissertation may be relevant for other national surveys. Many surveys have implemented a similar web-push, mixed-mode survey design, which involves sending multiple mail contacts to

13 sampled housing unit addresses that introduce a web mode of survey response before following up with reminders and introduction of additional survey collection modes (Dillman

2017a; Olson, et al. 2019). This is largely a conceptual project, but it is grounded in the specific, real-world issues facing the ACS. As the flagship survey of the Census Bureau, many other surveys (government, academic, and private) look to the ACS for guidance. By grounding the work in real problems facing a highly visible survey, the results from this dissertation will speak directly to a wide audience of survey methodologists, sociologists, and data users beyond the

Census Bureau.

By examining and seeking to improve such a prominent survey, this dissertation may help create new knowledge while building upon previous sociological work on survey design and practice. This dissertation is facilitated by my knowledge gained from my current position at the U.S. Census Bureau on the Survey Methods and Measures staff in the American

Community Survey Office. While any researcher could feasibly access the same mail communication materials, conduct their own evaluation of ACS materials, and make their own recommendations, my current position at the Census Bureau helps provide perspective on operations associated with the ACS. Because of my current role at the Census Bureau, and the detailed knowledge of procedures I have accumulated since beginning work there, I am optimistic that the recommendations that come from this dissertation will be considered for further testing.

Some aspects of the research reported here have already been incorporated into changes to the ACS. Moving these ideas forward has involved a team of researchers to operationalize the recommendations in this dissertation, and others, into functional changes

into the ACS materials. Currently, the Census Bureau has a multi-year plan to use findings from this research along with insights from a larger team of Census Bureau staff to develop cognitive and field tests of new mail contact materials and messaging. While this project will result in years of future work to test and implement, and budgets are never certain in the government, I have the support of the Census Bureau to attempt to contribute to the evaluation and redesign of ACS materials.9

Why is Sociology Relevant to Improving ACS Methodology?

Maintaining or improving response rates and data quality of surveys, particularly the ACS, is an important topic that advances the field of sociology. The discipline of sociology studies the causes and consequences of human interaction and how living within a social world influences and shapes the behavior, opinions, and life chances of individuals and groups. Sociology is one of several social science disciplines that seek to understand these phenomena. The field draws much of its strength for explaining behaviors from knowledge of people's demographic characteristics and group affiliations that shape our relationships with one another and our social, economic, and political positions in society.

The process of gaining survey response is, at its core, a sociological issue. Responding or not responding to a survey is a behavior that can, to some extent, be explained by people's life situation and demographics. In addition, responding to a government survey is of interest to sociology because this action involves attempts by a societal institution (government) to explain

9 Work at the Census Bureau is collaborative. Many ideas in this dissertation have benefited from numerous collaborations with Census Bureau staff (particularly Broderick Oliver, Elizabeth Poehler, Sarah Heimel, and Dorothy Barth), as well as from consulting on other survey operations with survey methodologists and working with sociologists through the American Association for Public Opinion Research (AAPOR).

the need for a response and convince individuals of all backgrounds and social situations to partake in a desired action (respond to the survey honestly and promptly).

Many sociologists have contributed to our understanding of survey behavior and the impact of survey methodology, as evidenced by articles published in the nation’s premier sociology journals (see Dillman, et al. 1974; Dillman 1991; Dillman, Smyth, and Christian 2014;

Heberlein and Baumgartner 1978; Singer 1978; Singer and Frankel 1982; Groves, Singer, and

Corning 2000; Schaeffer and Presser 2003). Sociologists have contributed extensively to theoretical attempts to identify attitudes and behaviors that are likely to be associated with responding to surveys. The two most prominent theories in survey methodology were both developed by sociologists. Leverage-Saliency Theory (Groves, Singer, and Corning 2000) provides guidance on individual elements of survey communication that impact response, and on how important it is for survey communications to make different points of leverage salient in order to gain response from a large, unbiased portion of the sample. Social Exchange Theory

(Dillman 1978; Dillman 2000; Dillman, Smyth, and Christian 2014) builds on the work of sociologists George Homans (1961) and Peter Blau (1964). This theory provides a framework for explicitly identifying elements of survey design that impact response by using communications to build trust in the survey request, effectively communicate the benefits of survey response, and reduce a sense of burden toward responding to the survey.

Beyond the direct impact on the ACS, this dissertation contributes to the further development of sociological thought on explaining survey response, applied to a problem that is central to societal governance and the production of unbiased, high-quality data. By improving the ACS through the application of concepts developed by sociologists, this

dissertation will advance this sociological work while simultaneously ensuring that surveys like the ACS continue to provide sociologists, social scientists, demographers, and policy makers with high-quality, unbiased data.

Conclusion and Overview of remaining chapters

In this chapter, I introduced the ACS as the most critical survey to sociologists, policy analysts, business leaders, government officials, and community leaders. While the ACS receives relatively strong response rates, gaining a survey response is increasingly costly as feelings of distrust and changes to social norms are making it harder, and more expensive, to recruit participants each year. The Census Bureau has a process in place to innovate the ACS, but this process may be too slow to stay ahead of the curve of declining response. Changes to the ACS are expensive, and even small changes to the ACS program can have an impact on critical data used by scholars, policy makers, and businesses around the country. A thorough case must be made before recommending large-scale changes to the ACS operation. Any suggested changes must also be extensively researched and justified. In this dissertation I propose new ideas to innovate the recruitment of ACS participants to self-respond. To thoroughly make my case that the ACS would benefit from redesigned communications, this dissertation will do the following:

Chapter 2: A survey operation as large as the ACS has many factors that can be changed to potentially increase response. A single project cannot address them all. In chapter two, I make the case to focus on “messaging” changes to the ACS communication materials because messaging intersects with most of the factors that impact response decisions and directly addresses one of the main sources of survey error. By focusing on messaging, this dissertation

can make practical, actionable recommendations to the Census Bureau to improve the ACS methodology to increase response rates and reduce nonresponse bias while maintaining costs for the ACS program.

Chapter 3: In chapter three, I provide an overview of the current mail communication strategy used for the ACS and the history of incremental research that led to this current methodology.

By understanding the history of ACS innovation, I make the case that while the ACS has made changes to increase response and reduce costs, the method of change has been incremental and path-dependent and may have left gaps in the communication strategy. This review also shows that the Census Bureau has not conducted a thorough review of how ACS mail communication materials work (or don’t work) together as a cohesive communication strategy.

Chapter 4: To break the cycle of incremental, path-dependent innovation based on changes to previous versions of ACS materials, chapter 4 develops an independent theoretical framework for messaging based on a review of relevant literature for survey communications and on the mindsets of potential ACS respondents. This chapter concludes by developing a list of recommendations for ACS communications based on the literature reviewed.

Chapter 5: Chapter 5 will use the theoretical framework developed in chapter 4 to analyze the messaging in the current ACS communication materials described in chapter 3. Making changes to the ACS is expensive, so it is important to know what changes are necessary, if any. If the ACS mail materials follow the recommendations developed in chapter 4, a complete redesign of ACS mail communication materials may not be necessary. This chapter concludes by presenting findings from this analysis that suggest that the current ACS communication materials do not follow recommendations developed in this dissertation, warranting a complete redesign.

Chapter 6: In Chapter 6 I present a suite of new ACS mail communication materials based on recommendations from the literature reviewed in chapter 4 and the findings from the evaluation of current ACS materials from chapter 5. These materials are hypothesized to increase overall self-response to the ACS, reduce nonresponse bias by promoting response from a more heterogeneous portion of the sampled population, and reduce costs to survey operations by increasing early response and reducing the cost of the expensive follow-up operation.

Chapter II. Understanding survey response

How research on messaging can help reduce survey errors

All sample surveys attempt to simultaneously overcome four major sources of potential survey error (see Groves 1989; Groves, et al. 2004). My focus on messaging in this dissertation means I will not be dealing with all four sources equally or in depth. I present them here in order to convey the general context that requires giving attention to messaging, as described later in this chapter.

Sampling error occurs in all sample surveys when random samples of a population are used to estimate the characteristics of a larger population. Large sample sizes and randomization can decrease this error. While it can never be eliminated, it can be mathematically accounted for with reasonable certainty (Lohr 2008). Redesigns that reduce sampling error are clearly beyond the scope of this dissertation.
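
To illustrate why larger samples reduce this error (a standard result from sampling theory, included only as an illustration; the ACS's actual variance estimation reflects a more complex design), the standard error of an estimated proportion \(\hat{p}\) from a simple random sample of size n is approximately

\[ SE(\hat{p}) \approx \sqrt{\frac{\hat{p}(1-\hat{p})}{n}}, \]

so quadrupling the sample size cuts this component of error roughly in half.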

Coverage error occurs when a sample population doesn't accurately represent the total population of interest because all members of the population (housing unit addresses) are not given a known and nonzero opportunity to be included in the sample. For example, a phone survey in California could use a land-line telephone directory as the frame from which to sample people into the survey (see Groves and Jahn 1979; Lepkowski, et al. 2007; Steeh 2008).

Coverage error would occur if the data from this survey were used to draw inferences on the entire population of California. This sample frame based on a list of land-line telephone numbers does not "cover" the entire population of California housing unit addresses, so data from this survey could not estimate the total population, just those with landline telephone numbers. Error is introduced when inferences are made to the entire population and non-

phone households are systematically different from land-line households in a way that biases responses to survey questions.10 For the ACS, the Census Bureau uses the MAF of residential addresses, which it continuously updates. There appears to be no alternative to relying on postal contact of those addresses, and this decision does not involve messaging, so this source of error is also outside the scope of this dissertation.

Measurement error occurs when questions from a survey do not accurately measure the concepts they attempt to measure. The bulk of research in survey methodology deals with reducing this source of error through improvements to question wording and survey design

(see Schuman and Presser 1996; Tourangeau, Rips, and Rasinski 2000; Dillman, Smyth, and

Christian 2014). A secondary goal of improved messaging is to reduce the likelihood that individuals will skip answering certain questions or provide untruthful answers. This concern suggests that the connectivity among aspects of the messaging process must create trust that improves the willingness and ability of people to answer the ACS truthfully for their household.

Nonresponse error can occur when nonrespondents to a survey request are systematically different from respondents in a way that biases the data collected (Groves 2006; Groves and

Peytcheva 2008; Lynn 2008). For example, if a survey claims to represent a total population but uses recruitment materials written at a reading level appropriate for persons with a college degree, the households that choose to respond may differ systematically, by level of education, from those that do not respond. In this case, all households were covered by the sample strategy, but because the materials could not easily be read or understood by all sample

10 For example, younger people are less likely to have land lines, so a sample drawn from a list of land-line telephone numbers could bias estimates on many topics where age is a factor.

members, some may become frustrated or confused and choose not to respond.

Nonresponse bias happens when the characteristics of households that respond are different from households that choose not to respond in a way that affects sample survey estimates.
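
A standard textbook expression (my own addition, not drawn from the sources cited above) makes this relationship concrete. For a mean computed from respondents only,

\[ \bar{y}_r - \bar{y} = \frac{n_{nr}}{n}\left(\bar{y}_r - \bar{y}_{nr}\right), \]

where \(n_{nr}/n\) is the nonresponse rate in the sample and \(\bar{y}_r\) and \(\bar{y}_{nr}\) are the means among respondents and nonrespondents. The bias is large only when nonresponse is common and respondents differ systematically from nonrespondents on the variable of interest.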

Reducing nonresponse error is the central focus of the recommendations for messaging developed in this dissertation, as improved messaging may be able to encourage all types of residential households to respond, to respond more quickly (reducing costs), and to respond truthfully. Aside from some Spanish-language materials sent to known Spanish-speaking areas, the ACS sends the same set of English-language contact materials to all sampled housing unit addresses. The challenge for the ACS is that this one set of English-language communication materials must elicit truthful responses from a heterogeneous U.S. population without introducing nonresponse bias.

The challenges of assuring low measurement error and avoiding nonresponse error are not unique to the ACS. All surveys, regardless of their population of interest, must recruit participants who exhibit some degree of heterogeneity. As the largest and most widely used nationally representative survey in the United States, the ACS has a more important and difficult challenge to reduce nonresponse bias. How does the ACS get people to take its survey? More broadly, why would anyone voluntarily participate in any survey?

Addressing Survey Response through Messaging.

No single, unified theory on why people choose to take surveys now exists. However, sociologist and survey methodologist Don Dillman (2019a) provides a list of seven factors involved in a survey request that influence a person's response propensity. These are: 1) Survey Mode, 2)

Sponsorship, 3) Task-Burden, 4) Incentives, 5) Structure of Survey Request, 6) Content of

Communications, and 7) Respondent Attributes. It is impossible to cover all factors that impact survey response in a single dissertation. Messaging, however, is a particularly useful aspect of survey methodology to study because it is both understudied in the literature and intersects with most of the factors that can potentially impact survey response propensity. In this section,

I review each of the seven factors that impact survey response and explain how each intersects with survey messaging, providing evidence that changes to ACS messaging can be an impactful way to improve ACS response rates.

Survey Mode

Surveys can be administered in a variety of modes. Some surveys ask respondents to self-complete the survey on a paper questionnaire or an internet survey instrument. Surveys can also be administered in assisted modes of data collection, such as respondents completing a survey over the phone or in person with a trained survey enumerator (de Leeuw 2005). The survey mode (or modes) offered to a potential respondent can impact their ability and desire to participate in the survey request (Olson, Smyth, and Wood 2012; Olson, et al. 2019). For example, while an internet-only survey may be easy and convenient for some respondents, it is inaccessible to persons who do not have access to the internet, difficult for persons who lack proficiency with computers and internet devices, and not appealing to people with low levels of trust in computer technology (Dillman, Smyth, and Christian 2014).

For much of the past century, survey operators would use a single survey mode to collect data from their target population. Today, many survey operations, including the ACS, offer surveys to potential respondents in multiple modes, often referred to as “mixed mode surveys” (see Dillman, Smyth, and Christian 2009; Dillman 2017a; Lavrakas, et al. 2017a; Olson,

et al. 2019). Mixed-mode designs help gain responses from potential participants who cannot respond to a single-mode survey (e.g., people without internet access). They can also reduce the cost or burden of a survey request, as people generally do not want to be forced to complete a survey in a mode they are not comfortable with, are unfamiliar with, or distrust. Evidence suggests that mixed-mode surveys increase response and may reduce nonresponse bias by gaining responses from more people as well as from people with different demographics and life situations (Smyth, et al. 2010; Messer and Dillman 2011).

Survey practitioners are aware that multi-mode surveys can benefit response rates but may also introduce mode differences that can impact data quality. For example, people may be more willing to share sensitive information on a self-administered survey form than to a person in a face-to-face interview (Bowling 2005; Tourangeau and Yan 2007; Kreuter, et al.

2008; Preisendorfer and Wolter 2014). However, self-administered surveys can have higher rates of item-nonresponse than surveys conducted with the assistance of an interviewer

(Heerwegh and Loosveldt 2008; Heerwegh 2009; Klausch, Hox and Schouten 2013; Breton, et al.

2017). If a survey uses both modes and combines data into a single data set, a mode-difference effect may be introduced into the data. Managing the impact of mode differences is possible through survey design practices (see Christian 2007; de Leeuw 2005; de Leeuw 2008; de Leeuw,

Hox, and Dillman 2008; Olson, et al. 2019). The impact of mode differences may be tolerable due to increases in response rates that come from providing multiple response modes (de

Leeuw, Hox, and Dillman 2008).

Suggesting new or different survey modes for the ACS is beyond the scope of this dissertation. The ACS is already a complex multi-mode survey that provides self-response

options by internet and paper questionnaire, and with assistance through a telephone call center and in-person home visits (see Chapter 3). The messaging contained in mail recruitment materials that informs potential respondents of the multiple response modes available, and the sequencing of the information that communicates modes of response, is an issue within the scope of this research. For example, the ACS materials offer households the internet response option in the first two mailings and then add the option to respond by paper survey in the third mailing. Respondents can also call the Census Bureau help line to complete the survey with a trained enumerator, but this is not explicitly communicated until it is briefly mentioned in the fifth and final mailing.11 The decisions on the sequence and messaging of presenting response modes to potential respondents are an aspect of messaging that is within the scope of this dissertation study.

Sponsorship

Surveys can be sent to potential respondents by institutions, organizations, businesses, governments, or individuals. The organization or entity that conducts, authorizes, or endorses a survey is referred to as the survey sponsor. Research has shown that the trust and name recognition of the sponsor and the type of sponsor (government agency, non-profit organization, business, etc.) can all impact response rates and nonresponse bias (Heberlein and Baumgartner 1978; Presser, Blair, and Triplett 1992; Groves, et al. 2006; Groves, et al.

2012; Jones 2012; Dillman, Smyth, and Christian 2014). For example, a university affiliated research institution may see an increase in response to a survey request when leveraging the

11 This is different from the CATI operation that made outgoing calls to non-respondents. Incoming callers can still complete the survey with a Census Bureau enumerator.

brand and trust recognition of the university (Edwards, Dillman, and Smyth 2014). However, attaching an environmental survey to a known and trusted environmental group (Greenpeace, for example) may increase response rates, but primarily among those who trust and value the organization. While response rates increase, this decision may skew the segment of the population that responds toward respondents who are sympathetic to Greenpeace and share the organization’s environmental views.

Sponsorship is a complex issue for survey practitioners. One consistent finding across decades of study on sponsorship is that government surveys tend to benefit from their connection to the U.S. government and government agencies (Heberlein and Baumgartner

1978; Brick and Williams 2013). Even though government surveys have seen a decline in response over recent years (see Czajka and Beyler 2016), the general public still has a high level of trust in the government’s authority to conduct surveys (Brick and Williams 2013; National

Research Council 2013b). For example, a potential respondent may distrust a survey from a marketing company asking for detailed income information, especially if the survey is conducted online (Dillman 2019a). However, the same person may readily provide this information if the survey is sponsored by the U.S. government. It is possible that the government-sponsored survey would outperform the marketing survey based on the government sponsorship. The rate of response may also vary by the sponsoring agency, with a survey sponsored by the widely known Internal Revenue Service (IRS) potentially outperforming a survey sponsored by the lesser-known Bureau of Labor Statistics. Moreover, people who trust a survey request may provide more truthful responses, reducing measurement error.

Research suggests that the U.S. Census Bureau is seen as a trusted and known sponsor, and garners higher rates of trust than the U.S. government in general (Brick and Williams 2013;

Schwede 2013). It is beyond the scope, and abilities, of this dissertation to suggest a different survey sponsor for the ACS. What is within scope is how Census Bureau sponsorship is communicated in the ACS mail recruitment communications. Because the Census Bureau is a widely known and trusted sponsor, it is important that this sponsorship is communicated consistently and clearly to potential respondents.

Burden of the Response Task (task-burden or respondent burden)

The task of completing a survey (sometimes called task-burden or respondent burden) can be defined as “the product of an interaction between the nature of the task and the way it is perceived by the respondent” (Bradburn 1978, p. 36). The “nature” of a survey task involves two key elements: the length of the survey and the level of effort required to complete the survey questions. While both factors are important, survey length and the total number of survey questions are often seen as a direct measure of survey burden (National Research Council

2013b).12 The federal statistical system recognizes burden in the Paperwork Reduction Act, which requires that all surveys report the “paperwork burden” of the survey along with a time estimate in minutes for the survey task (U.S. Office of Personnel Management 2011). Evidence of the impact of survey length on response rates is not unanimous, but most studies suggest that shorter surveys (measured in time to complete, number of questions, or number of

12 Survey length can be measured in many ways. Often, it is measured as the estimated time in minutes to complete the survey. It can also be measured by the number of pages in a paper questionnaire or the number of screens in an internet survey.

pages/screens) have increased response rates (Heberlein and Baumgartner,

Skinner, and Childers 1991; Galesic 2006; Dillman 2000; Galesic and Bosnjak 2009; Rolstad,

Adler, and Ryden 2011) and produce lower rates of drop-offs and nonresponses (Dillman,

Smyth, and Christian 2014). In cognitive interviews, most respondents admit that shorter surveys are also perceived as being less burdensome (Fricker, Kreisler, and Tan 2012; Yu,

Fricker, and Kopp 2015; Fricker, Gonzalez, and Tan 2011; Fricker, Yan, and Tsai 2014).

Survey length is not the only factor of respondent burden (Dillman, Sinclair and Clark

1993; Tortora 2017). Bradburn (1978) also references the stress on the respondent as a factor of survey burden. Many factors can increase the stress on a respondent, such as the complexity and invasiveness of survey questions. Studies have found that the type of questions in a survey (e.g., open-ended versus closed-ended) can impact task-burden and increase the chances of survey nonresponse (Galesic 2006). Complex surveys requiring respondents to use difficult reference periods in questions (Fricker 2012), or to consult personal records as part of a survey task

(Phipps 2014; Hedlin, et al. 2005; Yang 2015) also increase task-burden. A survey that asks difficult or sensitive questions may result in fewer responses or increased drop-offs compared to a longer but less sensitive or invasive survey (Tourangeau and Yan 2007; Robins, et al. 2016;

Kaplan and Fricker 2017).

Multiple factors impact the burden a respondent feels a survey represents (Dillman,

Sinclair, and Clark 1993; Tortora 2017). Because not all respondents calculate burden the same way for each survey request, it is important for survey practitioners to understand how burden is perceived and to plan messaging that minimizes the impact of response task-burden on response rates. The ACS has many factors that can increase burden on responding

households. The survey is long, with an estimated completion time of 40 minutes for an average household (U.S. Census Bureau 2014). As a household survey, a resident of each sampled housing unit address must complete the survey for all inhabitants, and some complex living situations with a larger number of family members or multiple unrelated persons can increase the burden of the response task. The current ACS paper questionnaire is 28 pages long and asks detailed questions that some may find burdensome (Holzberg, et al. 2018). For example, the ACS income questions are routinely flagged as causing burden on respondents (Raglin 2014) and are the focus of Census Bureau research on how to best use administrative records to reduce this burden (see Griffin 2014; Velkoff, et al. 2018; Velkoff and Ortman 2018; Dillon, et al. 2018; Clark, et al. 2018; Keller 2018; Fernandez, Shattuck, and Noon 2018). Each question on the ACS serves a purpose for federal government and policy requirements, and the Census

Bureau has a formal, multi-year process to conduct a review of the survey questions to reduce burden.13 Because the content of the survey form does not involve survey messaging as defined in this project, and because the process to change content already exists, issues of direct survey task-burden are beyond the scope of this dissertation.

However, response burden is not only related to survey length and question difficulty.

One area within scope for this dissertation is the perception of burden people feel towards a survey, and how this perception of burden can be minimized through changes to messaging in survey recruitment communications. Messaging in survey recruitment materials can shape the

13 For example, the Census Bureau recently tested the potential redundancy of new detailed race reporting options and the already-existing question on the ancestry of household members (Mills, Heimel, and Buchanan 2019; Terry, et al. 2019). If found redundant, it is possible to drop one of the questions to reduce the length of the survey and remove the confusion of asking redundant questions.

perception of burden that a potential respondent feels toward a survey task (Haraldsen 2004;

Hedlin, et al. 2005; Fricker 2016). For example, research suggests that respondents are more likely to complete a survey if they find the topic interesting or salient to them (Bogen 1996;

Cook, Heath, and Thompson 2000; Groves, et al. 2000; Galesic 2006; Kaplan and Fricker 2017;

Dillman 2019a). Before potential respondents see a paper questionnaire or internet survey instrument and can judge whether the questions or topics interest them, they usually have the opportunity to read an invitation letter describing the survey request. How the survey is explained in these recruitment communications can make a survey appear more relevant or interesting to a potential respondent, and tasks that are enjoyable are perceived as less burdensome. In turn, response rates may increase as respondents feel an interesting and important survey is less of a burden. Messaging in survey recruitment communications can also reduce burden by providing clear, easy-to-read instructions on how to complete the survey.

While changing the questions contained in the ACS is beyond the scope of this project, the messaging content of recruitment materials that can frame the ACS task in ways that may reduce the perception of burden is within the scope of this project.

Incentives

The use of incentives has consistently been shown to increase response rates to survey requests (Church 1993; Singer, et al. 1999; Lesser, et al. 2002; Singer and Ye 2013). Financial incentives can also be effective at reducing nonresponse bias (Lesser, et al. 2002). The presence, timing, and type of incentive are all important factors that can impact the effectiveness of the incentive in boosting response and potentially reducing nonresponse bias.

Research has shown that monetary incentives, especially in cash form, work better than non-monetary or other forms of incentives. Offering raffles (i.e., completing the survey for a chance to win a prize) and non-monetary rewards such as stickers has not been shown to consistently improve response rates; such incentives have inconclusive effects on survey response (Warriner, et al. 1996; Lesser, et al. 2002; Birnholtz, et al. 2004). When effective, they have worked in the context of specific populations with specific rewards. For example, a survey of nurses recently included a lanyard with the words “Being a nurse is my super power” embroidered on the fabric. Many nurses wear lanyards to hold their IDs and key cards, and the specific message was shown to be appealing. The lanyard was well received, and it may have increased response (Stepler, et al. 2019).

Monetary incentives are consistently shown to boost response rates. The higher the monetary incentive, the bigger the boost in response, but the relationship is not linear; returns diminish as the incentive grows, and this finding is not consistent across all studies (James and Bolstein 1992).

The timing of the monetary incentive is also important. While many surveys provide a monetary incentive upon survey completion, research suggests that a pre-incentive – a small token amount of money (as little as a dollar or two) provided in a survey invitation without any requirement to complete the survey – can be a powerful trust-building mechanism that drives survey response (Trussell and Lavrakas 2004; Dillman, Smyth, and Christian 2014). A token incentive may be able to elicit a sense of reciprocity, which sociologist Alvin Ward Gouldner defines as “a mutually contingent exchange of gratification” (1960, p. 161). Providing the token incentive before the survey is completed can build a sense of goodwill, trust, and obligation that is returned to the survey organization when the sampled member completes the survey

(Dillman, Smyth, and Christian 2014). A token pre-incentive can produce a larger impact on response rates than much larger sums of money promised after survey completion. Because the ACS is a mandatory U.S. government survey, the Census Bureau is not permitted to provide monetary incentives to respondents. Even if providing a monetary incentive were permitted for a mandatory government survey, the presence of an incentive might send a mixed message to respondents. Respondents could wonder why they are being paid to do something that is required by law, making it possible that a monetary incentive in a mandatory survey may communicate that the survey response is voluntary (Dillman 2019a).

The most powerful incentive to participate in the ACS is that doing so complies with the law. People are motivated to follow the law, and communicating an obligation to reply to a survey request can significantly increase response rates (Edwards, et al. 2009). Not surprisingly, mandatory surveys receive higher response rates than non-mandatory surveys

(Dillman 2000; Phipps 2014). However, being required by law does not mean that all sampled households respond to the ACS. While the ACS publishes a 94% total weighted response rate, only about 63% of sampled households respond during the self-response period, with additional households responding in more expensive follow-up response operations (Baumgardner 2018).

Census Bureau research has shown that response rates to the ACS have fluctuated depending on how the mandatory status of the survey is communicated (Oliver, et al. 2017; Barth, et al.

2016; Oliver, Risley, and Roberts 2016). It is clear that the ACS being “required by law” is an incentive that impacts response, but the messaging that relays this information to potential respondents – how the message is written, in which mail materials it is placed, and how it is

emphasized – also matters. The decision on where, when, and how this mandatory messaging is communicated across multiple mailings and mail pieces is within the scope of this dissertation.

Structuring the Request to Respond

Just as there are many modes in which someone can respond to a survey, there are also multiple ways to contact sample members with a survey request. How recruitment contacts are structured, including the number, mode, timing (number of days between contacts), and format (e.g., letter vs. postcard) of contacts, can impact response propensities

(Dillman, Smyth, and Christian 2014).

Increasing the number of contacts with a sampled respondent will increase the chance that the sampled person or household responds to a survey request and may even reduce nonresponse bias by gaining compliance from uninterested sample members (Roose, Levens, and Waege 2007). Sending multiple contacts is one of the most consistent and impactful factors that can increase survey response propensity (Scott 1961; Dillman, Smyth, and Christian 2014).

Today, many survey operations use a contact strategy with multiple contacts to convince a sample member to participate. However, it is possible that too many contacts will overwhelm, burden, or anger respondents (see Zelenak and Davis 2013; Dillman, Smyth, and Christian

2014).

Survey methodologists are acutely aware that the population is increasingly inundated with survey requests. Federal government survey requests increased by 50% between 1984 and 2004 (Presser and McCulloch 2011). In addition, marketing and customer experience surveys are increasingly fielded. Some people report receiving a survey request of some kind every day (Holzberg, et al. 2018). The growing number of survey requests a person

receives may create an “over-surveying effect” that leads to “survey fatigue” (Groves and Couper

1998; Olson 2014). It is important for survey practitioners to write survey invitation materials in a way that sets their survey apart from other requests (Tourangeau and Ye 2009). And while response rates increase with the frequency of contacts, survey practitioners must remain aware of the impact of such over-surveying and be mindful of the total number of survey requests sent for their survey.

Multiple decisions must be made on how to structure multiple contacts with potential survey respondents. Depending on the survey frame, the mode of contact can vary or stay the same across contacts with potential survey participants. In-person contacts, emails, and phone calls are possible modes of contact, but not all survey operations can utilize each of these contact modes. While an email list may be available for certain populations (such as surveys of college students required to use a university email address, or of employees of a business), email contact is not viable in many survey settings. Changing norms, caller-ID technology, and cell phones replacing landlines have combined to decrease people’s willingness to pick up phone calls from unknown numbers, making phone contact for survey recruitment increasingly difficult (Dillman, Smyth, and Christian 2014). Moreover, norms of politeness in the mid-20th century may have contributed to higher agreement to survey requests, and as those norms have faded, people may be emboldened to decline (Dillman

2019a).

It is important to remember that the mode of contact does not have to match the mode of survey response (Messer and Dillman 2011; Millar and Dillman 2011). When available, sending a mail contact letter or postcard as a pre-notice to announce an upcoming phone,

internet, or in-person survey is a powerful way to establish legitimacy and build trust with respondents (Dillman, Smyth, and Christian 2014). Using multiple contact modes in a single survey design can also effectively increase response, as in the case of email augmentation methodologies, where email reminders are sent after an initial regular-mail survey invitation (Messer and Dillman 2011; Millar and Dillman 2011). While timely email follow-ups have been shown to be effective following an initial mail survey announcement

(Messer and Dillman 2011; Millar and Dillman 2011), there is no feasible way for the ACS to implement email contact at this time.

Timing of mail contacts is also an important factor. A postcard reminder timed closely after a paper survey mailing package can remind an individual to respond to a survey before the paper survey form is misplaced or thrown away. In other cases, spacing reminders further apart may effectively extend the window for gaining responses and increase the overall likelihood of gaining a survey response (de Leeuw and Hox 2008; Clark, et al. 2015; Clark and

Roberts 2015; Dillman, Smyth, and Christian 2014).

The format of a mailing can also communicate a message to a potential respondent. For example, the physical presence of a letter sent to respondents in a first-class mailing envelope, including a professional informational brochure, may communicate a higher level of importance than an inexpensive postcard. These mailings cost more than postcard mailings, however, and too many inserts can annoy or dismay respondents without improving response rates (Heimel, Barth, and Rabe 2017; Dillman 2019a). Survey operations should vary the mailing format of survey communications so that each contact appears to be something new. While letters are useful for communicating the importance of a survey request, a postcard may be more useful for a

reminder mailing, because postcards have the added benefit that they do not need to be opened for their content to be seen. New technologies such as Pressure Seal Mailers (PSMs) – a single piece of paper that is folded and sealed by a special machine so that the outside of the paper becomes an envelope and the inside a letter – can add intrigue to survey requests (see Risley, et al. 2018). No single format is inherently better than the others, and it would be as unwise for a survey to send three identical postcards as it would be to send three identical envelopes and letters. It is important for survey practitioners to consider how best to mix the available mailing formats in a single mail communication strategy.

There is not a “one size fits all” standard for how to structure the number, mode, timing, and format of multiple mail contacts in a survey operation. However, the structure of the contact strategy will impact response rates. The number, timing, and format of mailings are factors that send a message to sampled housing unit addresses, making these decisions within the scope of this project.

Communication Content

Many survey operations, including the ACS, ask respondents to self-respond to a survey request through mailed, written communications. The messaging content of these communications has a direct impact on the response propensity of a survey request (Dillman, Smyth, and Christian

2014). Survey organizations must make decisions on what exactly to say in each contact with respondents, with each communication representing an opportunity to convince additional people to respond to the survey request. A survey organization may feel the need to communicate many messages in these contacts, including: why a survey is being conducted, how the survey recipient was selected, the sponsor of the survey, how survey results will be

used, how confidentiality and survey response data are protected, if and how respondents will be remunerated, how to take the survey, how to access a web instrument (URLs and access codes), and how to get additional information about the survey (Dillman, 2019a). More messages may be required by the rules of the survey organization, by an Institutional Review Board (IRB) or, in the case of government surveys, by the Office of Management and Budget (OMB), including contact information for the survey organization and government oversight authorities.

A message can be communicated with words, symbols, graphics, and numbers written on a page (Dillman and Redline 2004; Christian and Dillman 2004). Other aspects of the communication can also send a message to respondents. The color, size, and weight of the paper, the quality of the printing or materials in a mail contact, the use of embossed paper or official letterhead, the type and size of envelopes or postcards, and more can communicate messages of formality, trust, officialness, and importance of the survey request to potential respondents. Compared to other factors that impact survey response, the attention paid to the messages contained in survey requests is lacking (Dillman 2019a). As is the case with other factors that impact survey response, something is known about how individual elements of communication can impact response rates, but a single theory on how multiple messages fit together across mail pieces to consistently produce the best survey response rates is not settled science (Dillman, Smyth, and Christian 2014). The content of the messaging contained in the ACS mail recruitment materials is the focus of this dissertation, making nearly all aspects of the communication content of the survey request materials within scope for this dissertation.

Attributes of Potential Respondents

Survey methodologists and practitioners have long been aware of the impact that demographic characteristics can have on survey response propensities. Many studies explore how the personal characteristics of respondents, as well as how those characteristics interact with the characteristics of a survey, can affect the chance that a sampled person responds to a survey request (Groves, Singer, and Corning, 2000) and how different types of respondents perceive the burden of a survey request (Haraldsen 2004; Hedlin, et al. 2005; Wenemark, et al. 2010;

Fricker 2016). Age, education, income, familiarity with and frequency of internet use, household size, community involvement, a person’s previous history with and attitudes toward surveys, and the perceived usefulness of the specific survey request can all impact response rates (Jones 1979;

McCarthy and Beckler 2000; Hedlin, et al. 2005; Fricker 2016; Tortora 2017).

In preparation for the 2020 decennial census, the Census Bureau compiled a list of known factors that lower response rates for individuals (see Evans, et al. 2019). These risk factors include renter status (Erdman and Bates, 2017; Letourneau, 2012); female-led households with non-married women who live with another adult (Erdman and Bates 2017); households with children between the ages of 0 and 4 (Erdman and Bates 2017); households with income less than $35,000 per year (Erdman and Bates 2017; Joshipura, 2008); people with less than a high school education (Joshipura 2008; Nichols, et al. 2015); racialized

Non-White minority groups (Letourneau, 2012); large households with more than four people

(Letourneau 2012); and people living in multi-unit or mobile homes (Erdman and Bates 2017). The

Census Bureau has conducted considerable research to understand the mindsets of potential respondents to Census Bureau requests (see Macro 2009; Conrey, ZuWallack, and Locke 2012;

Hagedorn and Green 2014; Hagedorn, Green and Rosenblatt 2014; Reingold 2014a;

Tourangeau, et al. 2014). Increased distrust of the U.S. government may also be a factor in gaining response to government survey requests. Also, households and individuals have been inundated by an increase in survey requests, both legitimate requests from government agencies and private companies and illegitimate scams. People in certain segments of the population are surveyed more than others and can have varying levels of “survey fatigue” (Groves and Couper 1998; Olson 2014).

Survey organizations cannot change the demographic characteristics of potential respondents. It is also highly difficult to change a potential respondent’s attitude towards surveys, the U.S. government, and government-run surveys. However, understanding the mindset of potential respondents can help survey designers craft messaging that is more strategic. For example, households that immediately respond to the ACS are removed from receiving future mailings. This saves the Census Bureau money and also reduces burden on respondents by not sending mailings they do not need. Adapting the messaging in later mailings to reflect the mindset of reluctant respondents who do not respond to early mailings may help boost response and gain compliance from difficult-to-survey populations. Also, while survey fatigue is an increasing phenomenon, the Census Bureau can craft messaging to set the ACS apart from other, less important survey requests a household may receive. Without changing the content of a survey, the way that a survey request is presented impacts how potential respondents perceive the value of the survey and their choice to participate.

Table 1 provides an overview of how messaging, the focus of this dissertation, intersects with each of the factors that impact survey response rates.


Table 1. How factors of survey response intersect with messaging

Aspect: Intersection with messaging

Mode: How response modes are offered and communicated

Sponsorship: How Census Bureau sponsorship is communicated

Task-Burden: Reducing the perception of burden through messaging

Incentives: Communicating that responding to the ACS is required by law

Structure of Request: Format, number, and timing of mailings

Communication Content: All aspects of communicated content

Respondent Attributes: Leveraging information on the mindset of participants to change messaging to increase response

Conclusion: Why focus on messaging?

Survey messaging is important to address in this dissertation for three reasons.

First, improving messaging can directly reduce one of the four sources of survey error: nonresponse error. By making improvements to messaging, it is possible that recruitment materials may do a better job of recruiting more households, and of recruiting them more evenly across relevant demographics. By increasing trust in the survey request, respondents may answer surveys more truthfully, potentially reducing measurement error as well.

Second, survey messaging is perhaps the most underdeveloped area of research on the factors that impact survey response (Dillman, 2019b). While knowledge exists about individual messages that may impact response, very little is known about how to stage that messaging across multiple mail contacts (Dillman, Smyth, and Christian 2014; Dillman 2019b). Most large mixed-mode survey operations utilize a multiple-mailing contact strategy for recruitment, so it

is surprising how little is known about the best messaging to communicate in survey recruitment materials and how to stage messaging across multiple contacts.

Third, as shown in the review of literature in this chapter, messaging intersects with nearly all seven factors that impact survey response. Some factors that impact survey response, such as adding new modes of contact or new modes of response, are expensive to change. Messaging can be studied more quickly through the experimental design methodologies deployed regularly in survey research, and specifically through the Methods Panel testing program at the Census

Bureau. Improving messaging can be a more cost-effective and timely way to impact the ACS than attempting larger-scale, programmatic changes to the mail-out methodology.

This chapter has presented seven factors that potentially impact survey response. It would not be possible to address each factor in a single project. By setting scope conditions for this project to focus on messaging, I have built an argument that messaging is a pragmatic and useful way to potentially improve response rates and reduce nonresponse bias. By focusing on messaging, this dissertation will contribute to a growing body of literature on issues of survey communication so that the ACS and other government and non-government surveys can increase response rates in an increasingly difficult survey environment. Before looking ahead to make recommendations to improve the messaging in ACS mail communication materials, we must first understand the messaging that is currently communicated. In Chapter Three, I present details on the current ACS mail communication strategy, arguing that the messaging in ACS communications is insufficient and necessitates a thorough review.

Chapter III. The current ACS mail contact methodology

In this chapter, I provide an overview of the current ACS mail communication strategy. Doing so provides necessary context for the analysis and proposed recommendations to improve ACS messaging. Detail is also provided on recent research that has led to changes in the ACS production methodology and communication materials. I argue that while the ACS has conducted numerous tests to improve the overall ACS mail communication strategy, little of this research has focused on messaging. Moreover, when messaging has been the focus of research, changes to ACS communication materials have been slow, and the process of incremental, path-dependent innovation may have led to less-than-optimal messaging. This process has made the ACS mail communication materials incongruent and potentially insufficient to optimize response rates and reduce nonresponse bias. My review also shows that the Census Bureau has not conducted a thorough review of ACS recruitment materials, which I aim to provide in this dissertation.

Current ACS Methodology

The five mailings sent during the self-response period are sent to mailing addresses sampled from the Census Bureau Master Address File (MAF), a file comprising complete address information including a house number and street name (for city addresses) or a rural-route and box number (for rural addresses) and a ZIP code (for all addresses). Until recently, complete residential address lists that provided coverage of the general population from a single frame did not exist (Harter, et al. 2016). However, since the mid-2000s, Address Based Sampling (ABS) has been used successfully to produce high-coverage samples that can lead to significant increases in response rates, and it has become a gold standard for frames for

large, national surveys (Link, et al. 2008; Brick, Williams, and Montaquila 2011; Brick, et al.

2012; Williams, et al. 2014; Amaya, et al. 2015; Dillman 2017a).

ACS data collection begins with a series of web-push contacts in which the first two contacts ask only for an internet response. This strategy builds upon strategies developed from a series of experiments conducted by the Washington State University Social and Economic

Sciences Research Center (see Smyth, et al. 2010; Dillman, et al. 2010; Messer and Dillman

2011; Messer 2012; Edwards, Dillman, and Smyth 2014; Dillman 2017a). The goal of this internet-first push is to reduce the cost of survey operations by obtaining responses quickly and without having to send, collect, and process paper responses. Because not all people can, or want to, reply online, a paper questionnaire is sent later in the self-response period.

One main factor that dictates the mailout strategy is that the ACS is sent monthly to roughly 295,000 addresses. Due to functional limitations, all mail-out operations for a single month of data collection must occur during a six-week period. For example, for ACS data collected via self-response in the month of April, the first recruitment letter is mailed at the end of March, follow-up mailings are sent throughout the month of April, and the final reminder is sent in May. Households that respond to the ACS are removed from the address list at two points in the mailout process. One of the two cuts to the list of addresses occurs after the second mailing and prior to sending the paper questionnaire, preventing early internet responders from receiving a costly paper survey mail package. A second cut removes respondents prior to the final reminder contact. These cuts reduce confusion and burden on households that have responded and also save the Census Bureau money on mailing materials to households that have already replied. This mail contact strategy and the mail pieces

contained in each mailing are outlined in Figure 1.14

Figure 1. Current ACS production mailout strategy

After the fifth mailing, respondents are removed again prior to the start of a Non-

Response Follow-Up (NRFU) operation that sends Census Bureau interviewers to some nonresponding households. These operations are controlled by the six Census Bureau regional offices, and NRFU methodology is not a topic of this dissertation. However, it is critical to note that all mailing operations to recruit self-response for the ACS occur prior to this nonresponse operation. This operation is costly, but critically important to the data quality of the ACS.

Driving response during the self-response period is extremely important to the Census Bureau to reduce costs by reducing NRFU workloads.

Mailing 1

This first mailing package includes five items. First, everything is contained in a larger-than-standard 11 ½ x 6 inch outgoing envelope (see Figure 2). This envelope has a cutout window where the address and user ID are printed through onto a second item, a 10 ¾ x 5 5/8 inch card-stock insert (see Figure 3 and Figure 4). Along with the respondent’s address and a 10-

14 Each month has a different number of days, and weekends fall differently within each month, so the exact timing of the mailings can vary month to month.

character alpha-numeric user ID, this card contains instructions on how to respond online and the address of the ACS response website. This approach allows respondents to throw away the envelope without losing the ID number needed to reply online. The back of the card includes information in Spanish.

The third item is a standard 8 ½ x 11 inch invitation letter to the survey that also includes instructions to respond online (see Figure 5). The fourth and fifth items are informational brochures: a quad-fold “Frequently Asked Questions” (FAQ) brochure (3 ½ x 8 ½ inch folded, 14 x 8 ½ inch unfolded) and a tri-fold Multilingual Brochure (MLB) (4 ¾ x 8 ½ inch folded, 13 ¼ inch unfolded) with information in six different languages (see Figure 6 and Figure 7, respectively).

Figure 2. Mailing 1: Outgoing envelope

Figure 3. Mailing 1: Instruction card (front)

Figure 4. Mailing 1: Instruction card (back)

Figure 5. Mailing 1: Invitation letter

Figure 6. Mailing 1: Multilingual brochure

Figure 7. Mailing 1: Frequently Asked Questions (FAQ) brochure

Mailing 2

Seven days after the first mailing, the same 295,000 addresses are mailed an 8 ½ x 5 ½ inch bi-fold pressure-seal mailer (8 ½ x 11 when unfolded) containing a reminder letter to complete the

ACS online or wait for a paper questionnaire to arrive (see Figure 8 and Figure 9). All sample members receive the first and second mailings. Two weeks after the second mailing is sent, respondents to the ACS are removed from the mailing list.

Figure 8. Mailing 2: Bi-fold pressure seal mailer (outside, without fold lines)

Figure 9. Mailing 2: Bi-fold pressure seal mailer (inside, without fold lines)

Mailing 3

Two weeks after the second mailing is sent and responders are removed from the mailable universe, a third mailing containing six mail pieces is sent to the remaining nonresponding households. First, an 11 ½ x 6 inch outgoing envelope with a cut-out window (the same size used in Mailing 1) contains the mail pieces (see Figure 10). The address is printed through this window directly onto a 28-page ACS paper survey questionnaire on 10 ¼ x 10 ½ inch paper (see Figure

13 for the front and Figure 14 for the back). The package also includes an 8 ½ x 11 inch letter indicating that sampled households have a choice in how they respond to the ACS: by the enclosed paper questionnaire or online (see Figure 12), a replacement FAQ brochure, and a pre-paid return envelope (see Figure 15 and Figure 16, respectively). A 10 ¾ x 5 5/8 inch card-stock insert is also included in this mail package to reinforce the response options to the survey and provide instructions in Spanish (see Figure 11).15

Figure 10. Mailing 3: Outgoing envelope

15 Unlike the instruction card in the first mailing, this card does not contain the mailing address. In this mail package, the address label is printed on the survey form through the cutout window in the outgoing envelope.


Figure 11. Mailing 3: Instruction card (front and back)

Figure 12. Mailing 3: Letter

Figure 13. Mailing 3: Paper survey form (front)

Figure 14. Mailing 3: Paper survey form (back)

Figure 15. Mailing 3: Frequently Asked Questions (FAQ) brochure

Figure 16. Mailing 3: Pre-paid return envelope

Mailing 4

Four days after the third mail package containing the paper questionnaire is sent, a 6 x 4 ¼ inch reminder postcard is delivered to the same universe of remaining nonresponding households (see Figure 17).

This four-day window is deliberately close to the mailing of the third mail package because the postcard reminds respondents that they can reply by paper and by internet. The thought is that four days (rather than a longer delay) between mailings may ensure that households still have the third mail package, with the paper questionnaire, in their homes. Approximately 14 days after the fourth mailing is sent, responders to the ACS are removed from the universe of sampled households.

Figure 17. Mailing 4: Reminder postcard

Mailing 5

The 5th mailing, sent to remaining non-responders, is a bi-fold pressure-seal letter reminding respondents to complete the ACS. This is the same size as the PSM sent in Mailing 2 (8 ½ x 5 ½ inch folded, 8 ½ x 11 unfolded). For the first time, this letter mentions that respondents

can complete the survey by telephone (see Figure 18 and Figure 19). This 5th mailing is the final mailing in the self-response contact period.

Figure 18. Mailing 5: Bi-fold pressure seal mailer (outside)

Figure 19. Mailing 5: Bi-fold pressure seal mailer (inside)

The five mailings are spaced out to respondents over a thirty-nine-day period, on average, depending on the month. For logistical reasons, it is critical that these mailings are all

sent within this time frame. The ACS is a monthly survey, and immediately after the mailout for a month is complete, the Census Bureau National Processing Center begins mail-out procedures for the next month. It is also critical for the self-response contact period to end on time because twenty-one days after the 5th mailing, a sample of nonresponding households is placed into the nonresponse follow-up universe.16 Table 2 provides a summary of the mail pieces sent and the timing of each mailing.

Table 2. September 2018 ACS Mail Contact Strategy (English)

Mailing 1 (Day 0): Instruction card (internet); Introduction letter; Multilingual brochure; Frequently Asked Questions (FAQ) brochure; Initial mailing outgoing envelope

Mailing 2 (Day 7): Reminder letter (PSM)

Mailing 3 (Day 21): ACS questionnaire; Instruction card (choice); Follow-up letter; FAQ brochure; Incoming envelope; Outgoing envelope

Mailing 4 (Day 25): Reminder postcard

Mailing 5 (Day 39): Final reminder (PSM)
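To make the timing in Table 2 concrete, the minimal sketch below (written in Python with illustrative names; it is not drawn from the Census Bureau’s production systems) encodes the five mailing offsets and the two points at which responding addresses are cut from the mailing universe, as described earlier in this chapter.

    from datetime import date, timedelta

    # Day offsets for the five self-response mailings, taken from Table 2.
    MAILINGS = [
        (0,  "Mailing 1: internet instruction card, letter, brochures"),
        (7,  "Mailing 2: pressure-seal reminder letter"),
        (21, "Mailing 3: paper questionnaire package"),
        (25, "Mailing 4: reminder postcard"),
        (39, "Mailing 5: final pressure-seal reminder"),
    ]

    # Responding addresses are removed twice: before Mailing 3 (so early internet
    # responders do not receive the costly paper package) and before Mailing 5.
    CUT_BEFORE = {21, 39}

    def build_schedule(first_mail_date, sampled_addresses, has_responded):
        """Return a (date, description, addresses) tuple for each mailing.

        has_responded(address, as_of_date) stands in for a lookup against
        response-processing systems and is assumed for this illustration.
        """
        universe = list(sampled_addresses)
        schedule = []
        for offset, description in MAILINGS:
            mail_date = first_mail_date + timedelta(days=offset)
            if offset in CUT_BEFORE:
                universe = [a for a in universe if not has_responded(a, mail_date)]
            schedule.append((mail_date, description, list(universe)))
        return schedule

    # Example: an April data-collection month whose first letter is mailed at the
    # end of March; one of two hypothetical addresses responds online in early
    # April and is therefore dropped before Mailings 3 and 5.
    demo = build_schedule(
        date(2018, 3, 26),
        ["address-1", "address-2"],
        lambda addr, as_of: addr == "address-1" and as_of >= date(2018, 4, 5),
    )
    for mail_date, description, universe in demo:
        print(mail_date, description, "->", len(universe), "addresses remain")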

In 2017, the Census Bureau ran a comparative test of this web-push methodology versus a “choice” methodology. In the choice methodology, households were provided a pre-

16 Housing unit addresses in the NRFU universe may receive additional communications from the Census Bureau, including telephone calls, mailings, and personal visits from Census Bureau interviewers. This dissertation does not evaluate the materials sent during the follow-up operation.

notice announcement in the first mailing, followed four days later by an invitation to take the

ACS by internet or by paper. The hypothesis was that the “web-push” methodology would work best in most locations, but that the choice methodology might work better in locations where demographic and respondent characteristics predict that households are more likely to want, and to respond by, the paper survey mode. By implementing an “adaptive” methodology, the Census Bureau could send different mailout strategies to different locations, providing the paper survey early in some locations (for example, rural areas with low internet coverage) and the web-push mailings in other locations (for example, suburban and urban areas with higher rates of internet coverage). Had the test been successful, the Census Bureau could have improved response rates and reduced costs while decreasing burden on these respondents by providing a paper questionnaire earlier to households more likely to want or need one. However, the test showed that the web-push methodology outperformed the choice methodology, producing significantly higher overall response rates. The choice methodology did not increase response rates in areas identified as preferring paper, but it did increase the proportion of respondents who responded by paper. Adopting an adaptive methodology would also have significantly increased costs to the ACS program (Longsine and Risley, 2019).
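As a rough sketch of what the “adaptive” assignment described above might look like, the hypothetical rule below assigns a mailout strategy from area characteristics; the predictors, thresholds, and field names are invented for illustration and are not the criteria the Census Bureau actually tested.

    # Hypothetical adaptive assignment of a mailout strategy by area
    # characteristics; thresholds and field names are illustrative only.
    def assign_strategy(area):
        """Return "choice" for areas predicted to prefer paper, else "web-push"."""
        likely_paper = area["internet_coverage"] < 0.60 or area["median_age"] > 65
        return "choice" if likely_paper else "web-push"

    areas = [
        {"name": "rural tract", "internet_coverage": 0.45, "median_age": 52},
        {"name": "urban tract", "internet_coverage": 0.90, "median_age": 34},
    ]
    for a in areas:
        print(a["name"], "->", assign_strategy(a))

As the test results above indicate, however, the choice strategy did not improve response even in the areas predicted to prefer paper, which is part of why an adaptive design was not adopted.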

Overview of ACS research and innovations that impact messaging

I propose that the current ACS mail communication materials outlined above require a thorough review and potentially a full redesign. To justify this thorough review, I provide in this section a brief history of innovation in the ACS mail-contact methodology since 2005, focused on messaging in the mail communication materials. This review shows that the messaging in the ACS materials has undergone sporadic, incremental changes that may have led to problems in the current

ACS messaging materials.

Early Years of ACS Testing

During the first decade of the ACS, the Census Bureau devoted resources to improving many aspects of the ACS methodology. These efforts primarily focused on three tasks: 1) effectively launching the ACS and implementing the initial data collection methodology, 2) mitigating the impact of the decennial census in 2010, and 3) adding the internet response option in 2013.

Each is covered below.

Launching the ACS

To launch the ACS, the Census Bureau had to create a mail communication strategy to recruit participants. The decision was made to use an Address Based Sampling (ABS) frame to send a series of mail communications to households. An initial contact methodology, i.e., the materials and messaging sent and the timing between mail contacts, was tested and created (see

Landreth 2003) based on the contact methodology used during the 2000 decennial census, which had been developed and tested in the 1990s (Dillman 2000).

In this original methodology (shown in Figure 20), the Census Bureau sent all sampled housing unit addresses a prenotice letter to announce that the household had been selected to participate in a survey and to provide information on the ACS. Sending a prenotice prior to a mailing containing a paper survey instrument had been shown to be an effective way to boost response (Fox, Crask, and Kim 1988; Yammarino, Skinner, and Childers 1991; Dillman 1991;

Dillman, Clark, and Sinclair 1995; Raglin, et al. 2004). After the prenotice letter, sampled housing unit addresses were sent a mailing containing the paper questionnaire form, an FAQ brochure, and an instruction booklet with details on how to take the survey. Shortly after this

mail package was mailed, all sampled housing unit addresses received a reminder postcard to complete and mail back the survey form. Before the next mailing, all addresses that responded to the ACS were removed from the mailing universe. A fourth mailing was then sent that included a replacement questionnaire, a different letter, and the same supporting materials as the first mail package (FAQ brochure and instruction booklet). Sampled addresses that had yet to respond were then eligible to enter the CATI (Computer Assisted Telephone Interview) operation and receive a phone call (if a phone number was available) to complete their survey over the phone with a trained ACS enumerator. After attempts were made to reach households by phone, the Census Bureau would send interviewers to sampled addresses to complete the survey in person during a CAPI (Computer Assisted Personal Interview) operation (Tancreto and

Poehler 2015).

Figure 20. Initial ACS mail contact methodology

This initial contact methodology became the baseline for all future ACS mail-out methodologies, as all subsequent innovations were based on changes to this initial plan. During the early years of the ACS, the Census Bureau did not devote resources to researching the messaging contained in ACS mail materials. The ACS operation was new, and the Census Bureau was

primarily concerned with producing data products and refining aspects of the ACS data collection operation, such as revising CAPI scripts and optimizing the ACS questionnaire form

(e.g. Gerber and Dusch 2007).

However, research conducted by the Census Bureau around this time found that one message was especially important for the success of Census Bureau requests. A boxed statement on the outgoing envelope that read “U.S. Census Form Enclosed: Your response is required by law” increased response rates by 8-10 percentage points above other response-inducing factors such as multiple contacts (Dillman, et al. 1996). Later research confirmed its importance for the ACS (Raglin, et al. 2004; Griffin, et al. 2004).

Mitigating the impact of the 2010 Census

With the ACS survey operation established, the Census Bureau was able to devote resources to facing new challenges. The first challenge, and potential opportunity, was to decide how to mitigate the impact of the 2010 decennial census. To increase response rates to the decennial census, the Census Bureau launched a robust, nationwide advertising campaign that even included a commercial during the National Football League Super Bowl, the most watched television event in the country. This campaign did not mention the ACS, yet over 2 million households would receive mailings to complete the ACS during the decennial census data collection period. The presence of the decennial census conducted by the Census Bureau could increase response rates in the months prior to the decennial census data collection if the ACS was mistaken for the decennial census. It could also lower response rates during the months of simultaneous data collection if people thought completing the decennial census absolved them of their duty to complete the ACS. There was a fear that the ACS request could be seen as a

scam because it did not look like the advertised census. Either way, the Census Bureau was concerned that ACS sampled households could be confused about which form or forms they needed to complete.

Research prior to the 2010 decennial census focused on how to develop mail materials that communicated that households needed to respond to the ACS as well as the decennial census. The recommendations that came out of this research included adding new text on the

ACS envelopes and letters to distinguish the ACS communications from the decennial census.

Other ideas, such as using color to differentiate the two data collection efforts, were not found to increase response and were not implemented (Chesnut and Davis 2011; Schwede 2013).

Despite the Census Bureau’s best efforts, the hypothesized effects of the decennial census were felt by the ACS program (Baumgardner 2013; Baumgardner 2017). The ACS received an initial substantial bump in response in the months prior to the census, likely as households mistook the ACS for the well-publicized and well-known census. Conversely, response rates decreased in the months during and immediately after the census (Chesnut and Davis 2011; Schwede 2013;

Baumgardner 2017).

While this research addressed messaging in the ACS mail communications, it was specific to a defined and temporary problem of mitigating the impact of the 2010 census. None of the changes implemented to differentiate the ACS from the census were applicable after the decennial census data collection operation was complete. The messaging in the ACS mail materials returned to normal after the decennial census operation concluded.

Two prominent, permanent changes did come out of research during this period, unrelated to the 2010 decennial census. First, inspired by concerns about increasing participation

in the census from non-English-speaking households, the ACS added a multilingual brochure with messaging in five non-English languages (Pan, et al. 2008; Bates and Pan 2009).17 This brochure was added to the prenotice letter package, providing a way for non-English speakers to take the survey over the phone with someone speaking their native language. This item was not shown to significantly increase response rates but was added permanently because it made the ACS more inclusive (Joshipura, et al. 2010).

The second change was the addition of a reminder postcard sent to some nonresponding households after the replacement survey package, rather than sending those households directly to the CATI data collection operation (Chesnut 2010; Schwede 2008). This additional contact was found to boost response enough to justify the additional cost of a 5th mailing by preventing households from entering the later and more expensive CAPI data collection operations. This postcard was sent only to a sample of addresses for which a phone number was not known. Households with identifiable phone numbers from third-party vendor lists still entered the CATI operation. This changed the mailout strategy, as shown in Figure 21.

17 All messages in the multilingual brochure are written in English, Spanish, Chinese, Vietnamese, Russian, and Korean.

Figure 21. ACS mail contact methodology update with additional postcard reminder

Adding an Internet response option and pushing respondents to the Web

The most substantial change to the ACS since its inception came when the Census Bureau added an internet self-response instrument to the ACS methodology. This was done to increase self-response rates and reduce costs at a time when response rates for paper self-response by mail were declining (see Chesnut 2010; Tancreto, et al. 2012). In addition to creating a web version of the ACS (see Tancreto, Davis, and Zelenak 2012), adding the option to respond online had an impact on all ACS operations, from how to calculate response rates to new data merging procedures and explorations into mode bias (see Baumgardner, Griffin, and Raglin 2014; U.S. Census Bureau 2014). Relevant to this research, adding a new data collection mode necessitated changes to the mailing methodology and messaging used to recruit households into the ACS (Ruiter, et al. 2012).

Residents of sampled households previously had two ways to respond before CATI and CAPI follow-up operations began. They could complete the paper questionnaire mailed to their house, or, if they called the ACS assistance phone number, they may have been offered the option to take the survey over the phone (Nichols, Horitz, and Tancreto 2015). Online responses are less costly than paper responses, due to reduced printing, assembly, mailing, and processing costs, and less costly than the interviewer-assisted CATI and CAPI modes, due to reduced labor costs. The Census Bureau needed to find the best way to “push” respondents to respond by internet to maximize the cost savings provided by implementing the internet response option. It was clear from previous research both inside and outside of the Census Bureau that if people were given a choice of responding by web versus mail, the vast majority would respond by mail and overall response rates would not be improved (Gentry 2008; Gentry and Good 2008; Smyth, et al. 2010; Medway and Fulton 2012), thus limiting the potential cost savings of implementing an internet data collection operation. Another study showed that providing a choice of response mode could improve response, but only when a mail contact was augmented by an internet contact (email) to remove barriers to response and increase trust in the survey request (Millar and Dillman 2011). Inasmuch as email could not be used to follow up on the ACS, these results also argued against offering a choice of response mode in the initial mailings.

Rather than providing sampled households the choice of responding by web or paper, mail communications were designed to offer response modes sequentially, holding back the offer of responding by paper survey form until the third contact. This sequential ordering of response options, which starts with an internet option and adds other response options later, is now known as a “web-push” methodology. The main purpose of this methodology is to “push” respondents to the web early in the survey process. While evidence suggested that mail-only methods received higher response rates than “web-push” methods (Messer and Dillman 2011), keeping the ACS as a paper-only survey would prevent the Census Bureau from realizing the substantial cost savings of moving responses to the web from paper, CATI, or CAPI and from getting returns quickly (Tancreto, et al. 2012; Griffin 2013; Griffin 2014). As internet response becomes more available and trusted, the cost savings and response rates from web-push methodologies are likely to increase (Bandilla, Couper, and Kaczmirek 2014; Dillman 2017a; Dillman 2017b). In 2013, the Census Bureau implemented a web-push methodology to incorporate the internet response mode into the ACS, shown in Figure 22.

Figure 22. ACS mail contact methodology adding the internet response option

The initial ACS web-push mailing strategy fit web-push communications into the existing processes the ACS was already using. This included sending a prenotice contact to announce the survey request, followed by a web-push invitation letter and a reminder postcard that offered the internet as the only response mode (and noted that a paper survey would arrive later). A paper questionnaire that provided a choice of both paper and internet response modes was sent in the fourth mailing. The methodology included additional reminder postcards after the mail questionnaire to promote self-response by either mode prior to the more expensive CATI and CAPI operations (Matthews, et al. 2012).

Another change to the methodology was the creation of two new mail items. The first was a card-stock item, added to the initial mailing, that provided a user ID and information about responding online. A similar card-stock item was added to the survey questionnaire package that included the user ID to respond online but also highlighted the choice to complete the survey with the paper questionnaire. These items were seen as necessary to provide the information needed to reply online, though testing was not conducted to see if these mail items increased response rates or pushed households to the web (see Tancreto, et al. 2012; Matthews, et al. 2012).

Providing a paper questionnaire requires at least two weeks for respondents to receive the paper questionnaire, time to take the survey, and time for the survey to be mailed back to the Census Bureau NPC and processed. With additional web-push mailings, it was more difficult to send a second paper questionnaire within the timeframe of the ACS mailing schedule (Matthews, et al. 2012). Despite evidence from survey methodologists that providing a replacement questionnaire was effective in boosting response (see Dillman, Smyth, and Christian 2009), the replacement paper questionnaire was removed and the timing between mailings was altered (Clark, et al. 2015a). Households that did not respond by internet would receive a single paper survey form or could respond with assisted modes during the CATI and CAPI operations.

Recent research, innovation and potential problems

Since implementing the internet as a response option in 2013, the Census Bureau has conducted research to improve the web-push methodology to maintain response rates, improve speed of response, reduce respondent burden, and reduce survey costs.

The removal of unnecessary mail pieces led to unintended consequences

In 2014, the Census Bureau sought external advice on how to best improve the ACS program. One recommendation provided was to remove unnecessary mail pieces from the ACS contact methodology. Including too many materials in a mail package can increase burden on respondents by giving them too much information to process (Dillman 2014; Dillman 2016). The rationale for this recommendation is that the more pieces a person has to process, the greater the likelihood that responding will be delayed or not carried out. Also, seven items now had to be coalesced for insertion into the questionnaire replacement mailing.18 The Census Bureau removed the instruction booklet from the paper questionnaire package, as testing showed removing this item would not harm response and would save money (Clark, et al. 2015a).

No subsequent changes were made to other materials or messaging when this item was removed. If all items in a communication are integrated and purposeful, dropping an item should necessitate a more thorough review of messaging across mail pieces. More troubling, this test found that removing the card-stock instruction choice card from the paper questionnaire mailing package would save money and not decrease response rates. However, this recommendation was not implemented, and the card remains in production at the time of this analysis.

Another item removed from the ACS methodology was the prenotice mailing. Survey research had consistently found that a prenotice mailing was a cost-effective way to boost self-response (see Dillman, Smyth, and Christian 2009) and that the mailing was useful for the ACS (Raglin, et al. 2004). However, this mailing may not be necessary in web-push methodologies (Dillman, Smyth, and Christian 2014). Rather than announce an upcoming survey, the initial contact could immediately provide sampled households the option to respond online without decreasing response (Murphy and Roberts 2014; Clark, et al. 2015b). The Census Bureau decided to drop the prenotice from the ACS methodology, combining the first two mailings into a new initial mailing package.

18 This mailing included: 1) outgoing envelope, 2) letter, 3) instruction card, 4) paper survey form, 5) prepaid return envelope, 6) instruction booklet, and 7) FAQ brochure.

Dropping the prenotice may have had unintended consequences. In the original methodology, the prenotice mailing included only a letter and a multilingual brochure. Sent separately, the second mailing included a letter, FAQ brochure, and card-stock web-push instruction card. When the prenotice was dropped, the multilingual brochure was added to the materials sent in the second mailing, resulting in a mailing with five items: 1) an envelope, 2) letter, 3) FAQ brochure, 4) multilingual brochure, and 5) web-push instruction card. While the same items are sent in one mailing rather than two, this new introduction mailing goes against recommendations to limit the number of items in a single mailing. Recent evidence suggests that including informational brochures in an initial survey contact is not effective (Dirksz 2018).

The impact of an initial mailing with five items was not mentioned in the report that recommended dropping the prenotice mailing (Clark, et al. 2015b). This research also recommended changing the reminder postcard after the initial mailing to a reminder letter, because letters can contain the internet user IDs necessary for households to respond online while postcards cannot. The change is shown in Figure 23.


Figure 23. Impact of dropping the ACS prenotice: combining two mailings into a new initial mail package (panels show the original mailing sequence and the new mailing sequence)

CATI operation cancelled without changes to messaging

Another procedural change was adopted in October of 2017, when the ACS decided to cancel its CATI data collection operation, resulting in the mailout strategy shown in Figure 24.

Figure 24. ACS mail contact methodology after dropping CATI

Despite efforts to salvage the CATI operation through innovations to the calling process (see Griffin and Hughes 2013; Hughes, et al. 2016; Mills 2016a; Mills 2016b; Mills 2017), analysis showed that the response rates for CATI had declined to the point that continuing the operation was no longer viable for the program (U.S. Census Bureau 2018). Cancelling the CATI operation meant that Census Bureau call centers would no longer call nonresponding households to recruit participants into the ACS. While outgoing CATI calls were cancelled, potential respondents would still be able to call the ACS helpline and take the survey over the phone with a trained Census Bureau interviewer. Even though demographic analysis suggested that representation in the ACS respondent population would not suffer from the cancellation of the CATI operation, some respondents prefer to take surveys with an enumerator (Bowling 2005), and forcing someone to respond in a non-preferred mode is not good survey practice (Dillman, Smyth, and Christian 2014). With the end of the CATI operation, it is surprising that the Census Bureau did not choose to highlight the call-in option for taking the survey over the phone, which remains in place as a cost-effective alternative to in-person visits.

Pressure Seal Mailers – not implemented as tested

The ACS conducted an experiment to test the impact of switching some mailings from letters or postcards to new “Pressure Seal Mailers” (PSMs). PSMs are a relatively new mailing technology that uses pressure-activated glue and removable paper tabs to turn a folded piece of paper with a letter on the inside into its own self-contained envelope on the outside, as shown in Figure 25.

Figure 25. Example of a sealed tri-fold PSM

PSMs can provide functionality and cost-saving benefits compared to letters in envelopes and postcards. While postcards have been shown to provide a cost-effective boost to response (Chesnut 2010), they also have limitations. To complete the ACS online, respondents need a unique ID code, which U.S. Code Title 13 does not permit to be identifiable from the outside of a mail package. This prevents the ID from being placed on a postcard. However, a PSM, like a standard envelope, can contain the user ID inside an enclosure. The benefit of the PSM is that it can contain an ID at a lower cost than a letter inside an envelope.

Adding a PSM to the mail contact strategy could also provide a benefit by adding a new mailing format to the ACS methodology. Multiple mailings in a survey communication should all look like they come from the same organization or sender, but should also appear different from the outside so that a potential respondent does not think the content inside the mail package is something they have already seen before (Dillman, Smyth, and Christian 2014). Introducing a PSM into the ACS methodology could add variety and entice some households to open the mailings. PSMs are new and potentially eye-catching, and they often contain important information, for example, checks, refunds, grades, or official records documents. Sending the ACS survey request in a PSM may heighten the perception of importance and increase the chance that the content of the mailing is read and acted upon. However, PSMs are unproven in government survey settings.

The Census Bureau tested two sizes of PSMs – a tri-fold PSM the size of a piece of paper folded in thirds, and a bi-fold PSM the size of a piece of paper folded in half. ACS testing found that PSMs maintained response rates and may have pushed more responses to the internet (Risley, et al. 2017). Storage capacity and machine setting limitations at the Census Bureau National Processing Center made it easier to use a single PSM format, and not two, in a single survey operation. Based on this limitation, the Census Bureau chose not to directly adopt the treatment using two different PSMs (a tri-fold and a bi-fold) as tested. Rather than vary the PSM sizes in the mailout, the ACS methodology adopted two bi-fold PSMs sent in the second and fifth mailings. It should also be noted that the PSMs adopted in ACS production use a different font than the letters, brochures, and the survey form.

Mandatory Messaging testing – the slow path to innovation

Previous work showed that required surveys have higher response rates than non-required surveys, but the literature was unclear on how best to communicate this to sampled households (Dillman 2000; Phipps 2014). Because the Census Bureau received complaints about response to the ACS being required by law, there was concern that overstating that the ACS was a mandatory survey could be interpreted as hostile or threatening (Oliver, et al. 2018). The Census Bureau conducted research to see if a friendlier tone and new messaging that softened the mandatory language could maintain response rates. This test proved unsuccessful (Oliver, et al. 2018), a finding consistent with previous studies that found emphasizing mandatory messaging increases response rates (Barth, et al. 2016; Oliver, Risley, and Roberts 2016).

In each test, an aspect of ACS mandatory messaging was isolated and tested for impact, which is useful to determine how individual messaging elements work. However, this testing design had some drawbacks. First, the experiments did not consider all the messaging on legal obligation spread across multiple items: the FAQ brochure, the survey form cover, the multilingual brochure, and the instruction cards. Second, the changes tested were each a small variation to existing materials, and new ways to communicate mandatory status were not tested. Third, one test suffered from confounding factors: it tested some potentially useful changes to messaging alongside the softened mandatory language and failed to increase response, likely due to the softening of mandatory messages and not the other changes to messaging (Oliver, et al. 2018).

It may be possible to optimize mandatory messaging when the entire communication strategy is examined holistically, when mandatory messages are still highlighted and communicated, and when new ideas not based on current materials are considered and tested.19

19 Concurrent with this dissertation, the Census Bureau fielded another test, this time strengthening the mandatory messaging in addition to other ideas supported by this dissertation.

Graphics and format testing – more problems than solutions

The ACS methodology currently includes graphic elements on a few mail items (instruction cards, FAQ and multilingual brochures, and the cover of the paper survey form). The Census Bureau has made limited attempts to test additional mail materials that incorporate graphic elements. Each attempt has been unsuccessful, and additional graphics have not been incorporated into the ACS methodology. However, the failure of these tests may not mean that graphics are unsuitable for government surveys, but rather that the graphics tested were poorly designed or inadequately integrated into the mail contact strategy.

First, the ACS tested the use of color on mailing envelopes in preparation for the 2010 census to differentiate ACS mailings from census mailings. These envelopes used relief print and a marketing style to communicate that the survey was required by law and had the reverse effect, decreasing response rates compared to plain envelopes (Leslie 1996; Dillman, et al. 1996; Leslie 1997; Dillman 2000). Nearly a decade later, the Census Bureau revisited the idea of redesigning ACS envelopes with more graphics. Rather than a mailout test, this research showed participants pictures of envelopes during cognitive interviews and asked which they preferred and why (see Reingold 2014a; Reingold 2014b). The envelope that 95% of participants said they would save and read later was an envelope without heavy graphics, shown in Figure 26.

Figure 26. ACS cognitive testing envelope - plain

However, an envelope using heavy graphics and color was rated by 94% of respondents as one they would save and read later, as shown in Figure 27. This difference was not statistically significant (Reingold 2014b).

Figure 27. ACS cognitive testing envelope - graphic

This is not conclusive evidence that graphic envelopes are ineffective at inducing households to notice and open mailings. In fact, envelopes with graphic elements (such as color and pictures) on the back induced significantly more participants to claim they would save and read the mailing later. An envelope with color text and a logo on the back (see Figure 28) led 96% of respondents to claim they would save and read the letter later, and an envelope back with full graphics (see Figure 29) was only slightly worse, with 94% of respondents reporting they would read and open it later (Reingold 2014).

Figure 28. ACS cognitive testing envelope back - simple color

Figure 29. ACS cognitive testing envelope back - full graphics

Both envelopes produced significantly (p < .10) more reports of participants claiming to save and open later than a colorless envelope back with text and logos (86%) (see Figure 30) or a plain envelope back (78%) (not pictured) (Reingold 2014b).

Figure 30. ACS cognitive testing envelope back - colorless text

Despite this finding, the Census Bureau has not tested envelopes with messaging and graphics on the back. A graphic postcard (see Figure 31), however, did not produce positive results, inducing the fewest participants to say they would save and read it later (69%) compared to a plain postcard (77%), shown in Figure 32. Compared to the plain postcard, a postcard that used formatting without graphics performed the best, with 84% stating that they would save and read it later, as shown in Figure 33. All differences are statistically significant (p < 0.1) (Reingold 2014b).

Figure 31. ACS cognitive testing postcard - graphic

Figure 32. ACS cognitive testing postcard - plain

Figure 33. ACS cognitive testing postcard - formatted

The evidence on the effectiveness of graphics from this cognitive testing is not conclusive. Postcards appear susceptible to being seen as marketing materials if they incorporate heavy graphic elements, but applying formatting to the text of the postcard may be effective. However, the Census Bureau has not field-tested envelopes with graphic elements on the front or back, despite some evidence that, at least on the back of the envelope, graphic elements could be beneficial. The Census Bureau did test adding one element to the envelope: the inclusion of a box that says “OPEN IMMEDIATELY” in white type. This use of white text on a darker shaded box is known as relief or reverse print. This feature was not isolated in testing but is being implemented into ACS production in 2020. This is despite previous Census Bureau testing that found the use of reverse type on envelopes to be problematic (see Dillman, et al. 1996) and evidence that reverse print is more difficult to comprehend (Wheildon 2007). For these reasons, envelope print experts specifically recommend never including reverse or relief print on envelopes (envelope.org). Beyond the questionable use of reverse print, the phrase “open immediately” may also come across to some readers as “marketing” and non-governmental (Dillman, Smyth, and Christian 2014; Oliver, Heimel, and Schreiner 2017).

In a second test of graphics, the Census Bureau tested a full-color, graphic “Why We Ask” flyer to communicate to potential respondents why the questions on the ACS were being asked. This was in response to concern that if potential respondents do not know why the questions are being asked, they may be more skeptical or reluctant to respond (Dillman 2014). The tested flyer is shown in Figure 34 (front) and Figure 35 (back).

Figure 34. ACS “Why We Ask” brochure (front)

Figure 35. ACS “Why We Ask” brochure (back)

This flyer had no impact on self-response (Heimel, Barth, and Rabe 2016). The ineffectiveness of this item was primarily blamed on the use of graphics, but it may have had another cause. Along with graphics, there were perhaps too many messages, and it is possible that the graphics on this flyer could have appealed to participants if the messaging had been more limited. Simple graphics with simple wording may be more effective. The ACS is a long survey. Some respondents only realize this after they start the survey and, because they have started, they decide to finish. However, if potential respondents know before starting how many topics are going to be asked about in the survey, they may decide not to reply.

For every respondent who saw the “Why We Ask” flyer and was convinced to reply, another may have been scared away from responding because the flyer clearly communicated that the ACS would ask a lot of questions, increasing the sense of burden of the survey task. An additional problem with this flyer was that it was added as an extra insert alongside all of the existing mail materials. As stated, mailings with too many mail materials may burden potential respondents (Dillman 2014). Not only was this added as an additional mail piece, the item was stylistically very different from everything else mailed from the Census Bureau on behalf of the ACS. It was never designed to “fit” alongside the other pieces, nor were the other mail pieces redesigned to acknowledge or “fit” with the new graphic flyer.

Third, the Census Bureau tested mailing a physical data visualization tool known as a “data slide” to sampled households to communicate that the ACS was used to produce aggregate statistics (to reduce fears of completing the survey) and that the survey request was real (as a scam operation would be unlikely to create such a data tool). Figure 36 shows the unfolded design. When folded in half vertically, there would be a map and statistics on each side as well as a “pull tab” on the bottom. When the tab is pulled, data for all 50 U.S. states, Washington DC, and Puerto Rico could be seen through a cutout window in the card, with 26 data locations on each side of the card.

Figure 36. ACS data slide (unfolded)

Adding the data slide to the first or third mailing did not produce results that justified the cost of production (Heimel, et al. 2019).20 However, the data slide used in this test did not undergo usability or cognitive testing prior to field testing. For example, anecdotally, some people who saw the card were confused by the map displaying one statistic while the pull-tab displayed other statistics. In visual design, Gestalt grouping principles suggest that certain factors can lead elements in a visual design to be perceived as related. Visual elements that are close together, that share a common region in a visual space, that are connected through continuity, or that are within the same closed field can be seen as connected (see Palmer 1999; Hoffman 2004). Because the map shares the same visual field in close proximity to the displayed data, it is not surprising that the meaning of the statistics and the map could be conflated or connected; the visual design suggests this connection. Because these elements were in fact unrelated, the design should have visually separated them. The data slide may have had an unappealing or confusing design, diminishing any positive impact. Another problem, similar to the “Why We Ask” graphic flyer test, was that the data slide was added as an additional item to mailings that already included multiple items. Also, the data slide was not designed to “fit” with the other materials, nor were other materials redesigned to fit with the graphics on the data slide.

20 The inclusion of the data slide did not increase overall response rates significantly, but it did significantly push more respondents to the internet. While this created cost savings for the ACS program, it did not justify the cost of producing and mailing the data slide (see Heimel, et al. 2019).

My conclusion from this limited research is that the Census Bureau has not done a good job testing graphic elements in ACS mail materials. First, when graphic elements produced promising results in cognitive testing, these ideas were never incorporated into field tests. Lab testing is not the same as a field setting, but this study suggested that graphics on envelopes, especially on the back, and the use of formatting on postcards may be beneficial to having mailings noticed, opened, and read (Reingold 2014a; Reingold 2014b).

Second, previous efforts to incorporate graphic items into field tests had confounding, non-graphic-related problems that could have caused the failure of the tested graphic items. Some graphic elements contained too many words or were added to mail packages that contained too many materials (Heimel, Barth, and Rabe 2016; Heimel, et al. 2019). Most problematic is that in each case, the graphics tested were added to a single mail piece without a corresponding redesign of other mailing items. This may have made the graphic element stick out or fit poorly in the overall mailing strategy. A flashy infographic that does not share design elements with other mail materials may appear out of place, but if all materials are designed holistically to look and feel like part of a single package, some graphic elements may work.

Qualitative messaging research – findings ignored

The Census Bureau has conducted multiple qualitative studies on survey messaging and on the messaging contained in ACS mail communication materials (see Hagedorn and Green 2014; Hagedorn, Green, and Rosenblatt 2014; Reingold 2014a; Kovacs, Thorne, and Wolfinger 2014; Orrison 2014). The Census Bureau has also conducted research on messaging and the mindset of potential decennial census respondents that can be leveraged to improve the ACS (see Macro 2009; Conrey, ZuWallack, and Locke 2012). These studies produced a variety of useful recommendations for messaging, including connecting the ACS to the Census Bureau because it is widely known and trusted, communicating benefits of survey participation at the community level rather than at the national level, including a due date for the survey response, and communicating that ACS data are needed and important.

Census Bureau experiments with ACS mail communications have not attempted to operationalize many of these messages. It is difficult to incorporate a new recommendation from qualitative research into existing mail materials used in ACS production. The recommendations are often out of context and require additional work to determine how best to write and fit qualitatively tested messages into ACS mail materials. Moreover, materials using this new messaging then have to undergo cognitive and field testing prior to being implemented into ACS production. Thus, few recommendations from qualitative work have been tested and none had been implemented at the time of this research. Recent field tests of messaging have included tests that did not come from previous qualitative work, such as how to soften the language used to communicate that response to the ACS is mandatory (see Oliver, et al. 2018) and a modification to the communication of a mode choice (see Clark, et al. 2015a).

Conclusion: Problems with recent innovation to ACS mail communications

I have proposed that messaging in the ACS mail materials requires a thorough review and potentially a complete redesign. To make a case that this is necessary, I provided details on the current ACS mail contact methodology and a brief history of recent research and innovation that led to these current materials. Attempts have been made to test and improve individual elements of messaging in the ACS mailings. This process has led to many useful changes to ACS mail materials, such as the removal of unnecessary mail pieces, and the implementation of messaging to launch a new web-push methodology. However, this testing has had the following limitations:

1) Changes are path-dependent: All innovations to the ACS are implemented as changes from existing ACS materials. Path-dependent innovation limits the direction of research and the types of changes that can be made. Because current practices involve changes to existing messages, completely new messages that could benefit ACS response rates have not been tested.

2) Changes not implemented as tested and recommendations are sometimes ignored: Changes to ACS production materials do not always follow the exact materials tested. Adopting materials that do not follow those that were tested can lead to unintentional incongruencies in mail materials. In addition, some recommendations from qualitative research, expert reviews, and field experiments are ignored or not used in field testing.

3) Innovation is slow: It is possible that recommendations from qualitative work and expert reviews are not being ignored. The Census Bureau may want to test these ideas but may have limited resources to do so. This has led to innovation at a slower rate than the rate of incoming advice and recommendations. There is ample evidence of this pattern. When the ACS adopted a web-push methodology, it took over two years to test the removal of the unnecessary prenotice mailing. At the time of this project, the Census Bureau is conducting yet another test on mandatory messaging, this time strengthening the focus of the mandatory message, as well as a second, separate test to incorporate due dates into ACS mail materials. These tests are based on qualitative research and expert recommendations from 2014. It is possible that the Census Bureau intends to implement recommended changes but cannot accommodate them in a timely manner given the incremental and path-dependent nature of the ACS testing and implementation process.

4) Research has not focused on messaging: The Census Bureau has conducted a great deal of research on the ACS over the past decade. Until recently, little of it has been on messaging and graphics. There does not seem to be an overarching plan for developing and incorporating changes to messaging and graphics in the ACS mail materials. Overall, the plan for messaging across mail pieces seems to be an afterthought. Mailings and mail pieces are removed and moved around, and design elements and messaging are often tested without consideration of how messaging across the newly designed materials fits together as a cohesive messaging strategy.

5) Cumulative impact of incremental changes unknown: Attempts have been made to test individual elements of the ACS mail contact strategy, leading to a series of isolated changes. Implementing multiple revisions to the ACS mail materials may have created new problems, or gaps, in messaging. This review has shown that the Census Bureau has not conducted a review of the ACS mail communication materials to see if all changes work together in a way that optimizes response rates.

By reviewing over a decade of Census Bureau research on the ACS mailout methodology, I have noted multiple problems in the ACS innovation process. Due to these problems, I have made a case that the ACS mail communication materials require a thorough and rigorous analysis of their messaging to determine if they are communicating effectively and in a way that optimizes response rates.

To conduct a review of ACS messaging, a baseline for optimal messaging is required.

Because a unified theory of survey messaging does not exist, in the next chapter I build a list of recommendations for ACS messaging based on a multi-disciplinary review of literature relevant to survey recruitment messaging. This review will be used in subsequent chapters to conduct an analysis of current ACS messaging to determine if ACS mail communication materials follow current best practices to maximize the impact of mail communications on self-response rates. If the ACS materials and messaging do not follow best practice recommendations, a complete rewrite of the materials may be necessary.

Chapter IV. Developing recommendations for the ACS messaging

It is one thing to point out the lack of a cohesive messaging strategy for convincing people to answer the ACS, as just discussed, and quite another to propose what actions will create such a strategy. To propose the specifics of such a strategy, it is essential to provide a theoretical understanding of what factors are likely to influence response.

Scholars and survey practitioners have spent decades developing techniques to improve survey response rates. While individual elements of survey messaging have been tested, a single, cohesive theory on how to improve survey response through messaging does not exist (Dillman 2019a; Dillman 2019b). Theories and recommendations on survey response come from a wide variety of fields, with many survey practitioners in one field being unaware of theories in other disciplines or from other schools of thought. This chapter’s purpose is to add to the growing body of literature on survey messaging by drawing connections between literatures to develop a set of best practices for ACS survey messaging. This review will bring together research from sociology, survey methodology, communications, psychology, and the behavioral and social sciences to identify messaging that may be effective in persuading households to comply with the ACS survey request.

Understanding types of respondents from Census Research

At the time the ACS was being developed, there was limited research on the motivations and experiences of respondents to survey requests, and even less on the experience of non-respondents (Singer and Bossarte 2006). To fill this void, the Census Bureau conducted its own research before and after the 2010 decennial census to gain an understanding of the “mindsets” and “motivations” of the general survey population. A “mindset” refers to a general disposition that a potential respondent has toward a census request. “Motivation” refers to factors that could convince respondents with different mindsets to respond to a census request. The ACS shares the same survey population as the decennial census, and while a census is different from a sample survey, the mindsets of potential respondents and their motivations to participate may be similar. Two separate rounds of research, one before the 2010 census and one after, each involved over 4,000 interviews with potential census respondents (Macro 2009; Conrey, ZuWallack, and Locke 2012). The first round, prior to the 2010 census, identified five mindset categories labeled as follows (see Macro 2009):

Leading Edge: This mindset group was the most agreeable to responding to the census and comprised 26 percent of participants. This group is more affluent and more white (non-Hispanic) than the general population, and had higher rates of home ownership, longer tenures in homes, and higher rates of civic engagement, such as voting. Members of the Leading Edge are also more accurately informed of the purpose of the census and had higher levels of trust that the Census Bureau would protect their data than other potential respondents.

Head-Nodders: People labeled Head-Nodders are the largest mindset group, accounting for 41 percent of the population of potential respondents. Demographically, this group reflects the national average in race, education, and income. This group is characterized as having positive feelings toward the census, but not as positive as the Leading Edge. Persons in this group thought they were well informed on the purpose of the census, but their knowledge of the actual purpose was often incorrect. Despite this disconnect, members of this mindset group feel the census provides benefits to communities and to individuals. This group does not worry about the protection and security of data and trusts the Census Bureau with their response. Due to their positive outlook toward the census and their perceived understanding of its purpose, this group feels that responding is an obligation that they are willing to fulfill.

Unacquainted: Seven percent of the potential respondents were categorized into the Unacquainted mindset group. These potential respondents had never heard of the census or the Census Bureau. Researchers did not find it possible to rate their levels of trust in the census or their favorability toward how the Census Bureau would use their data. Demographics of this group show that they tend to be foreign-born, non-primary-English speakers with lower levels of education, income, and civic engagement than the national average.

Cynical Fifth: A group with a negative view toward the decennial census and the Census Bureau, called the Cynical Fifth, comprised 19 percent of the research participants. Demographically, this group is similar to the Head-Nodders, with race, education, and income rates that mirror the national average. Despite knowledge of the census and agreeing that it is best if everyone responds, this group was skeptical toward the purpose of the census. Members of this group feel that the census is an invasion of privacy and that the Census Bureau or other government entities could misuse their data.

Insulated: Six percent of respondents fell into the Insulated group; they had heard of the census but did not know its purpose. This group has income and education levels similarly low to the Unacquainted but did not have higher rates of foreign-born or non-English speakers. While the Census Bureau claimed to provide benefits to communities and the country, this group was distrusting of this message because such benefits could not be seen where they lived. This group also had the highest rate of distrust that their information would be kept confidential.

In the second round of research (see Conrey, ZuWallack, and Locke 2012), the Census Bureau assessed changes in mindsets after the 2010 census. Both projects used the same methodology to interview over 4,000 participants on their attitudes and views toward the census. The second study produced seven mindset groups that were labeled somewhat differently.

Government-Minded: The largest mindset group comprised 19 percent of the potential survey population and also held the most favorable attitudes toward the census. Members of this group tended to be more educated and generally accepted the role of the census, the role of the government, and the purpose of the census in supporting the functioning of the government. Trust in the census and in the government to protect data was higher than average.

Dutiful: The second group of respondents, the Dutiful, comprised 14 percent of the potential respondent universe. This mindset represents respondents who demographically reflect national averages in race, income, and education. They have a strong trust in the government and a sense of duty toward being counted in the census.

Compliant and Caring: The Compliant and Caring group comprised 15 percent of the population and was found to be agnostic toward the political purpose of the census to apportion members of Congress. However, this group was generally supportive of social programs that were supported by census data. In general, there were positive feelings toward the purpose of the census, even if members did not fully know or understand that purpose. A desire to complete the census came from feelings of compliance toward government requests, not from overwhelming support for or understanding of the purpose or mission of the census.

Local-Minded: This group comprises 12 percent of the potential respondent population. Demographically, it has lower than average educational attainment and higher proportions of minorities and renters. The Local-Minded category includes people who are agnostic toward the federal government but who are locally and civically active and trusting of local government.

Uninformed: These potential respondents accounted for 16 percent of the potential census population and are generally unaware of the purpose of the census and skeptical toward the idea that they will ever see an impact from the results of the census. Members of the Uninformed generally have lower rates of education and income than the national average, and a lower opinion of the government in general. This group's reluctance toward census participation stemmed from this low opinion of the government, not from fears of data security or distrust toward the census.

Suspicious: These respondents comprised 14 percent of potential respondents. Demographically similar to the Local-Minded mindset, this group represents disenfranchised and minoritized groups, a higher proportion of renters than owners, and people with lower levels of education. This group not only has the lowest rate of intent to respond to the census and the lowest awareness of the census purpose, but also feels that responding to the census could personally harm them if their data is stolen or used improperly.

Cynical: This mindset category consisted of 10 percent of the potential respondent universe. Cynical respondents do not like or trust the census or the federal government. It is difficult to compare the Cynical and Suspicious respondents on which may have greater negative feelings toward the census, as each group represents a different challenge to convert non-respondents to respondents.21

These two studies are useful in this research because the ACS shares the same universe of potential respondents and they provide information at two time points. The first study may provide information on the normal feelings of potential ACS respondents between census years. The second, more recent, study may include a “census effect” of increased positivity toward the census, the Census Bureau, and the role of the federal government in collecting demographic information, due to the decennial census advertising campaign or the impact of participating in the decennial census. The ACS operates continuously during the decennial census, so the mindsets of people at both timepoints cannot be ignored.

While the methodology used in each study was the same, the classification schemes and mindsets of potential respondents were similar with some notable differences. The first study classified 19 percent of respondents in the Cynical Fifth; the second study classified only 10 percent in a similar Cynical category, but also teased out 14 percent of potential respondents into the Suspicious mindset group. While changes in categorizations certainly impact the interpretation of these studies, some differences may be attributed to real changes in the attitudes of potential respondents, potentially due to a “census effect” (Conrey, ZuWallack, and Locke 2012). However, it was surprising to find that the percentage of people in the Uninformed mindset was larger after the decennial census (16%) than the similar Unacquainted mindset before the decennial census (7%) (Macro 2009; Conrey, ZuWallack, and Locke 2012).

21 An early version of this review can be found in Oliver, Heimel, and Schreiner (2017). I want to acknowledge the contribution of Sarah Heimel, who compiled this literature.

In both studies, the mindset groups that represent populations that may be averse to participation in the census comprise a significant portion of the potential respondent population. While 90% of people have heard of the census, only 11% have heard of the ACS (Hagedorn, Green, and Rosenblatt 2014).22 Because the ACS is a Census Bureau survey, these mindsets may be as hostile, or more hostile, toward the lesser-known ACS request. Both studies also identified sizeable portions of the population that are unaware, uninitiated, or uninterested in the census or its purpose. This was found even in the second study, conducted immediately after a nationwide advertising campaign for the decennial census. If you add up the potentially difficult-to-recruit groups from the first study, i.e., the Cynical Fifth, Unacquainted, and Insulated mindsets, and compare that to the similar groups from the second study, i.e., the Uninformed, Suspicious, and Cynical, the percentage of hard-to-reach potential respondents rose from 32% to 40% (Macro 2009; Conrey, ZuWallack, and Locke 2012). Communicating with these audiences is extremely important. With lower response propensities, making improvements in the response rate from these groups would significantly boost the overall response rate. Moreover, many of these mindset groups comprise persons from minoritized populations based on race and ethnicity as well as persons with lower education levels and renter status. Improving response rates from these groups may also help reduce nonresponse bias.
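To make this comparison explicit, the group shares reported above for the two studies sum as follows:

$$19\%\ (\text{Cynical Fifth}) + 7\%\ (\text{Unacquainted}) + 6\%\ (\text{Insulated}) = 32\%$$
$$16\%\ (\text{Uninformed}) + 14\%\ (\text{Suspicious}) + 10\%\ (\text{Cynical}) = 40\%$$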

Both studies found that a majority of the population falls into mindset categories that are agreeable to the census and its mission. In the first study, 67% of respondents were either compliant Head-Nodders or excited Leading Edge responders, while in the second study 48% of responders fell into the Government-Minded, Compliant and Caring, and Dutiful categories. By this categorization, it would appear that there was a decline in the “easy” to reach and convert population. However, the second study identified the Local-Minded mindset group. This group comprised people who tended to be unaware of the census and apathetic toward the federal government, but who cared deeply about local politics that had visible impacts on their lives. Knowing this creates an opportunity for the census and the ACS to link the benefits of survey response to a more local level, and perhaps not highlight the larger, national and political impact of ACS survey response. With the majority of potential respondents in easy-to-reach groups, it is also potentially problematic that these mindset groups are more homogeneous, comprising people with higher or average levels of education and income, for example. To truly represent the U.S. population, the ACS needs to reach beyond the easy-to-count.

22 This finding is striking because the ACS has sampled 3.5 million housing unit addresses a year since 2005, meaning it is likely that more than 11% of households have received the ACS. While comparing households to respondents is not straightforward, this finding suggests some people who took the ACS may not remember the survey by name.

In preparation for the 2020 census, a research study similar to those conducted in preparation for the 2010 census was undertaken in partnership with a research team from Young and Rubicam. A total of 42 focus groups in 14 locations throughout the U.S. were conducted to examine the motivations, attitudes, and barriers to responding to the 2020 decennial census. Focus groups targeted known populations with low response, including racial and ethnic minorities, people with low internet proficiency, young people who recently moved and may be completing the census for the first time, and people in rural communities (see Word 1997; Joshipura 2008; Nicola, et al. 2015; Erdman and Bates 2017).

The report identified five factors that lower response: 1) a lack of knowledge about the census; 2) apathy and lack of confidence that individuals have the ability to influence the government; 3) confidentiality and privacy concerns; 4) a fear of repercussions; and 5) general distrust of the government (Evans, et al. 2019). Concerns about confidentiality were widespread among all groups in the study. Fear and distrust of the government were more prevalent in racial and ethnic minority groups and the rural population. Unlike previous research, this study found that this distrust extended to the Census Bureau as well.

All groups reported factors that would positively impact their choice to respond. Across groups, community-level benefits, such as census data being used to build schools and hospitals and to provide public services, were a powerful motivator. The study found, however, that many skeptical groups were unlikely to believe the benefits from survey response would come to their communities. Participants noted that their distrust would be mitigated if they were provided tangible evidence of funding and benefit to their community, tied to providing a better future for their children. Receiving the message of community benefits from a trusted organization with deep roots in their community would provide a powerful motivator to respond (Evans, et al. 2019).

Focusing attention on the ACS rather than the decennial census, the Census Bureau also used similar respondent mindset research to better understand how the characteristics of potential respondents change throughout the ACS data collection period. This was done to learn the characteristics of those who respond early to the ACS request and the characteristics of late respondents (in the NRFU operation) and non-respondents (Berkley 2018). Respondents with characteristics similar to the Dutiful, Compliant, Leading Edge, and Head-Nodder mindsets responded early to initial survey requests. This left more of the difficult potential responders, with mindsets similar to the Cynical, Uninformed, and Suspicious, for follow-up mailings. Going beyond the mindset categories, this research linked response propensity to factors like language, rural status, and education that present barriers to total self-response and early internet self-response (Berkley 2018).

This information can also be used to form more effective messaging in the ACS mail communication materials. This research suggests that many potential responders are easy to reach – they are compliant, dutiful, trusting, and supportive of the government. These populations will likely self-respond at high rates and respond early in the data collection period. Pushing these respondents to the web may help reduce costs but is unlikely to impact total self-response rates. However, if the ACS mail communications are going to increase total response rates and reduce non-response bias, mail materials should target messaging to gain response from the more difficult and skeptical populations. The Census Bureau sends five mailings to recruit households into the ACS and removes responding households at two points during this operation. The Census Bureau can leverage this information to change the messaging in the early mailings (to target and maximize responses from those likely to respond) and change the messaging in later mailings to target the types of households still remaining in the ACS sample.

If messaging can increase response from the more difficult-to-reach populations, and push them to respond early by web, paper, or phone prior to expensive CAPI operations, it would produce major cost savings for the ACS program and increase data quality.

The responders categorized as Local-Minded may represent a unique opportunity to increase response rates and reduce nonresponse bias if messaging can link the ACS to benefits in their community. As the most recent research for the 2020 census showed, distrust, a lack of knowledge, general apathy, and a lack of confidence that individual actions can influence the government were pervasive feelings that were barriers to response (Evans, et al. 2019). This may be especially true for minoritized and difficult-to-count populations (Joshipura 2008; Nicola, et al. 2015). While distrusting, cynical, and suspicious responders may be harder to reach, messaging about community-level benefits may be able to convert some of the locally-minded segment to respond to the ACS survey request. The lower-education, rural, and minoritized populations that comprise these groups, including the Local-Minded group, need to be motivated by believable statements that build trust and link benefits of the survey to their community. This becomes increasingly important in later mail pieces as the universe of potential responders loses the compliant and dutiful households that respond early to the ACS survey request, leaving a larger portion of skeptical, uninformed, and cynical potential respondents.

Based on the identification of respondent types in Census Bureau research, it is clear that ACS messaging must reach different kinds of respondents. It would be beneficial if the Census Bureau could leverage the motivations of potential survey participants and write communications materials that speak to those audiences. Because the universe of respondents changes throughout the mail communications during the self-response period, this is possible. For example, messaging in the early mailings can focus on maximizing response from the compliant and trusting respondents, while later mailings can use different messaging to focus on gaining compliance from Local-Minded, cynical, and distrusting audiences.

Identifying useful messaging concepts from general theories of behavior

Research from communications, marketing, and the behavioral and social sciences on why we act, the psychology of compliance with requests, and how best to communicate messages to the public is important to understand as background. None of this research directly references or engages the specific task of increasing response rates to survey requests. However, each field provides potentially useful insights on ACS messaging that may have been missed by the survey methodologists and sociologists who typically produce survey methodologies and survey operation practices.

What is communication?

Communication is the process of sending information from a sender to a recipient via a message. A communication is effective when the recipient understands the message as intended by the sender. Because people can filter a message through their own beliefs and opinions, it can be difficult for a sender to communicate a message accurately to all intended recipients (Munodawafa 2008), especially if the audience is highly heterogeneous, as is the case in national general population surveys (Dillman 2016).

Communication can happen in many ways. A face-to-face conversation, for example, involves a two-way interaction where the sender and receiver can interact and respond to each other in real time. Mail communication is a bit different and can be more challenging. While it doesn’t occur in real time, sending multiple mail communications to a single recipient also involves a two-way communication process. First a sender must communicate a message to the recipient. If they receive the message, the recipient formulates a response to this initial communication. This response can be an action, for example taking the survey, or a feeling, such as suspicion or cynicism. Because the ACS sends up to 5 mailings, the communication is an

“implied” two-way conversation where the sender (the Census Bureau) initiates a conversation that results in a response that may or may not be communicated back to the Census Bureau.

The conversation continues as the Census Bureau sends follow-up communications based on assumptions about the responses of the sampled households that did not reply to the survey request.

It is hard, if not impossible, to write a single message that resonates with all recipients, or that anticipates all of the implied responses a household can have to a survey request.

Studies have found that 17% of U.S. adults fall in the lowest category of reading ability and over half (52%) have difficulty with basic reading comprehension tasks (OECD 2013; OECD 2016).

Moreover, people also tend to enjoy reading when it is at a lower level than they are fully capable of (Nielsen 2005; Landry 2017). Using simple, straightforward, and noncomplex language can help a message reach a broad audience (McCormack 2014). When crafting messages, it is important to follow plain language guidelines that may increase the chances that people understand and retain information communicated in a message. In addition, the government has a legal obligation to write in plain language when communicating with the public, as stated in the Plain Writing Act of 2010 (H.R. 946/Public Law 111-274). The Act requires writing that is “clear, concise, well-organized, and follows other best practices appropriate to the subject or field and intended audience” (Plain Writing Act 2010). Writing that is clear and simple to understand can also save survey operations money by reducing confusion that causes complaint calls and emails to survey support staff (Dubay 2008).
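
Plain-language guidance of this kind can also be checked mechanically before a mailing is finalized. The sketch below is illustrative only: neither the Plain Writing Act nor the sources cited here prescribe a specific readability formula, and the draft sentence and the vowel-group syllable heuristic are my own assumptions. It scores a hypothetical draft with the widely used Flesch-Kincaid grade-level formula; a production check would rely on a vetted readability tool plus human review.

    import re

    def count_syllables(word):
        # Rough heuristic: count groups of vowels; every word gets at least one syllable.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        # Flesch-Kincaid grade level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

    # Hypothetical draft wording for an ACS-style letter, used only as an example.
    draft = ("Your response is required by law. Please complete the survey online "
             "today so that your community receives an accurate count.")
    print(round(flesch_kincaid_grade(draft), 1))

A draft that scores well above a middle-school grade level would be a candidate for simplification before it is mailed.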

Information overload can occur when too much information is presented to someone, exceeding their capacity to process it. When this happens, decision-making abilities can be affected (Gross 1964). Marketing research recommends limiting the number of messages in a communication. One recommendation is to present messages in digestible groups of three to

increase retention and understanding (McCormack 2014; Poldre 2017). This recommendation comes from aural communication, such as a speech or lecture, rather than from written mail communication. However, the core of the idea is that a person can only absorb a limited amount of new information at a time, so a single communication needs to limit and group messages to optimize the impact of a message. Some evidence suggests a generational difference in how information is processed, with younger audiences desiring and expecting information presented in smaller, more visually digestible segments (Wilmer, Sherman, and

Chein 2017). Rather than detailed precision reading, skim reading may be the new norm (Wolf

2018). Text needs to be written, and designed, for this style of reading, especially if the target audience of the text is a younger and more tech-savvy audience. Especially when a communication is assigning a task, it is important that the task be presented clearly, without ambiguity. A reader should be able to identify the task, what is expected, and when it is due, at a quick glance without detailed reading. One way to achieve this is to clearly communicate a due date to set clear expectations on the task and to reduce confusion (Allen and Richardson 2019).

Another lesson from marketing is to use graphic elements such as logos and pictures to catch the eye of an audience and make a mailing more noticeable. However, applying certain graphics to government mailings has been shown to be problematic and has decreased response rates (see Dillman, et al. 1996; Dillman 2000; Heimel, Barth, and Rabe 2016). One way to strike a balance may be to incorporate simple graphic elements that catch the eye and also communicate clearly that mailings are official, from the government, and important (Whitcomb and Porter 2004; Hagedorn, Panek, and Green 2014). It is especially important to catch someone’s attention quickly to communicate that a mailing is official on the

outside of envelopes sent to respondents. Research suggests that people give envelopes a quick scan, on average seven seconds or less, to visually process the information on an envelope in a predictable, patterned way. If the envelope does not communicate clearly that the contents are important, it may never be opened (Lavrakas, et al. 2017a; Vögele as cited in Chewning 2019).

Theories on human action and behavior

Researchers from many fields have developed theories on how to shape human action and change human behavior. Seeking a survey response is, at its core, an attempt to shape a human behavior. This literature provides useful background on identifying specific elements that may influence behavior and strategies for potential messaging in survey communication to convince an increasingly distrusting and cynical population to participate. Moreover, some have found leveraging the science of human motivations and behaviors an effective way to increase survey response rates as well as data quality (Wenemark, et al. 2011).

The Reasoned Action Approach organizes the determinants of human action into three classes of beliefs (Fishbein and Ajzen 2011). First, motivation comes from Behavioral Beliefs about the positive or negative consequences a person expects from an action. Second, motivation comes from Normative Beliefs about whether an action would be approved by important people in a person’s life or whether others would act in a consistent way. Third, people can be motivated by Control Beliefs about personal and environmental factors that can help or impede an individual’s attempts to carry out the behavior (Fishbein and Ajzen 2011).

In a different attempt to operationalize the motivating factors of human behavior, the

World Bank’s Communication for Governance and Accountability Program developed a list of motivations of human behavior and also provided strategies that can be effective to leverage

those motivations to change behavior (World Bank 2017). Some identified motivations, such as responses to fear and threats, are not useful for motivating people to comply with a survey request. ACS messaging could address most motivations. People are influenced by their cultural norms, so messaging could be written in a way that respects cultural barriers to action. People can be motivated by the benefits of their actions, so messaging should communicate the positive consequences of survey participation. People are motivated to act in accordance with the norms of their society and in a way that conforms with how others think they should act. Survey messaging could frame ACS response as a normal action expected by others. People are highly motivated by their attitudes, so messaging should frame a survey response to fit the attitudes of the general population. People can be motivated by a desire to make decisions and stick to a plan, so it is important that survey materials provide messaging that leads potential respondents to make a decision and a plan to respond. This can be accomplished by providing clear direction on how to take the survey. People are more motivated to act if they are confident that they can achieve the task (Wenemark, et al. 2011). Messaging should raise confidence that the survey task is doable. It is important that potential respondents do not feel manipulated (Deci and Ryan 1985; Ryan and Deci 2000; Wenemark, et al. 2010). Feeling manipulated can lead to decisions to act against a recommended action. Survey messaging must be believable and genuine to avoid this reaction (World Bank 2017).

People can be motivated to act for very different reasons. Self-determination theory argues that some people, for some tasks, are more motivated by extrinsic motivations to participate, such as the value of an action to society. Others are more intrinsically motivated and are driven to act by activities that are enjoyable or interesting (Deci and Ryan 1985; Ryan and

Deci 2000). Motivation to act can be increased when people can connect the task to the way they are motivated. For example, people who are motivated by extrinsic values can be motivated to participate in a survey if they believe it can help society, even if they disagree with other aspects of the survey. To motivate more intrinsically motivated respondents, it may be useful to communicate that the survey organization has a genuine interest in the respondent’s answers to the survey. Others feel motivated when they have the opportunity to take personal responsibility for their actions, making survey requests that feel “forced” or “coerced,” or that take the power or commitment to respond away from the respondent, less appealing (Wenemark, et al. 2010).

Research suggests some effective strategies to move people toward a desired behavioral outcome. People tend to value decisions more if they feel that they own the decision to make the choice themselves and have confidence in their ability to complete the task they choose (World Bank 2017; Artefact 2019). People also tend to act in ways that reinforce their personal identities, and it is much more difficult to convince someone to take an action if the action changes their core beliefs (Cialdini 2009; Fishbein and Ajzen 2011; Artefact

2019). Based on this, framing a survey request in a way that puts the actor in control may be important. Calling attention to desired outcomes and removing ambiguity from the survey task can also help people make agreeable decisions. This may be possible by reducing the number of extraneous details a person has to consider, which will allow them to focus directly on a single task: completing the ACS.

People are also more likely to respond to emotional stories that highlight a specific person’s experience, rather than stories focused on facts (Artefact 2019). Evidence on the

ability of narrative messaging to persuade is mixed (Reinhart 2006; Reinhart and Feeley 2007;

Dillard and Shen 2012). However, some evidence suggests that narratives can have a more pronounced persuasive impact than non-narrative messages (Kreuter, et al. 2010;

Ricketts, et al. 2010). Specifically, narrative messages have been found to be more persuasive than statistical and fact-based messages (Wit, Das, and Vet 2008). Testimonials can also be more persuasive than facts and statistics, especially among audiences with low involvement or exposure to an issue (Braveman 2008). Testimonials can also be effectively inserted into non-narrative communications to increase the overall influence of a communication (Gibson and

Zillmann 1994; Zillmann and Brosius 2000). Using testimonials to make emotional appeals can be a powerful motivator to some people.

Another strategy found effective to shape behavior is to ask someone to make a commitment to act (Wenemark, et al. 2010). This helps a potential actor link intended behaviors with a concrete future moment and course of action (Cialdini 2009; World Bank

2017). For example, recipients of a flu vaccine mailer were asked to write down the date, time, and location at which they planned to be vaccinated. This message increased vaccination rates by 4.2 percentage points (Milkman, et al. 2011). In another study, research participants were 42 percent more likely to achieve goals when they were asked to write them down first (Matthews

2015). Applying the same logic in the ACS messaging may help increase response rates. Rather than asking sampled households to respond to the ACS, a mailing could ask respondents how they plan to respond. Asking for commitment in this way can reduce forgetfulness and procrastination and may establish a positive expectation of completing the survey.

Psychologist Robert Cialdini (1984; 2009; 2016) has identified seven principles that may

be effective in persuading people to agree or comply with requests:23

1) Reciprocity: People have a desire to give back to those who have given to them (Gouldner 1960). For example, when customers at a grocery store are given free food samples, they feel a sense of reciprocity to make a purchase.

2) Scarcity: People desire items and opportunities more if the item or opportunity is scarce (or perceived to be scarce). This occurs because people are scared to “miss out.” When an item or opportunity is only available for a limited time, people feel an obligation to acquire the item or participate in an opportunity while it is available. Providing a price or opportunity, “For a Limited-Time Only,” is an effective marketing message.

3) Commitment and Consistency: People are more comfortable when they act consistently with their values and previous actions. For example, prior to an election, phone calls to registered voters that ask the voter if they will vote during this election have been shown to increase voter turnout. This message reminds the voter of voting in the past, which motivates them to take action to be consistent with their previous actions.

4) Consensus and Social Proof: Conformity and societal norms have a powerful influence on people. People tend to follow the lead of others similar to themselves (Hallsworth, et al. 2014). For example, conference participants were more likely to complete post-conference surveys if the survey indicated typical response rates among attendees of prior similar conferences to illustrate it was normal for peers in this group to complete this survey (Misra, Stokols, and Marino 2012).

5) Authority: People are more likely to follow the lead of credible experts. Communicating or projecting expertise is a powerful way to gain compliance to a request. For example, doctors hang their diplomas in their medical office to establish their credentials for their patients.

6) Liking: People are more likely to comply with requests from people they like or admire. For example, celebrity endorsements, regardless of the celebrity’s actual authority on the product they are selling, can be effective.

7) Unity: People agree and comply with others when they feel they share an identity. For example, Cialdini surveyed potential patrons of a new restaurant. Some were asked to provide their “advice,” others their “opinion,” and others their “expectations.” Patrons who were asked for their advice were more likely to report a desire to visit the restaurant when it opened because they felt like part of the team rather than an outsider. Messages that make the potential respondent feel a sense of unity with the requestor may increase compliance with a request.

23 A version of this list and analysis is also presented in Oliver, Heimel, and Schreiner (2017).

To make requests more effective, Cialdini suggests the use of “pre-suasion” techniques.

According to his pre-suasion theory, it is critically important to establish a connection with a person prior to formally making any request. Each of the above principles can be incorporated into a pre-suasion message. The important thing is to not directly ask for anything in an initial contact, but to build a relationship. This will make future “asks” more trusted, believed, and acted upon. To accomplish this, Cialdini argues for building trust and creating a sense of mystery and intrigue around a forthcoming request. Cialdini calls these “privileged moments” that create a powerful reason to comply with the request prior to the potential respondent needing to respond (Cialdini 2016). According to Cialdini, the importance of pre-suasion is that a person has already formulated a mindset to comply prior to being asked to do anything. This makes subsequent requests that follow a pre-suasion message more effective at gaining compliance.

Theories about why people respond to surveys

At present, there is no single, comprehensive theory on strategic survey messaging across multiple contacts. However, two prominent theories on survey methodology come from the field of sociology, Leverage-Saliency and Social Exchange. While each is distinct and provides insights on how to maximize survey response through messaging, they share common ground and complement more than contradict each other.

Leverage-Saliency Theory

Leverage-Saliency theory (Groves, Singer, and Corning 2000) presents hypotheses about how survey methodologies can influence potential respondents to cooperate. Each survey request has multiple attributes. The topic of the survey is an attribute of the survey, as is the length of time it takes a respondent to complete the survey request. These attributes are not neutral features of the survey. Each attribute represents a potential “leverage” on a respondent’s decision to participate in the survey. Some survey attributes or leverages, such as lengthy surveys with difficult and sensitive questions, exert a negative impact on a respondent’s willingness to participate (Tourangeau and Yan 2007). Other leverages, such as cash incentives and a survey being sponsored by known and trusted organizations, can have a positive influence on a potential respondent’s willingness to participate (Groves, et al. 2006). Saliency is the extent to which a leverage is understood by a potential respondent. The goal of survey messaging is to make the positive leverages more salient (and limit the saliency of negative

leverages) to convince potential respondents that the positives of responding outweigh the negatives.

The problem for survey communication is that recipients do not necessarily experience any specific leverage as equally positive or negative. For example, some potential respondents may find a request to complete a 20-minute survey arduous or burdensome, while others may be happy to comply with the same request (Heberlein and Baumgartner 1978). Burden of a survey task is subjective and can be defined and experienced differently by different people

(Wenemark, et al. 2010; Holzberg, et al. 2018). Similarly, recipients do not experience potentially positive survey attributes in the same way. A survey from a known and trusted sponsor might increase response rates of some individuals, but not others (Groves, et al. 2012).

The power a leverage has to impact response also depends on how salient the leverage is to a potential respondent. For example, a potential respondent may be more likely to complete a 20-minute survey if the survey’s communication materials communicate clearly

(and make more salient) social or personal benefits the individual expects to gain by completing the survey. Leverage-Saliency theory argues that a positive leverage can become more powerful in promoting a survey response when it is made salient to the respondent through messaging.

Conversely, while messaging should inform research subjects of the potential negative outcomes of survey participation, it should avoid overemphasizing (and not make salient) potentially negative aspects of survey participation. Because potential respondents could weigh positive leverages against negative leverages, survey designers should make as many potentially positive leverages of the survey salient as possible to increase the likelihood of respondent participation.
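
Groves, Singer, and Corning describe this weighing process graphically rather than as a formula, but a toy formalization can make the argument concrete. The sketch below is my own illustration, not the theory’s formal model, and the attribute names, leverage values, and salience values are invented for the example: each survey attribute carries a signed leverage and a salience that messaging can raise, and the household is predicted to respond when the salience-weighted sum is positive.

    # Illustrative only: a toy reading of leverage-saliency, not the authors' formal model.
    # Each attribute has a signed leverage (how positive or negative it is for a household)
    # and a salience between 0 and 1 (how prominent messaging has made it).

    def decision_index(attributes):
        # Sum of leverage weighted by salience; respond if the total is positive.
        return sum(a["leverage"] * a["salience"] for a in attributes)

    household = [
        {"name": "community benefit", "leverage": +2.0, "salience": 0.2},
        {"name": "trusted sponsor",   "leverage": +1.0, "salience": 0.5},
        {"name": "time burden",       "leverage": -1.5, "salience": 0.8},
    ]

    print(decision_index(household))   # -0.3: predicted nonresponse

    # Messaging that makes the community benefit more salient can tip the decision.
    household[0]["salience"] = 0.8
    print(decision_index(household))   # 0.9: predicted response

The numbers are arbitrary; the point is that messaging cannot change the sign of a leverage for a given household, only how heavily it counts in the decision.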

Research has shown that one particularly powerful leverage that can convince both likely and unlikely potential respondents to participate in a survey is the use of monetary incentives (Groves, Singer, and Corning 2000). Monetary incentives may also reduce nonresponse bias because they produce larger response increases among potential respondents who are uninterested in the survey topic than among potential respondents who are interested in the survey topic (Groves, et al. 2006). Monetary incentives can take many forms, but research has shown that cash incentives produce higher response rates than non-cash monetary incentives, such as gift cards or certificates (Birnholtz, et al. 2004), though incentives do not guarantee response, even as monetary compensation increases (Singer, et al. 1999).

Because mandatory government surveys, like the ACS, cannot provide monetary incentives (Dillman 2000), making nonmonetary, positive leverages salient is also necessary.

Leverage-saliency theory offers only limited suggestions of useful leverages. The list is limited to appealing to potential respondents’ sense of civic responsibility, writing surveys with interesting questions to entertain or pique the interest of potential respondents, and highlighting interesting survey topics (see Groves, Cialdini, and Couper 1992; Groves, Singer, and Corning

2000; Groves, Presser, and Dipko 2004; Wenemark, et al. 2010). Making salient the connection of a survey to a known and respected sponsor organization can also increase response rates

(Groves, et al. 2006). However, more neutral sponsors, such as research universities, can limit non-response bias. Connecting a survey to some sponsor organizations, for example the March of Dimes, can increase response rates, but does so primarily among people with positive views of that sponsor organization. This can induce nonresponse bias if the topic of the survey is related to the mission of the bias-inducing sponsor organization (Groves, et al. 2012).

Each leverage can have a different level of positive or negative influence on a potential respondent’s decision to participate. One limitation of leverage-saliency theory is that it does not provide guidance to determine which leverages are most important to make salient. The theorists only provide a limited list of potential leverages, and one main leverage, monetary incentives, is not an option for the ACS. Another limitation of this theory that makes application to the ACS context difficult is that the original leverage-saliency work focused on interviewer-assisted modes of data collection, primarily phone interviews, where a direct one-on-one real-time conversation was taking place (see Groves, Singer, and Corning 2000). This is different from the implied two-way conversation that occurs during ACS mail contacts. This theory does not address mail communication or communicating with potential respondents in a web-push methodology. Therefore, applying ideas from Leverage-Saliency theory to the ACS mail contact strategy is not straightforward. The theory also provides little guidance on specific leverages and even less on how to communicate those leverages, other than to make positive leverages salient.

However, the main idea that there are multiple leverages that can impact a decision, and that multiple leverages need to be made salient to convert different types of respondents, is a useful conclusion to draw from this theory. Applying this to the ACS context, I contend that it may be possible to increase response rates by using communication materials to make multiple positive leverages salient to the respondent, which would allow more respondents to be influenced by the variety of different leverages. Recruiting participants through a mail contact strategy provides multiple opportunities to make a variety of leverages salient. Research has consistently shown that multiple contacts, particularly reminder notifications, are one of the

most powerful tools to increase survey response rates (Scott 1961; Dillman 1978; Lesser, et al.

2002; Roose, Lievens, and Waege 2007). While research does not exist from the leverage-saliency perspective on what leverages increase response rates, or how to effectively communicate multiple leverages across a mail contact methodology, a mail strategy with multiple contacts may be effective because it allows additional opportunities to make numerous leverages salient (Oliver, Heimel, and Schreiner 2017).

Social Exchange Theory

Over the past four decades, Dillman and colleagues (Dillman 1978; Dillman 2000; Dillman,

Smyth, and Christian 2009; Dillman, Smyth, and Christian 2014) have developed and refined a theory of survey messaging based on tenets of Social Exchange theory that are derived from a general theory of human behavior (see Thibaut and Kelley 1959; Homans 1961; Blau 1964), which argues that people are more likely to comply with any request if they “believe and trust the rewards for complying with that request will eventually exceed the costs of complying”

(Dillman, Smyth, and Christian 2014). It focuses on three main factors: 1) communicating the benefits of participating, 2) reducing the costs of participating, and 3) increasing trust that the benefits perceived by the respondent will outweigh the costs of responding. Social exchange theory provides direct recommendations for survey messaging. This theory also acknowledges that survey communication includes the words written in letters, but also involves all aspects of a survey communication strategy, including the messaging on envelopes, supporting materials, and paper survey questionnaires. The theory also acknowledges that communication occurs through words, graphics, symbols, and design elements (Dillman and Redline 2004; Christian and Dillman 2004). This theory presents a more holistic approach to communication design and

provides recommendations on how to apply social exchange principles to build trust, communicate benefits, and reduce burden through the use of mutually supportive components that improve response rates for the entire sample, not just a subset (Dillman, Smyth, and

Christian 2014; Dillman 2016).

Social Exchange theory argues that building trust and legitimacy is essential for convincing people to respond to surveys. It can be applied to the development of multiple tactics for increasing trust (Oliver, Heimel, and Schreiner 2017) and provides several ways that survey communications can build trust.

Building Trust:

1) Connect the survey to a known and trusted sponsor: Surveys sponsored by the government receive higher response rates than surveys sponsored by universities or private companies (Presser, Blair, and Triplett 1992). The use of logos and official letterhead of the sponsor organization are effective ways to communicate the connection between a survey and a sponsor.

2) The design of survey mail materials should match public expectations: For government surveys, the public expects clean, official, and generally plain envelopes. Flashy mailings may resemble marketing or “junk mail” from a private corporation (see Dillman, et al. 1996; Dillman 2000).

3) Send multiple contacts with unique content and consistent design: Prenotice and reminder contacts are effective at increasing response rates (Dillman, Clark, and Sinclair 1995; Dillman, et al. 1996; Roose, Lievens, and Waege 2007; Millar and Dillman 2011). To further build trust, all mail communication materials should share a similar design. It is also important that each mailing appear to be unique so that each is opened and read. This can be accomplished by varying the format and size of envelopes, and by using unique, nonrepetitive messaging (Tarnai, et al. 2012; Dillman, Smyth, and Christian 2014).

4) Make the request personal: Survey requests should come from a real person with authority within the requesting organization, not from a building or an organization. The authority figure’s signature should accompany the correspondence to make the request more personal and to communicate to the respondent that they are valued (Dillman 2000). Compared to mass-copied letters, personalized materials were shown in one test to increase response rates by 3 to 12 percentage points (Dillman, et al. 1999; Dillman, et al. 2002; Dillman 2000).

5) Hold a single, person-to-person conversation with respondents: All mail materials should feel like part of a single, two-way conversation, and requests should be written in plain language using the voice of one person asking another person for help, not in the language of a bureaucratic organization requesting compliance (Dillman, Smyth, and Christian 2014). The language should be consistent, phrasing should avoid changes in tone, and the specific content in each letter or postcard should vary and limit repetition, similar to how people talk in conversation. Restating, not repeating, a message can be useful if stated in the same tone, but in a different way, so as not to feel repetitive (Dillman, Smyth, and Christian 2014).

6) Follow rules of etiquette and plain language: The request for help should include phrases like “please” and “thank you” and should be communicated in a respectful and non-commanding tone. Survey requests should incorporate language that indicates deference for the respondent’s time and effort (Dillman, Smyth, and Christian 2014). Adults asking for help from other adults should not make demands (Comley 2006).

7) Assure, but do not overstate, confidentiality and privacy: In today’s world of security breaches and privacy concerns, we are often taught the safest option to a survey request might be to not respond (Dillman, Smyth, and Christian 2014). Confidentiality statements are important and often mandated by survey organizations. However, “going to great lengths and detail to explain how confidentiality works or will be assured… is more likely to raise concerns than alleviate them” (Singer, Hippler, and Schwartz 1992). It is important to communicate information about the protections in place to assure the confidentiality and privacy of survey responses, but too strong of a statement may be off-putting to respondents (Singer, von Thurn, and Miller 1995). Words like “never” and “always” can be seen as overstatements and are less believed in the context of communicating risks and privacy concerns, while simple, straightforward, clear, and nonalarming confidentiality statements are often enough (Fobia and Holzberg 2017; Fobia, Holzberg, and Childs 2017).

8) Provide ways to verify the authenticity of the survey request: The sponsor organization (and the person who signs survey letters) should be searchable online and clearly linked to the survey request. The return address for the survey organization should be a physical location, not a P.O. Box. Email addresses, online survey instruments, and mail contact materials should all connect to the same organization. A phone number should be provided for respondents to call to verify authenticity, ask questions, or voice concerns (Dillman, Smyth, and Christian 2014).

9) Send a token cash pre-incentive: The inclusion of a small token cash pre-incentive builds trust because it is given before the response is completed (Church 1993) and can induce response from all types of potential respondents, including those less likely to respond (Trussel and Lavrakas 2004; Millar and Dillman 2011; Messer and Dillman 2011; Singer and Ye 2013). Nonmonetary pre-incentives, such as calendars, magnets, return address stickers, pens, or other items, may increase response in some contexts, but do not work as well as cash (Singer and Ye 2013). Pre-incentives can be more effective than post-survey incentives, as one study found that a pre-incentive sent with a survey request was more effective than a promised amount six times larger given when a survey was complete (Avdeyeva and Matland 2013).

10) Spend money on mail materials: Survey contact materials that look and feel professional communicate that the survey organization values respondents’ time, that the survey request is real and important, and that the survey organization can be trusted. Implementing features such as first-class mail, large envelopes, official letterhead, full-sized letters, return envelopes with pre-paid postage, and real stamps may increase response rates and justify increased costs (Armstrong and Luske 1987; Dillman, Clark, and Sinclair 1995; Dillman, et al. 1996; Hembroff, et al. 2005; Brick, et al. 2012; Tarnai, et al. 2012; Dillman, Smyth, and Christian 2014).

Communicate Benefits: The second factor in Social Exchange theory is to communicate the benefits of survey response.

1) Benefits of survey response must be believable: Social Exchange theory relies on trust to communicate believable social benefits of survey participation (Stafford 2008). For example, one benefit of participating in the ACS is that one’s community can better plan to meet the needs of its residents by using ACS data. A respondent may not be able to easily see that their response to the ACS was part of any decision that affects their community. For a respondent to feel that their response helped provide a social benefit, the respondent must first trust that the survey organization will make these data available, and that community planners will use this information to provide the benefit. It may be difficult to link a survey response to a personal benefit. However, it is important that survey materials do not deny the existence of benefits from survey participation just because they are experienced by a community rather than an individual (Dillman, Smyth, and Christian 2014).

2) Emphasize that responding directly helps other people: Some people feel a sense of accomplishment when completing a task for someone else and may generally feel a sense of reward when they feel they have helped others. For some, this sense of accomplishment is heightened when the action provides no personal benefit aside from helping someone else (Homans 1961; Blau 1964; Dillman, Smyth, and Christian 2014). Surveys that produce data that benefit those in need can be framed in a way to provide a personal benefit to some respondents.

3) Make the survey sound interesting: Some people enjoy, and are more motivated to engage in, activities they find interesting and personally satisfying (Deci and Ryan 1985; Ryan and Deci 2000). Starting a survey with an interesting question that respondents find engaging can spur respondents, particularly reluctant ones, to at least examine and possibly complete the remaining survey questions. Messages in mail materials can accomplish the same goal by communicating that the content of a survey is interesting and important (Dillman, Smyth, and Christian 2014).

4) Stress that opportunities to respond are limited: People view a task as more important when they believe it is an opportunity only offered to a few people (Cialdini 1984). Framing the survey request as a unique opportunity only provided to a few people can communicate that the potential respondent is in a privileged position to shape the future of their community (Dillman, Smyth, and Christian 2014).

Reduce the cost of participation: The main cost of survey participation is the task burden associated with completing a survey request. However, potential respondents can also experience a cost of participating in a survey from mail communication materials, if, for example, the messaging in materials makes the survey request unclear or confusing. Social Exchange theory provides several ways survey messaging can reduce the costs a respondent feels when deciding to participate in a survey.

1) Ask the respondent for help without subordinating language: Messages in survey recruitment materials should ask a potential respondent for help and make it clear that the sponsor is dependent on the respondent. Survey requests should not suggest that benefits would be withheld from a community if a household does not respond (Dillman, Smyth, and Christian 2014). Survey messaging should not pressure respondents but communicate that the survey sponsor is grateful if the respondent agrees to participate (Comley 2006).

2) Offer multiple response modes: Some respondents prefer a specific mode of survey response. Offering respondents multiple modes in which to respond allows them to respond in a way that they are able and comfortable (Gentry and Good 2008; Smyth et al. 2010; Millar and Dillman 2011; Olson, Smyth, and Wood 2012).

3) Provide a deadline or due date: Providing a deadline or a due date reduces the cost of survey participation and increases the speed of response by communicating a clear instruction and expectation for when the survey task is due (see Henley 1976). Also, a due date communicates that the survey organization respects a potential respondent’s time (Dillman 2016). Due dates have been shown to be effective in survey operations (see Edwards, et al. 2009) and in decennial census settings, though there is some debate on the best way to implement or communicate the due date to potential respondents (Martin 2009; Stokes, et al. 2011).

4) Convey that others have responded: People may feel more comfortable when they act in ways consistent with their previous behaviors, beliefs, or values, and acting consistently can reduce the psychological cost of acting inconsistently (Festinger 1957, as cited in Dillman, et al. 2014). Messages that communicate that complying with a survey request is a normal activity, similar to other actions people consistently engage in, have been shown to increase response propensity and increase the speed of survey response (Edwards, et al. 2009).

Applying Results from Census Bureau Sponsored Messaging Research

Prior to the 2010 census, the Census Bureau contracted Reingold, Inc., to conduct qualitative research on messaging that could impact the decision to respond to the census. While the ACS faces different challenges than the more widely known census, the audience for both data collection operations is the same. The study conducted 4,064 interviews (3,001 by telephone and 1,063 in person) and asked participants to listen to a series of statements and rank each as

either making them more likely to participate, less likely to participate, or having no impact on the decision to participate (Macro 2009). Table 3 shows the percentage of respondents who said they were more likely to participate in the 2010 census after hearing a given message, sorted by the success of the message with the overall population of respondents. Table 3 also shows the percentage of compliant and cynical potential respondents who said they were more likely to participate in the 2010 census based on these messages (Macro 2009).

Table 3. Percent of participants who stated a message would increase their likelihood of completing the decennial census

Values show the percentage of all, compliant, and cynical respondents who said the message made them more likely to participate.

1. Information from the census helps the government plan for the future improvements to schools, roads, fire and police stations. (All: 82, Compliant: 88, Cynical: 70)
2. Filling out the census provides opportunity to help people in your local community get certain benefits such as health care, school programs, day care, and job training. (All: 80, Compliant: 88, Cynical: 69)
3. The census is more accurate if everyone participates. (All: 80, Compliant: 86, Cynical: 72)
4. Census counts decide a community’s share of $300 billion in federal funds for schools and other programs. (All: 79, Compliant: 85, Cynical: 68)
5. If you don’t fill out the census form, your family and local community might not get their fair share of benefits. (All: 77, Compliant: 81, Cynical: 68)
6. Mailing your census form early helps the government save millions in taxpayer dollars that would otherwise go toward following up with you if you don’t mail it back. (All: 75, Compliant: 80, Cynical: 66)
7. The census determines the number of representatives in Congress each state gets. (All: 67, Compliant: 70, Cynical: 56)
8. To see what changes have taken place in the size, location, and characteristics of the people in the US. (All: 64, Compliant: 74, Cynical: 48)
9. The law requires everyone to participate in the census. (All: 62, Compliant: 67, Cynical: 57)
10. Census employees are subject to a jail term, a fine, or both for disclosing personal information. (All: 62, Compliant: 67, Cynical: 55)
11. The census doesn’t ask for sensitive information; it only asks a few questions such as name, age, date of birth, how people are related, race, and origin. (All: 56, Compliant: 60, Cynical: 53)

Source: Macro, 2009

In general, messages that worked well for compliant respondents worked comparatively well for cynical respondents, with all messages being more impactful on the compliant than the cynical respondents. Messages that communicated community benefits ranked as the top two messages overall and with the compliant group. The Cynical Fifth responded most positively to the statement that, “The census is more accurate if everyone participates” (72 percent), followed by the two “local benefits” messages. The bottom three overall, and with compliant respondents, were statements on legal obligation and data security. The lowest ranked message for the cynical respondents was the message about the data tracking changes taking place. Though it was not a message tested in the study, interview participants volunteered that a due date or deadline would be a strong motivator at convincing them to respond and respond on time (Macro 2009).
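
One way to see which messages hold up among skeptical audiences is to compare the compliant and cynical columns of Table 3 directly. The short sketch below is illustrative only; it uses the Macro (2009) percentages reported above, with message labels abbreviated by me, and ranks the messages by how little effectiveness they lose when moving from compliant to cynical respondents.

    # Percentages from Table 3 (Macro 2009): (all, compliant, cynical); labels abbreviated.
    table3 = {
        "Plan for schools, roads, fire and police": (82, 88, 70),
        "Helps your community get benefits":        (80, 88, 69),
        "More accurate if everyone participates":   (80, 86, 72),
        "Decides share of $300 billion in funds":   (79, 85, 68),
        "Fair share of benefits at risk":           (77, 81, 68),
        "Mailing early saves taxpayer dollars":     (75, 80, 66),
        "Determines representation in Congress":    (67, 70, 56),
        "Tracks changes in the population":         (64, 74, 48),
        "Participation is required by law":         (62, 67, 57),
        "Employees penalized for disclosure":       (62, 67, 55),
        "Does not ask for sensitive information":   (56, 60, 53),
    }

    # Sort by the compliant-to-cynical gap: messages near the top lose the least
    # ground with cynical respondents, even if their overall level is modest.
    for label, (overall, compliant, cynical) in sorted(table3.items(),
                                                       key=lambda kv: kv[1][1] - kv[1][2]):
        print(f"{label:42s} all={overall:2d}  cynical={cynical:2d}  gap={compliant - cynical:2d}")

Read this way, the community-benefit messages combine high overall impact with relatively modest drop-off, while the lowest-impact messages are not rescued by their small gaps.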

The Census Bureau also conducted a series of similar research projects focused on messaging for the ACS, each based on approximately 1,000 telephone interviews with adults who typically handle the mail for their households. In the first study, respondents were asked to rank and assess messages based on the believability of the message and the likelihood that the message would increase their chance to respond.24 The results of one of these studies, shown in Table 4, ranked the percent of people who were more likely to complete the ACS after hearing certain messages (Hagedorn, Green, and Rosenblatt 2014).

24 Believability was measured on a four-point scale from “very believable” to “very unbelievable” and impact on response was measured on a five-point scale from “much more likely to respond” to “much less likely to respond”.

Table 4. Results of the messages in the Benchmark Messaging Survey

Values show the percentage more likely to comply25 and the percentage who found the message believable26.

1. There are many ways to respond to the ACS. It can be completed by mail, by phone, online, or in person. (More likely to comply: 52, Believable: 86)
2. State and local leaders use data from the ACS to determine where to build new roads, schools, and hospitals. (More likely to comply: 51, Believable: 69)
3. The ACS is used to produce key economic indicators. Businesses use the ACS to create jobs, plan for the future, and grow the economy. (More likely to comply: 50, Believable: 70)
4. The ACS helps determine the annual distribution of more than $450 billion in federal funds that go to communities nationwide. (More likely to comply: 49, Believable: 66)
5. The census has operated continually since Thomas Jefferson, James Madison, and the other Founders established it in 1790. Participating in the ACS is an expression of patriotism and civic duty. (More likely to comply: 48, Believable: 72)
6. Nothing in the private sector compares to the ACS. It is a leading source of information used to learn about neighborhoods, communities, cities, and states. (More likely to comply: 47, Believable: 74)
7. Even though all households participate in the census every ten years, only a small number of households are selected to participate in the ACS each year. (More likely to comply: 46, Believable: 82)
8. The ACS is required by law to be completely nonpartisan and nonpolitical. This ensures that the statistics the Census Bureau gathers and produces are both reliable and trustworthy. (More likely to comply: 46, Believable: 70)
9. The ACS is often the most reliable source of accurate and timely statistical information essential for decision making. (More likely to comply: 46, Believable: 66)
10. Filling out the ACS is required by law, just like filling out the census once every ten years. (More likely to comply: 46, Believable: 55)
11. All individual information collected as part of the ACS is kept strictly confidential. The answers from individual respondents cannot be shared with anyone – not even other government agencies. (More likely to comply: 38, Believable: 56)

Source: Hagedorn, Green, and Rosenblatt 2014.

25 Percentage of all respondents who said they were either “much more likely to respond” or “somewhat more likely to respond” after hearing a given message.
26 Percentages of all respondents who said the statement was either “very believable” or “somewhat believable”.

The top-rated message for increasing the chance to respond was also the most believable: a message that communicated there were many ways to respond to the survey. Statements on the local and business uses of the survey also ranked high for likelihood to increase response as well as believability. Again, the lowest ranked messages were those on security and the legal obligation to respond. It should be noted that there was only a small difference among the 11 messages in their impact on the likelihood of respondents completing the ACS, as 10 of the 11 messages ranged from 46 percent to 52 percent. There was a wider range in the believability metric, where believability ranged from 86 to 56 percent, with the least believed messages being those on legal obligation and security.

A follow-up project using the same methodology revised these messages and presented a new list to new participants, with the results shown in Table 5 (Hagedorn and Green 2014).

Table 5. Results of ACS Messaging Refinement Survey25 26

Values show the percentage more likely to comply and the percentage who found the message believable, grouped by message theme.

Community
- State and local leaders in [respondent’s state] can use ACS data to determine where to build roads, schools, and hospitals. (More likely to comply: 61, Believable: 74)
- State and local leaders across the nation can use ACS data to determine where to build roads, schools, and hospitals. (More likely to comply: 59, Believable: 77)

Impact & benefits
- ACS data help determine the annual distribution of more than $400 billion in federal funds to communities nationwide. (More likely to comply: 57, Believable: 64)
- ACS data are used to distribute funds that build and maintain nearly one million miles of highways and fund over four thousand hospitals in communities nationwide. (More likely to comply: 55, Believable: 66)

Decennial Census
- Even though all households participate in the census every ten years, only a small number of households participate in the ACS every year. The ACS provides a more up-to-date picture of our communities. (More likely to comply: 54, Believable: 77)
- The ACS and the census show us not only the number of people who live in the country, but also how we live as a nation including our education, housing, jobs, and more. (More likely to comply: 58, Believable: 83)

Non-government uses
- Local charities and non-profit organizations use ACS data to better understand and meet community needs. This detailed, local information is not available from other sources. (More likely to comply: 54, Believable: 68)
- Small businesses use ACS data to better understand and meet community needs. This detailed, local information is not available from other sources. (More likely to comply: 49, Believable: 68)

Safeguard privacy
- Census Bureau employees are prohibited by law from releasing any information that can identify any individual who fills out the ACS. Millions of people securely participate in the ACS every year. (More likely to comply: 55, Believable: 78)
- By law, Census Bureau employees cannot publicly release any ACS information that could identify an individual. The penalties for unlawful disclosure can be up to two-hundred and fifty thousand dollars and/or up to five years in prison. (More likely to comply: 55, Believable: 74)

Local snapshot
- The ACS is a leading source of information people use to learn about their neighborhoods, communities, cities, and states. (More likely to comply: 50, Believable: 71)
- The ACS is the most reliable source for accurate data about every community in the country, from the smallest rural communities to the largest cities. (More likely to comply: 54, Believable: 69)

Convenience
- Filling out the ACS online is the quickest and easiest way to complete the survey. A paper survey is sent to people who do not complete the survey online. (More likely to comply: 52, Believable: 78)
- Filling out the ACS online conserves natural resources and saves taxpayers’ money. A paper questionnaire is sent to people who do not complete the survey online. (More likely to comply: 49, Believable: 76)

Source: Hagedorn and Green 2014.

Community-level benefit statements and statements about the impact and benefit of the ACS ranked among the best in likelihood to increase response and believability. However, statements about the large impact of the ACS (for example, that it was used to help distribute $400 billion in federal funds) ranked lower in believability than community-level statements.

Data security statements again ranked lower but were not the least well received message.

Statements that included information about the scientific accuracy or process of the ACS ranked lower on impact. For example, a statement about small business use was coupled with the phrase, “This detailed, local information is not available from other sources,” and statements about the ACS being the “leading source of information about every community” and “the most reliable source of accurate data about every community” also ranked lower than statements linked to benefits.

The Census Bureau contracted Gallup to learn more about what messaging works for the ACS and general feelings about trust, privacy, and confidentiality regarding federal statistics. A total of 4,310 participants were read a paragraph of introductory text about the

ACS followed by four randomly selected messages from a list of ten about the ACS, similar to the messages and methodology used in the Reingold research project (Fulton, Morales, and

Childs 2016).27

27 Respondents were asked to assess their likelihood of responding to the ACS after hearing each message, based on a five-point scale (“much more likely to respond,” “somewhat more likely,” “neither more nor less likely,” “somewhat less likely,” and “much less likely to respond”).

Table 6. Percentage more likely to complete the ACS in the Gallup Daily Tracking Survey

Values show the percentage more likely to complete the ACS.28

1. State and local leaders use data from the ACS to determine where to build new roads, schools, and hospitals. (78)
2. There are many ways to respond to the ACS. It can be completed by mail, by phone, online, or in person. (75)
3. The ACS helps determine the annual distribution of more than $450 billion dollars in federal funds that go to communities nationwide. (73)
4. All individual information collected as part of the ACS is kept strictly confidential. The answers individual respondents provide cannot be shared with anyone – not even other government agencies. (72)
5. The ACS is used to produce key economic indicators. Businesses use the ACS to create jobs, plan for the future, and grow the economy. (71)
6. The ACS is required by law to be completely nonpartisan and nonpolitical. This ensures that the statistics the Census Bureau gathers and produces are both reliable and trustworthy. (69)
7. The census has operated continually since Thomas Jefferson, James Madison, and the other Founders established it in 1790. Participating in the ACS is an expression of patriotism and civic duty. (68)
8. No other data collection compares to the level of detail collected in the ACS. It is a leading source of local information Americans use to learn about their neighborhoods, communities, cities, and states. (66)
9. The ACS is often the most reliable source of accurate and timely statistical information essential for decision making. (64)
10. Even though all households participate in the census every ten years, only a small number of households are selected to participate in the ACS each year. (57)

Source: Fulton, Morales, and Childs 2016

The results from the Gallup survey, shown in Table 6, were similar to those from previous research. Community-level benefits statements were well received, and scientific statements about the survey (i.e., that only a small number of households are in the ACS, that the ACS is a reliable source of statistical information, and that it is a leading source of data) were the least well received messages. One critical finding in this study is that the data security statement ranked higher

28 Respondents who said they were either “much more likely to respond” or “somewhat more likely to respond” after hearing a given message were combined to describe all persons who were more likely to respond.

than in previous research.

One area of interest to the Census Bureau was to learn more about how messaging impacts potential respondents who are cynical or distrusting of the government. As discussed in

Chapter 2, the universe of addresses sent ACS mail communications is cut twice during the self-response data collection period to remove responding households. To better understand the changes in the potential respondent pool during the mail-out process as respondents are removed, the Census Bureau leveraged research that compiled the U.S. population into sixty-seven “segments” (Esri 2018). Combining the information on “types” of respondents with known response rate data, the Census Bureau learned how the pool of potential respondents changes as the self-response data collection period progresses. Not surprisingly, households remaining in the sample that receive the later mailings after respondents are removed from the available sample universe are different from the original population. They are generally more distrustful of the government and survey requests, live in more rural communities, and have lower levels of education (Berkley 2018). Finding information on messaging that works with this audience is one way to potentially boost overall response and reduce nonresponse bias.
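
The shift in the remaining pool can be illustrated with a small renormalization exercise. The segment names, shares, and early response rates below are hypothetical placeholders of my own, not the Esri segment definitions or the Census Bureau's figures; the point is only that segments with low early response propensities make up a growing share of the addresses that receive the later mailings.

    # Hypothetical segment shares and early self-response rates; the real Esri
    # segment definitions and response rates are not reproduced here.
    segments = {
        "Trusting / compliant":  {"share": 0.40, "early_response": 0.60},
        "Local-minded":          {"share": 0.25, "early_response": 0.30},
        "Cynical / distrusting": {"share": 0.20, "early_response": 0.15},
        "Hard to reach":         {"share": 0.15, "early_response": 0.10},
    }

    # Addresses still unresolved when the later mailings are prepared.
    remaining = {name: s["share"] * (1 - s["early_response"]) for name, s in segments.items()}
    total_remaining = sum(remaining.values())

    for name, s in segments.items():
        later_share = remaining[name] / total_remaining
        print(f"{name:22s} initial share {s['share']:.0%}, share of later mailings {later_share:.0%}")

With these illustrative inputs, the trusting segment falls from 40 percent of the initial sample to roughly a quarter of the later-mailing universe, while the cynical and hard-to-reach segments grow, which is the pattern described by Berkley (2018).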

In one study, with results shown in Table 7, participants were asked to rank five statements related to privacy and confidentiality on whether the statement would make them trust the Census

Bureau more or less (Hagedorn, Green, and Rosenblatt 2014).

Table 7. Comparing five messages on privacy and confidentiality

Values show the percentage who trust the Census Bureau more and the percentage who trust the Census Bureau less after hearing each message.

1. By law, Census Bureau employees cannot publicly release any information that could identify an individual. The penalties for unlawful disclosure can be up to two hundred and fifty thousand dollars or up to five years in prison. (Trust more: 45, Trust less: 16)
2. Millions of Americans participate in the ACS every year. However, the ACS does not release any information that can identify any individual who participates. (Trust more: 36, Trust less: 17)
3. The Census Bureau is different than many other parts of the federal government. They are solely a research organization. (Trust more: 35, Trust less: 17)
4. Participating in the ACS is safe. All individual responses are protected by law and are not shared with anyone–not even other government agencies. (Trust more: 34, Trust less: 20)
5. The U.S. Census has been in existence since the 1790s and the ACS has been conducted in some form or another since the 1850s. (Trust more: 30, Trust less: 15)

Source: Hagedorn, Green, and Rosenblatt 2014

Most distrusting people were not moved by these statements. However, the statement that described a penalty for federal workers who disclose information was ranked the highest at engendering trust in this skeptical, distrustful population. A second study, this time consisting of focus groups in seven U.S. cities,29 focused on potential respondents who were cynical, suspicious, or ambivalent toward the federal government. Participants were asked to review a selection of messages and to discuss and debate with other focus group participants how they felt about each statement (Orrison and Ney 2014). The most compelling arguments from the focus groups were that the ACS: benefits local communities, provides data for planning and development, allows for smart allocation of federal funding, produces benefits that outweigh the costs of administration, allows individuals to make their voices heard, provides valuable data to

29 Albuquerque, NM; Atlanta, GA; Dallas, TX; Los Angeles, CA; Richmond, VA; St. Louis, MO; and Washington, D.C. Each focus group consisted of 24 to 28 participants for a total of 186 participants.

businesses, is a civic duty, and is preferable to its alternatives, such as other government sources of information. The report does not rank-order the statements but notes that community-level benefits were consistently favored across focus groups in the different cities. The report also notes value in testing “hyper-local” examples of benefit that mention specific benefits provided to a community based on ACS data (Orrison and Ney 2014).

Focus group members also noted reasons not to participate in the ACS, including that the ACS: produced data that seemed redundant with existing sources of data, offered no visible benefits to a respondent’s community, was generally unknown and so lacked credibility, imposed an unreasonable legal obligation to participate, and carried an unreasonable fine for non-participation; participants also expressed doubts about the ability of the Census Bureau to keep personal information secure or to guarantee confidentiality. In another qualitative study, researchers found that when confidentiality statements were communicated, simpler versions of the statement were received more positively, as lengthier and more detailed statements raised more questions and concerns (Fobia, Holzberg, and Childs 2017).

Expert Review of ACS messaging

The Census Bureau recently sought expert comments on ACS messaging from three different groups: the Social and Behavioral Sciences Team30 (SBST), the National Academies of Sciences

Committee on National Statistics (CNSTAT), and invited presentations during the Summer at the

Census workshop series. The SBST produced a list of behavioral insights that may improve response rates to the ACS and the census (Feygina, Foster, and Hopkins 2015; Shephard and

30 The SBST was established in 2015 to help federal agencies integrate behavioral insights into their policies and programs (Executive Order No.13707, 2015).

Bowers 2016), which includes:

1) Social Norms: People are motivated to conform to the norms of those in their community

and peer group. Survey requests should be framed in a way that makes responding conform

with social norms.

2) Procedural Justice: People are more likely to comply with a request if the request comes

from an authority, and if people feel that they are treated fairly and with respect. Potential

respondents should feel they are part of the survey process. This may be accomplished by

putting a stronger emphasis on the choice of response modes and by providing respondents the opportunity to voice concerns and give feedback. All materials should use

phrasing that makes respondents feel that all potential respondents are treated equally and

respected.

3) Patriotism: Patriotism can be a powerful motivator to action. As a federal survey, it may be

possible to frame survey participation as a patriotic duty.

4) Commitment to completion: People are more likely to complete a task if they make a

commitment to the task and can see all of the steps toward completion of the task.

Including a commitment device, such as a commitment checkbox or a calendar that notes a

due date for the survey may help potential respondents commit to the task by visualizing

their schedule and picking a date by which to respond.

5) Information about benefits of the survey: People are motivated by benefits from actions.

Communicating benefits that are believable and framed at the local level may increase rates

of compliance. It may also be powerful to highlight the increased costs that occur from

delayed responses.

6) Personalization: People are motivated by requests that are sent to them personally, not

generally.

7) Reduce confusion: Requests for action should be clear. Using plain language may reduce the

confusion felt by potential respondents, especially in households with people with low

literacy levels.

In 2016, the Census Bureau contracted with the National Academies of Sciences

Committee on National Statistics (CNSTAT) to conduct a two-day public workshop on ways of reducing respondent burden in the ACS.31 In addition to the public workshop, CNSTAT also held four one-day closed meetings with the Census Bureau and experts from a variety of fields who provided recommendations for improving ACS communications.32 The ACS also invited Don

Dillman to comment on ACS messaging during the Summer at the Census seminar series. The recommendations from these expert reviews were (National Academies of Sciences,

Engineering, and Medicine 2016; Dillman 2016):

1) The messaging in each mailing should have a distinct focus but be mutually supportive, using consistent phrasing, tone, and design across mailings and mail pieces, and should avoid repetitive and unfocused communications (Dillman 2016).

31 The Census Bureau held another CNSTAT meeting in 2018 that occurred concurrently with this dissertation work. Findings from this dissertation were presented at this meeting (Schreiner 2018b), so it is not cited in this dissertation. Please see National Academies of Sciences, Engineering, and Medicine (2019) for a review of this meeting. 32 The external experts who participated in the CNSTAT meetings on the ACS mail materials were: Don Dillman, Washington State University; Nancy Mathiowetz, University of Wisconsin-Milwaukee (emeritus); Andy Peytchev, University of Michigan; Andrew Reamer, George Washington University; and Sandra Bauman, Bauman Research and Consulting.

2) Raise the perception of the value of the ACS by placing greater emphasis on the benefits to

respondents of survey participation (National Academies of Sciences, Engineering, and

Medicine 2016; Dillman 2016).

3) Communicate the role of the ACS in improving the economy, ensuring efficient government,

and sustaining democracy (National Academies of Sciences, Engineering, and Medicine

2016).

4) Specify the benefits of the ACS to individuals within their communities (Dillman 2016).

5) Make the ACS seem like a scarce opportunity by equating being selected into the ACS

sample to winning the lottery (National Academies of Sciences, Engineering, and Medicine

2016).

6) Emphasize culturally relevant messages of empowerment (National Academies of Sciences,

Engineering, and Medicine 2016).

7) Attach the ACS to the Census Bureau brand since it is familiar to many people (National

Academies of Sciences, Engineering, and Medicine 2016; Dillman 2016).

Building tentative messaging recommendations from theory and research

The main goal of survey communications is to increase survey response. Mail communication materials must overcome multiple barriers between the Census Bureau and a sampled household responding to the ACS request (Dillman 2019b). In order to reply to a survey request, respondents must receive a survey request, remember it, open it, start filling out the survey, complete it, and mail it back (Faye, Bates, and Moore 1991). I contend that it is possible that messaging and holistically-designed mail communication materials can overcome these

potential breakdown moments. Adapting the list of breakdown moments presented by Faye,

Bates and Moore (1991) and Dillman (2016), I conclude that recommendations for ACS messaging should address the following potential moments of communication breakdown:

1) Respondents don't notice or remember receiving a mailing
2) They remember receiving a mailing, but they didn't open it
3) They opened the mailing, but they didn't read the contained materials
4) The materials were read, but action was not taken
5) The survey was started, but not finished
6) The survey was finished, but not returned.

Messaging in the mail communication materials can influence respondents to notice a mailing, open a mailing, read the contents of a mailing, begin a survey, and complete and return a survey. A single theory of survey messaging does not exist (Dillman 2019b). The theories and research reviewed in this chapter provide insight into how messaging can be used to increase survey response rates and overcome potential breakdowns that could prevent a household from responding to the ACS. Some recommendations are supported by research or theories from multiple fields. In some cases, the recommendations across fields conflict. In this section, I distill insights from this review of literature and research to produce tentative recommendations for ACS messaging that I will use to evaluate the messaging in the ACS mail communication materials in Chapter 5.

Establish credibility and trust in the first mailing

Over the past decades, establishing trust has become more important because of increased feelings of distrust in the survey population (Dillman 2017a). Establishing trust and credibility of the survey is the most important task of survey messaging. This must be done immediately in

the first communication with potential respondents. While pre-suasion techniques have been used in non-survey contexts (see Cialdini 2016), they may not fare as well in survey requests (Dillman and Greenberg 2017). Pre-suasion does bring one important point relevant to survey operations: the initial contact with respondents is critical. The initial contact with sampled households should focus on building trust.

Initial communication should limit messaging, consider what the reader needs to know, and omit extraneous details (McCormack 2014; Poldre 2017). Doing so may clearly communicate a single, focused message and build trust between the potential respondent and the Census Bureau. The largest portion of people in the ACS survey population are trusting, compliant, and dutiful in their feelings toward responding to government survey requests (Macro 2009; Conrey, ZuWallack, and Locke 2012; Berkley 2018). These respondents do not need much motivation to respond, and overloading mail materials with extraneous details may make understanding the communication more difficult (McCormack 2014; Poldre

2017). In turn, responding to a survey request that contains too many messages spread across too many mail materials may feel more burdensome, which increases the cost of participation (Dillman 2016). The trusting and compliant potential respondents may only need to know a few key details – that the request is coming from the Census Bureau, that the request is real, that a response is required by law, and how to respond – in order to comply with the request. The first mailing should clearly communicate these messages without distracting details to gain compliance from this trusting group of respondents.

Establishing trust in the first mailing not only helps gain response from compliant potential responders, but it is critical to build trust so that future mailings can convert non-

responders. All potential respondents must first trust the Census Bureau before they believe additional messages, for example, on the benefits of survey participation (Dillman, Smyth, and

Christian 2014). This process begins in the first mailing but continues throughout all mail pieces.

Leveraging the Census Bureau's authority as a survey organization (Cialdini 2009; Dillman, Smyth, and Christian 2014), providing simple ways to verify the authenticity of the survey request (Dillman, Smyth, and Christian 2014), connecting the ACS to a known and trusted sponsor (Groves, et al. 2006; Groves, et al. 2012), personalizing mail materials (Dillman, et al. 1999; Dillman, et al. 2002), and sending multiple contacts with a consistent, governmental design (Dillman, et al. 1996; Dillman 2000) can all help build trust across mailings. But this all begins in the first mailing, which focuses on trust-building messages.

Clearly connect the ACS to the Census Bureau as the survey sponsor

Multiple theories suggest that connecting a survey to a known and trusted sponsor can increase response rates (Groves, et al. 2006; Groves, et al. 2012; Dillman, Smyth, and Christian 2014).

Government survey requests are trusted more than non-government survey requests (Presser, Blair, and Triplett 1992; Dillman 2000; Brick and Williams 2013; Schwede 2013). From this, some might suggest using messaging to connect the ACS to both the Census Bureau and the federal government. However, public trust in the federal government is at historic lows (see Pew 2019), and people are familiar with and trust the Census Bureau at a higher rate than they trust the federal government (Hagedorn, Green, and Rosenblatt 2014). Sending too many messages can confuse a potential respondent (McCormack 2014) and can also raise mailing costs significantly. So, it would make sense to assume that communicating too many sponsors would be confusing as well. The Census Bureau should connect itself to the ACS as the one and only

sponsor of the survey and limit direct references to the federal government or other agencies.

This may help limit the total number of sponsorship messages to focus attention on the Census

Bureau sponsorship.

Simply state confidentiality and data security

In qualitative studies, confidentiality and data security messages consistently ranked among the least impactful messages at convincing a potential respondent to comply with a census or survey request (Macro 2009; Hagedorn, Green, and Rosenblatt 2014). This was especially true with cynical and distrustful respondents (Orrison and Ney 2014). While recent research showed that confidentiality statements were received more positively than in the past (Fulton, Morales, and Childs 2016), overstating confidentiality statements can still raise more concerns than simple statements (Singer, Hippler, and Schwartz 1992; Fobia, Holzberg, and Childs 2017;

Dillman, Smyth, and Christian 2014).

Confidentiality and data security statements are required in all federal surveys before respondents have the opportunity to respond. Prior to receiving the paper questionnaire in the third mailing, respondents can only respond online or by telephone. If the statement is placed online with the survey instrument and communicated verbally over the phone to respondents who call to complete the survey with a Census Bureau enumerator, the required confidentiality statement does not need to appear prior to the third mailing with the paper questionnaire. A simple, plain-language confidentiality and privacy statement should be included in the first mailing to assure respondents that privacy is protected. However, the detailed statements that may raise concerns should be communicated in a later mailing in a way that does not highlight or call attention to the statement. For example, placing required language on

the back of a letter or survey form rather than the front still communicates the necessary information to potential respondents who need it and also meets the legal obligation.33

Personalize the survey request

Personalizing a survey request can increase the chance that a household complies (National

Academies of Sciences, Engineering, and Medicine 2016; Dillman 2016). To personalize a survey request, mail materials such as letters should be sent from a real person within an organization,

(e.g., Lawrence Smith, director of the Census Bureau), and not from the survey organization (e.g.,

the Census Bureau). Letters should include a formal salutation and the sender's signature. Messages in all ACS materials should be written in a single, consistent voice, as if the person who signed the letter wrote the letter. People are also more likely to comply with a request from someone they like or feel a sense of unity with (Cialdini 2009). To build a sense of unity, survey requests should be phrased in a way that communicates that the Census Bureau is asking for help and that responding households are on the same team as the Census Bureau, together solving a problem or achieving a goal. All messaging should follow basic rules of etiquette and avoid the use of subordinating language. Potential respondents should also be able to verify the authenticity of the person and organization making the survey request.

Leveraging procedural justice insights can make a request feel more personal. People feel more empowered, more in control, and a greater sense of ownership over their actions when they are given a choice in how to respond (Feygina, Foster, and Hopkins 2015). One interpretation of insights

33 Rules regarding privacy and security statements are controlled by OMB and could change. If the statements are required within the initial contact, the advice to minimize the statements by placing them on the back of a letter would be best practice.

from procedural justice might be to emphasize the choice of response modes, which may make people feel more empowered and give them more ownership over their participation decision (National

Academies of Sciences, Engineering, and Medicine 2016). However, research has consistently noted that providing a choice of response modes may decrease response and that sequentially offering response modes may be a better strategy (see Millar and Dillman 2011; Longsine and

Risley 2019). Leveraging this conflicting advice, I recommend that the ACS continue to adopt a web-push methodology that sequentially orders available response modes. When multiple modes of response are offered, messaging should present the mode choice in a way that empowers potential respondents. Raising the perception of self-efficacy can raise a potential respondent's confidence that they can complete the task (World Bank 2017). Also, calling attention to desired outcomes and removing ambiguity can help people make agreeable decisions.

Reducing the number of extraneous details or decisions a person must consider allows them to focus directly on a single desired task. Asking someone to make a commitment34 to act helps a potential actor link intended behaviors with a concrete future moment and course of action.

This may help reduce forgetfulness and procrastination and is a direct way to set a positive expectation that can influence agreement towards a request (Artefact 2019).

Providing an opportunity for respondents to provide feedback is a way to personalize a request, build unity, and communicate respect for a respondent’s time to complete the survey

(National Academies of Sciences, Engineering, and Medicine 2016, Dillman, Smyth and Christian

2014). This device may also trigger feelings of reciprocity. Triggering feelings of reciprocity can

34 Recipients of a flu vaccine mailer were asked to write down the date, time, and location at which they planned to be vaccinated. This message increased vaccination rates by 4.2 percentage points (Milkman, et al. 2011).

be an effective motivator to act (Gouldner 1960; Cialdini 2009). After providing everything the

Census Bureau wants by answering all of the questions on the ACS, the respondent should be given the opportunity to provide feedback and to say what they want about the survey task or process (Dillman, Smyth, and Christian 2014).

People are also more likely to act if they are responding to real, emotional stories that highlight a specific person's experience. ACS messaging should highlight real, tangible stories of how the survey impacts people and communities. Communicating generalized facts about the uses of ACS data, or facts about the ACS survey, is impersonal and may not sway behavior

(Dillard and Shen 2012; Kreuter, et al. 2010). Including a specific example or a personal story may also help trigger a feeling of admiration for the Census Bureau and the purpose of the survey task. People are more likely to comply with a request from people they admire (Cialdini

2009).

Stage messaging across mailings

Survey organizations have many options at their disposal for convincing respondents to participate in a survey request. However, the communication and marketing literature suggests that a single communication should limit the number of messages it contains so as not to overwhelm a reader. Focusing on fewer messages can increase retention of those messages and may increase compliance with the requested action (McCormack 2014; Poldre 2017). Because the first mailing should focus on trust-building messages, it cannot simultaneously contain every other message that may be useful to induce a response. A first mailing focused on trust, combined with a quick follow-up mailing that shares a similar design to the original mailing, may help a skeptical respondent trust

that the survey request is real. The presence of the first mailing makes the second mailing more believable. The second mailing may then be more effective at communicating other messages, for example, the benefits of survey participation. With trust firmly established, benefit statements should be the focus of the second mailing.

Frame ACS participation as a community benefit

Messaging should communicate the positive social benefits of survey participation, particularly benefits to one's community. Given the heterogeneity of communities in the United States, this may be difficult to accomplish. Some individuals who see an inequitable distribution of resources between neighborhoods may not be moved by community-level benefit messages that promise benefits to their community. For example, many small communities will never have a hospital or a school, though this is the wording often used in ACS letters.

Despite this challenge, community-level benefits resonate with potential respondents, including those who are cynical and distrustful (Macro 2009; Hagedorn and Green 2014; Evans, et al.

2019). In comparison, framing the ACS as a national-level benefit does not seem to work as well to convince a respondent to participate. Benefits aimed at helping local businesses or nonprofit organizations that serve local communities may also be more positively received than benefits framed in terms of large corporations or national-level organizations or agencies (Orrison and

Ney 2014). Communicating a real example of a benefit to a community similar to theirs may help sway some participants to respond (Dillard and Shen 2012; Artefact 2019). Communicating a mixture of broad community benefits and more tangible benefits in ACS mail materials may reach a variety of audiences. The second mailing should focus primarily on communicating the tangible benefits of responding to the ACS.

Leverage audience-based insights across mailings

The total survey population consists of people who trust the government, people who distrust the government, and those who are generally unaware of the role of government (Macro 2009;

Conrey, ZuWallack, and Locke 2012). As households respond to the ACS, responding households are removed from the sample population (see Chapter 2). As this occurs, the characteristics of the remaining potential respondent population change (Berkley 2018). Messaging in subsequent mailings will be more effective if those messages target the characteristics of those still left in the sample. Households that do not reply after the first two mailings tend to be older, less educated, and more rural (Berkley 2018).

All materials sent to the general population should use plain language understood by someone with a basic reading level (Plain Language Guidelines 2010). ACS research found that the use of plain language in prenotice mailings increased response rates (Griffin, et al. 2004; Raglin, et al. 2004). All materials should be written in language appropriate for the audience. Using

plain language is important in all mailings but may be increasingly important in follow-up mailings, as the characteristics of the potential respondent pool change when responding households are removed. While people can respond by internet throughout the self-response period, communicating the option to respond by paper targets a different type of respondent, and messaging should reflect that fundamental change.35 All survey requests should be written with knowledge of the potential audience, consider what each audience needs to know, and limit the amount of extraneous detail. I have argued that the first mailing should focus on

35 Self-response can continue throughout the non-response follow up data collection period as well. These late respondents that may be influenced by NRFU operations are not analyzed in this dissertation.

establishing trust, and that the second mailing should focus on communicating the benefits of survey response. The third mailing universe is different. It is composed of respondents who have not yet responded. Many are capable, some are reluctant, and others may feel a sense of burden in responding. The third mailing should focus directly on reducing the perception of response burden.

Multiple survey recruitment contacts should also feel like a continuous conversation between the survey organizer and the potential respondent. Potential respondents weigh the perceived benefits of survey participation against the perceived costs of participating. Each message in a survey request is an opportunity to convince a potential respondent to complete a survey request. Rather than repeating messages verbatim, multiple contacts should highlight and communicate different reasons to participate in the survey. Critical details, such as salutations and calls-to-action, will be repeated in each mailing, but other appeals should vary.

It is possible that some respondents, for whatever reason, will not receive or read the initial contact. The first mailing received and read may be the 3rd, 4th, or 5th mailing. For this reason, critical details should be repeated. However, repeating other messages only benefits the households that have not read previous mailings. While the act of receiving a follow-up mailing is an important factor for increasing response (see Dillman, et al. 1996), for households that have read previous mailings and were unconvinced to respond, the messaging in a follow-up reminder may have only limited impact if it repeats what has already been said. New messaging, communicating new appeals to respond, may be more effective, especially if that messaging reflects the changes in the potential respondent pool as responding households are removed.

Writing materials in a single tone that holds an implied two-way conversation between a real person from the Census Bureau and the potential respondent over the five mailings, with messaging that is not repetitive and that anticipates questions a potential respondent may have, may help boost response rates from noncompliant households. To increase the chance that mail pieces are opened and read, the outer package of each mailing should be unique. Mailings should vary in the size and format of PSMs, postcards, envelopes, and enclosures. While each mailing and mail piece should be unique to grab a potential respondent's attention, all contact materials should share a simple, official, and consistent design to build a connection across mail contacts and to meet expectations from the public on how a piece of mail from the government should look.

Leverage behavior insights to communicate personal benefits

While it is critically important that survey materials frame the ACS response in terms of real community benefits, survey communication should also highlight internal benefits to participation in surveys. A personal benefit is achieved when completing a desired task with a desired outcome. For example, some potential respondents can feel a sense of pride when fulfilling a civic duty, some get a similar feeling from helping others in need, while others may also feel enjoyment when completing an interesting or important task. Framing the ACS response as part of a civic duty, as something that will help others in need, or as something interesting may compel some households to respond out of a desire for personal gratification

(World Bank 2017).

Another personal benefit that can be communicated is to frame ACS participation as a normal activity. This can be done in multiple ways. First, people feel more comfortable when they conform to the actions of those around them. Highlighting that millions of households

have completed the ACS may communicate that this task is normal, and that to conform to this norm, all households should reply. Highlighting the normalness of the ACS request may also increase a feeling of self-efficacy and raise a potential respondent's confidence that they can easily comply with the survey request, as millions of others before them have.

Messaging should reinforce that responding to a survey request is a normal activity similar to actions respondents already take. If ACS response can be framed as a routine civic duty, for example, paying taxes or getting a driver's license, people may feel more comfortable and less burdened by the response task, because they will see survey participation as an action consistent with their own previous behavior. People generally behave in ways that reinforce their personal identities and behave in accordance with real or perceived social norms. Crafting a survey request as a behavior that is consistent with a potential respondent's previous actions, or in accordance with the actions of those in their community and social groups, may be effective for gaining survey compliance.

The recommendation to communicate that the ACS is a normal activity, and to communicate that millions of households have already responded to highlight the normalness of this task, comes with a risk. Literature also suggests that survey requests should be framed as a unique, scarce opportunity. For example, highlighting that a household's response to the ACS speaks for their community may make a household feel special and important, while communicating that millions of households have responded may communicate that the task is not important. I argue that communicating both messages may be possible if they are worded correctly and staged appropriately across mailings. It is a fine line, but ACS materials need to communicate both messages: that while millions of households have replied to the

ACS, it is also critically important that each household replies, and that response is a special opportunity to participate in civic engagement, help those in need, and benefit their community. Sequencing these messages across mailings may help achieve this goal.

Reduce burden by presenting clear, non-complicated instructions and deadlines

People are more likely to complete an action if they can clearly understand the steps the action requires. Responding to the ACS should not be a mystery, confusing, or complicated. Survey request materials should also establish positive expectations of survey response to influence the way people experience the survey request in a positive way. Messaging in mail communications should contain a clearly worded survey request, present easy-to-follow cues to action, simply communicate the desired outcome, and minimize ambiguity and extraneous details. Materials should directly ask a potential respondent to make a commitment to complete the ACS. People are more likely to complete a task if they make a commitment to the task and can see all of the steps toward completion of the task (Milkman, et al. 2011; World

Bank 2017). Including a commitment device, such as a commitment checkbox or a calendar that notes a due date for the survey, may help potential respondents commit to the task by visualizing their schedule and picking a date by which to respond. Consistent with previous decennial census testing (Martin 2009; Stokes, et al. 2011), adding a due date to the ACS request can reduce the ambiguity of the survey task. This is supported by qualitative evidence that due dates would be a motivating factor for some people to open a survey request and to complete the survey on time (Macro 2009; Kephart, et al. forthcoming). To reduce burden and make the survey request clear, a due date should be added to the ACS mail request.

Highlight that participation is mandatory in a respectful tone

Messaging that communicated that responding to the ACS was mandatory was among the least favored messages in multiple qualitative studies (Hagedorn, Green, and Rosenblatt

2014; Hagedorn and Green 2014). However, in practice the Census Bureau has found that the mandatory message significantly increases self-response rates (see Dillman 2000; Phipps 2014;

Oliver, et al. 2017; Barth, et al. 2016; Oliver, Risley, and Roberts 2016). Respondents might not admit in a qualitative interview that mandatory messaging is a favorable message, but highlighting the mandatory nature of the ACS in survey communications has been effective at motivating sampled households to respond. Striking a balance between communicating that the

ACS is required by law without overstating the point may be important. Also, including this message on the envelope may help overcome the first barrier to response: opening the mailing (Dommeyer, Elganaya, and Umans 1991; Dillman, et al. 1996; Dillman 2000; Edwards, et al. 2009; Ridolfo, et al. 2019). While communicating that responding to the ACS is required by law, mailings from the Census Bureau should follow rules of etiquette and not make demands of the respondent (Comley 2006). The communication materials should state clearly that responding to the ACS is required by law and communicate that the Census Bureau is seeking the help of a household with an important task.

Keep it simple: Plain language and a simple but eye-catching design

When reading an overwhelming amount of information, people may shut down and stop paying attention (Gross 1964). Minimizing the number of decisions, messages, words, and mailings may reduce decision fatigue and content overload. Follow the rules of plain language and write materials in an active voice using short paragraphs, short words, simple phrasing, and familiar

language that is accessible and easy to read for people at low literacy levels. Be mindful of the demographic characteristics of the audience and consider how these characteristics could impact how messages are received. Envelopes are extremely important messaging devices that can overcome the first barrier to response: opening the mailing from the Census Bureau.

This process begins immediately on the outside of the first mailing or mail envelope.

In particular, a critical moment identified by Dillman (2016) was that a letter or mailing must first be noticed, remembered, and opened before any message inside it can be read or before a survey can be taken. In ACS cognitive testing, between 15 and 21 percent of research participants claimed to have received the ACS mailing but did not open it (Nichols 2012). The design and messaging on the exterior of the mail packages (envelopes, PSMs, or postcards) that contain and deliver mail materials are critically important because they are the only factors that impact a potential respondent's decision to notice and open a mailing to read the contents.

Simple envelopes that communicate that the mail is official, comes from the government, and contains something required by law can overcome the first breakdown in gaining a response. Any design elements on the envelope should also be used on the mail materials inside to build a connection between mail pieces. Once opened, people can then engage in a pre-cognitive process to quickly take in the visual contents of the enclosed letter (Ware 2004). The visual layout of a letter can have an impact on people's decision to read the letter or to ignore it. Once the decision to read the letter has been made, then decisions on wording and phrasing of the letter, and how trust and benefits are communicated, can impact a decision to begin the survey.

Conclusion

Developing messaging recommendations can draw effectively from many sources: typologies of respondents developed through Census Bureau research, general theories of human behavior, the fields of marketing and communication, and theories about why people do and do not respond to surveys. Each of the topics covered in this chapter suggests possibilities for messaging recommendations. In addition, specific Census Bureau research on messaging proposals that people have responded to either positively or negatively suggests ways to draw from existing communication and survey response theories.

Based on these considerations, I have articulated tentative messaging recommendations about what must be accomplished by the messaging in the ACS mail communication materials.

These tentative recommendations are used to guide the design of new materials and also provide the basis for undertaking a formal content analysis of current ACS communications in the next chapter of this dissertation. Before new materials are created, I will first examine the content of current communications to determine whether, based upon theory and research, the current ACS materials need to be revised. If the materials follow the best-practice recommendations developed in this dissertation, they may not need a full revision. In Chapter

5, I will use the literature summarized in this chapter to quantitatively analyze current messaging in relation to the previously developed theory and research just reported.

Chapter V. Analysis of ACS mail communication materials

The Census Bureau sends 13 different mail pieces (letters, postcards, brochures, envelopes, etc.) to households spread over 5 separate mailings (for details, see Chapter 2). This provides a limited amount of physical space to present messages to convince households to respond. It is important to know whether these mail pieces provide important, mutually supportive information or whether they are filled with redundant and generally less useful information. It seems important that every word, graphic, and message sent in these mailings is specifically written to communicate messages that increase response propensity.

In an effort to understand exactly what is communicated, when it is provided, where it is provided, how many times it is provided, whether it is consistent or inconsistent, and thus how it is likely to affect recipients, a formal content analysis was undertaken. This review will show whether the ACS mail communication materials follow best practices for survey messaging.36

Methodology

Content analysis is a research methodology that is used to conduct systematic analyses of text to track the presence of words, themes, or concepts and to uncover meaning and patterns across the content of a series of documents (Krippendorf 2018). Based on the patterns uncovered, researchers can make inferences about the meaning, purpose, or quality of messages within the texts, among other analytic possibilities. In this dissertation, content

36 The content of this chapter is based upon an invited presentation I made at the National Academies of Sciences, Engineering, and Medicine workshop on Improving the American Community Survey (Schreiner 2018b), results of which appear in a National Research Council report (National Academies of Sciences, Engineering, and Medicine 2019). Previous versions of this chapter were also presented at the American Association for Public Opinion Research annual meeting (Schreiner 2019) and the Southern Demographic Association annual meeting (Schreiner 2018c). A version of this analysis will also appear in a forthcoming Census Bureau report (Schreiner, Oliver, and Poehler forthcoming).

analysis was used to identify the presence and placement of different messages across ACS mail communication materials in order to assess the quantity and quality of communicated messages and to identify messages that were not communicated.

To analyze the content of the ACS mail communication materials, I led a team of researchers from the Census Bureau to categorize (or “code”) the messages in the ACS mail communications to uncover underlying concepts or themes (Auerbach and Silverstein 2003;

Gibbs 2007). Coding the messages in the ACS mail communications makes it possible to uncover patterns in the messages communicated across mail contact materials that otherwise would be difficult to discern. Because no similar project analyzing the messaging in survey mail contact communications has been conducted, the first step was to create a list of codes to categorize the content of the ACS mailing pieces. The process of a priori coding – a coding strategy that uses a pre-existing theoretical framework to develop and operationalize codes – was used. The theoretical framework that served as the basis for coding comes from the literature reviewed and recommendations developed in the previous chapter of this dissertation (see also Oliver, Heimel, and Schreiner 2017). Developing codes is a complex, iterative process involving multiple steps. First, an initial list of codes, called a codebook, is developed from the original theoretical framework; second, an initial piece of content is coded independently by multiple coders to test the exhaustiveness of the codebook; third, the list of codes is assessed and edited as more refined ways to divide and subdivide content into mutually exclusive and meaningful content categories emerge from the data (Saldaña 2015).

The goal of the initial iterative coding process is to develop definitions for operational codes that can be used by multiple researchers to consistently code the text of

documents. This list of operational codes is referred to as a codebook that, once created, is unchanging throughout the content analysis process. If the codebook changes in the middle of a content analysis, all previously coded materials need to be recoded according to the changes to the codebook. This way, all coded materials will have had the same metric applied, resulting in consistent coding results across text documents (Creswell 2013).

With a codebook in place, multiple individuals may perform the coding of the same source content. Some researchers argue that using multiple coders, especially in grounded coding, can be beneficial, as multiple minds can provide insight into better ways of analyzing and interpreting the data. Moreover, the questions and discussions that arise among multiple coders often generate new, richer, and more refined codes (Olsen, et al. 1994). Once a codebook is made, using multiple coders allows for verification that the codes are working as intended by having some text content coded by two different people. If multiple coders can code the same text in the same way, this can be used as evidence of the effectiveness of the codebook design (Scott 1955; Creswell 2013; Lovejoy, et al. 2016). When multiple coders are used, an inter-rater reliability (IRR) score is calculated to mitigate biases in interpretation between coders (see

Walther, et al. 2013).

Developing the ACS codebook and coding ACS materials

In this content analysis, a team of three Census Bureau researchers coded the messaging content of the ACS mail materials.37 Messages can be communicated in a variety of ways, including words, numbers, symbols, and graphics (see Dillman and Redline 2004; Christian and

37 I would like to thank and acknowledge the contribution of my coding team that included myself, Elizabeth Poehler, and Broderick Oliver of the Census Bureau.

Dillman 2004). Figure 37 shows how different parts of the first ACS letter contain messages that were coded:

Figure 37. Example of ACS messaging content

All of the content on the ACS letter can potentially communicate a message to a potential ACS respondent, including: 1) The form ID in the top left of the page, 2) the

Department of Commerce logo, 3) header information, 4) the salutation line of the letter, 5) the body text of the letter, and 6) the web address set in bold, centered text. The content of this letter can be interpreted in different ways by different people. For example, consider the form ID "ACS-13(L)(2017)(6-2017)" in bold font at the top left of the page. Some readers may glance over this content without considering what it means, others may be confused by content that doesn't make sense to them, and others may feel that it is standard and expected for a formal piece of mail from the government to contain such form identification codes or identifiers. The point of this content analysis isn't to code the myriad ways that each message can be interpreted by a potential respondent. The strategy for coding is to code the intended message that each piece of content in the ACS mail materials is sending from the Census Bureau to sampled households.

To code the intended messages sent in each ACS mailing item, a two-person Census Bureau research team compiled a list of potential messages that could be sent in the ACS mailings. This list came from two places: first, messages tested by the Census Bureau or mentioned in the literature reviewed in the previous chapter, and second, messages currently contained in the ACS mailings. The list of potential messages is exhaustive for the purpose of this research. The literature reviewed in Chapter 4 identifies messages that are potentially useful for survey communications, meaning that each is potentially communicated within the ACS mail communications. However, the ACS mail communications also communicate messages that are not recommended by the literature reviewed in Chapter 4. Therefore, the list of potential codes had to include codes for these messages as well so that all messages would receive an appropriate code.

In total, a list of 76 codes was compiled. Qualitative research studies can generate 80 to

100 codes organized into 5 to 7 main categories and sometimes up to 20 sub-categories

(Lichtman 2012). The coding team next organized the 76 messaging codes into four main categories. The first three categories were adapted from social exchange theory, which claims that survey communications attempt to do three main things. First, messages could try to establish or build trust with a respondent. Second, messages could communicate the benefits of survey participation. Third, messages could attempt to reduce the cost of participating or the perception of the burden of participation. A fourth category for other messaging was also added to

code messages that did not cleanly fit into the three other categories, such as instructional and informative messaging. This process produced the codebook shown in Table 8.

Table 8. Content analysis codebook

Code Number  Code Description
1.0  ESTABLISHING OR BUILDING TRUST
1.1  Establish credibility
1.1.1  Connection to a sponsor (known and trusted)
1.1.2  Leverage authority (e.g., messages that highlight Census Bureau expertise)
1.1.3  Sender is a real person (not an organization)
1.1.4  Provide a way to verify authenticity (e.g., a website, telephone number, address)
1.1.5  Use of a signature
1.1.6  Use of a real stamp
1.1.7  Audience-based, single conversation across mailings
1.1.8  The Census Bureau is an apolitical research agency
1.1.9  History (e.g., the census has been conducted since 1790, ACS-type surveys since 1850)
1.2  Confidentiality/data security
1.2.1  By law, the Census Bureau must protect your data
1.2.2  Census employees face fines and imprisonment if they violate your confidentiality
1.2.3  Federal Cyber Security Act
1.2.4  The Census Bureau cannot share your data
1.2.5  Secure website, encrypted browser, screening system transmit data
1.2.6  Oath of disclosure statement
1.2.7  Will not release data in a way that identifies you (i.e., data are aggregated)
1.3  Token pre-incentives
2.0  COMMUNICATING BENEFITS
2.1  Community level benefits
2.1.1  Specific mention that the survey benefits others in need (e.g., allocation of services)
2.1.2  ACS data used by non-profits and non-government agencies to provide aid
2.1.3  Not filling out ACS may hinder your community's ability to gain resources
2.1.4  Distribute $600 billion in federal funds to communities
2.1.5  Used for planning and developing roads, hospitals, schools etc. in communities
2.1.6  Emergency preparation
2.1.7  Provides communities data on education, housing, employment etc.
2.1.8  Data driven or well-informed decisions
2.2  National level benefits
2.2.1  Provides country data on education, housing, employment, etc.
2.2.2  Allocate $675 billion in federal dollars (no mention of communities)
2.3  Personal/interpersonal benefits
2.3.1  Scarcity (rare opportunity)
2.3.2  Ask the respondent for help, establish unity
2.3.3  Establish positive expectations
2.3.4  Likability
2.3.5  Responding to the ACS is a way to make your voice heard
2.3.6  Reciprocity – thanking respondents, building good will, etc.
2.3.7  Highlight the survey's importance to build intrigue
2.3.8  Highlight the survey topic or questions as interesting or entertaining
2.3.9  Patriotism
2.4  Incentive (monetary or non-monetary pay for participation)
2.5  Business use of ACS data
2.5.1  Economic indicators used by business industry
2.5.2  Specific mention of small business
3.0  REDUCING COST OF PARTICIPATION OR PERCEPTIONS OF BURDEN
3.1  Social norms
3.1.1  Consistency (e.g., completing ACS is similar to actions the respondent already does)
3.1.2  Conformity (e.g., others have responded to the ACS)
3.2  Civic responsibility or duty
3.3  Ask for a commitment
3.4  Offer multiple response modes
3.4.1  Another response mode is available later (a push for the first response mode offered)
3.5  Responding online is quick/easy
3.6  Mandatory message – legally obligated to respond to ACS (Title 18/Title 13)
3.7  Provide a deadline date or similar indication (e.g., return as soon as possible)
3.8  Nonresponse follow-up messaging (e.g., If you don't respond, the Census Bureau will contact you by telephone or in person)
3.9  Provide a way for respondents to get help
3.10  Communicating in a language other than English
3.11  The Census Bureau will pay for your return postage
4.0  OTHER (Messages that may influence an individual's decision to participate in a survey, but are not classified as trust, a benefit, or burden reduction)
4.1  Scientific
4.1.1  Sampling (i.e., mention of a random sample)
4.1.2  Data accuracy
4.1.3  ACS data can track change over time
4.1.4  ACS data are made available to the public
4.1.5  ACS data are better than alternative sources of data
4.1.6  ACS is a unique source of data
4.1.7  Continuous data collection
4.2  Cost or environmental savings or efficiency
4.2.1  Benefits of ACS participation outweigh the cost of participation
4.438  Instructions or information
4.4.1  Call to action – request to complete the survey
4.4.2  Informational instructions (e.g., see enclosures)
4.4.3  Estimated time to complete survey
4.4.4  Information
4.5  Required information not intended to send messages to ACS audience (e.g., form number)
4.6  Cultural expectation for a business letter (e.g., salutations, letterhead, thank you)

Using this codebook, two Census Bureau researchers independently coded all content in the ACS mail communication materials. Beginning in the upper left-hand corner of the first letter, the first piece of content to be coded was the form ID, which was coded by both researchers as code 4.5 (required information). Moving to the right across the page, the next content coded was the header communicating sponsorship information (code 1.1.1), followed by the salutation, and then the first sentence in the body of the letter. As shown in Figure 38, sentences can contain multiple codes. The first sentence in the first ACS letter states, "Your household has been randomly selected," which was given code 4.1.1 (part of a random sample), while the second half of the sentence, which states, "complete a very important national survey, the American Community Survey," was given code 2.3.7 (highlight importance of the survey).

38 An early version of the codebook categorized “Business uses” as code 4.3. However, this code was later moved to code 2.5 under “Benefits.”

Figure 38. Example of coded messages
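To make the output of this coding step concrete, the sketch below shows one way the coded content could be represented. It is illustrative only, assuming Python, a handful of codebook entries from Table 8, and hypothetical record fields; it is not part of the procedure the coding team actually used.

```python
# A minimal, hypothetical representation of coded ACS letter content.
# Only a few codebook entries from Table 8 are included for illustration.
CODEBOOK = {
    "1.1.1": "Connection to a sponsor (known and trusted)",
    "2.3.7": "Highlight the survey's importance to build intrigue",
    "4.1.1": "Sampling (mention of a random sample)",
    "4.5": "Required information not intended to send messages",
}

# Hypothetical coded records for the first pieces of content on the letter.
coded_content = [
    {"mail_piece": "Mailing 1 letter", "text": "form ID", "code": "4.5"},
    {"mail_piece": "Mailing 1 letter", "text": "header (sponsor information)", "code": "1.1.1"},
    {"mail_piece": "Mailing 1 letter",
     "text": "Your household has been randomly selected", "code": "4.1.1"},
    {"mail_piece": "Mailing 1 letter",
     "text": "complete a very important national survey, the American Community Survey",
     "code": "2.3.7"},
]

# Print each coded segment alongside its codebook description.
for item in coded_content:
    print(f'{item["code"]:6}  {CODEBOOK[item["code"]]:55}  <- {item["text"]}')
```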

This process continued down the page until all content in all ACS mail communications received a code. After each coder independently coded the ACS materials, an inter-rater reliability (IRR) score was calculated to verify that the coding procedures produced consistent codes between the coders, using the following formula (Miles, Huberman, and Saldaña

2019):

reliability = number of agreements / (number of agreements + disagreements)

An IRR estimate of 0.80 indicates that 80 percent of the variance in the observed scores is due to similarity in ratings between coders, and 20 percent is due to differences in the ratings (Hallgren 2012). Achieving 100 percent agreement between independent coders is not the goal of using the IRR formula. What is desired is an IRR score above 80 percent on most mail pieces. If this is achieved, it validates that the coding strategy worked and was mutually understood and deployed by independent coders. This level of agreement also allows a process to adjudicate differences to produce a consensus coding (Miles, Huberman, and Saldaña 2019).
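As a minimal illustration of this agreement calculation, the following sketch applies the formula above to two hypothetical coders' code assignments; the function and the example codes are assumptions for illustration, not the computation actually used by the Census Bureau team.

```python
def percent_agreement(coder_a, coder_b):
    """Share of coding decisions on which two coders agree:
    agreements / (agreements + disagreements)."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must code the same set of content units")
    agreements = sum(a == b for a, b in zip(coder_a, coder_b))
    return agreements / len(coder_a)

# Hypothetical codes assigned by two coders to the same six pieces of content
coder_a = ["4.5", "1.1.1", "4.6", "4.1.1", "2.3.7", "3.6"]
coder_b = ["4.5", "1.1.1", "4.6", "4.1.1", "2.1.5", "3.6"]

print(f"IRR = {percent_agreement(coder_a, coder_b):.1%}")  # prints: IRR = 83.3%
```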

IRR results are presented in Table 9.39

Table 9. IRR results from coding ACS mail communication materials

Mailing | Mail Piece | Notes | IRR Score
1 | Letter | Enclosed in a package | 87.2%
2 | Letter | Pressure seal reminder mailer | 85.3%
3 | Letter | Enclosed in a package | 77.8%
5 | Letter | Pressure seal reminder mailer | 77.1%
4 | Postcard | Reminder postcard | 92.3%
1 | Instruction Card (internet) | Enclosed in a package | 75.0%
3 | Instruction Card (choice) | Enclosed in a package | 71.4%
1 | Multilingual Brochure | Enclosed in a package | 86.7%
1 and 3 | FAQ Brochure | Enclosed in a package | 86.4%
3 | Paper Questionnaire | Enclosed in a package | 87.0%
1 | Outgoing Envelope | Package mail | 91.7%
3 | Outgoing Envelope | Package mail | 91.7%
3 | Return Envelope | For return of paper questionnaire | 90.0%

Of the 13 mail pieces coded, nine had an IRR score of 80 percent or higher. Two letters

(from the third and fifth mailings) had agreement rates of 77.8 and 77.1 percent, respectively.

The two instruction cards had the lowest IRR scores of 75.0 and 71.4 percent, respectively. These mail materials had very little content to code, which can make achieving high IRR rates difficult, as a few disagreements between coders have a larger impact on the IRR than they would in mail pieces with more content.

Based on the results of the IRR coding, an adjudication process was conducted with input from a third independent coder. As an extra step in validating the coding results, the third coder independently coded selected mail pieces. All codes were then discussed among

39 To further validate the results of the IRR test, I did not compute the IRR calculations myself. IRR calculations were produced by Broderick Oliver and verified by Elizabeth Poehler of the United States Census Bureau.

the three coders until a consensus coding was produced. The adjudication process revealed few disagreements in which coders chose the same text and applied completely different codes. In most cases of disagreement, one coder identified multiple messages to code within a sentence where the other coder identified only one. With the help of the adjudicator, these disagreements were resolved until the coding for all messages was agreed upon by all three coders.

Additional Analysis

In addition to the coding content analysis, two additional methods were used to analyze the content of the ACS mail communication materials. First, to assess the readability of the ACS mail communication materials, Census Bureau researchers conducted two Flesch–Kincaid readability tests: the Flesch Reading Ease test (FRE) and the Flesch–Kincaid Grade Level test

(FKG) (Flesch 1949; Kincaid, et al. 1975). The FRE provides a score that indicates the level of difficulty of a body of text (from 0 to 100, with higher scores easier to read). The FKG provides an assessment represented in terms of the U.S. grade level required to easily read and understand the text (from 5th grade to college graduate). Both tests use word length and sentence length – i.e., wordier sentences and lengthier words being more difficult – as metrics to produce readability scores but use different weighting factors (Landry 2017; Onwuegbuzie, et al. 2013). The formula for each score is:

• FRE = 206.835 – (1.015 * ASL) – (84.6 * ASW)

• FKG = (0.39 * ASL) + (11.8 * ASW) – 15.59

Where: • ASL = average sentence length (i.e., the number of words divided by the number of sentences)

• ASW = average number of syllables per word (i.e., the number of syllables divided by the number of words)

Table 10. Flesch reading ease score and Flesch-Kincaid grade level

FRE Score | U.S. Grade Level | Interpretation
90 – 100 | 5th grade | Very easy to read––easily understood by an average 11-year-old
80 – 90 | 6th grade | Easy to read––conversational English for consumers
70 – 80 | 7th grade | Fairly easy to read
60 – 70 | 8th - 9th grade | Easily understood by 13- to 15-year-old students
50 – 60 | 10th - 12th grade | Fairly difficult to read
30 – 50 | College | Difficult to read
0 – 30 | College graduate | Very difficult to read––best understood by university graduates
Source: Flesch 1949; Kincaid, et al. 1975.

Interpretations of FRE scores and grade level equivalents are shown in Table 10. All materials that had enough text to conduct the analysis were scored based on both the FKG and

FRE readability analyses.40 All of the body text of letters and brochures was included. Some content, such as form IDs, header information, and brochure cover text, was excluded from the analysis because the tests require large chunks of body text, and these bits of content did not fit the analysis. Other factors that affect readability include how information is organized, graphic design, and the relevance of the text to the reader. However, the Flesch-Kincaid readability tests do not measure these aspects of readability; the tests only assess the difficulty of text based on the length of words and sentences (Flesch 1949; Kincaid, et al. 1975).
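To illustrate how these scores are produced, the sketch below computes approximate FRE and FKG values from raw text. It uses a rough vowel-group heuristic for counting syllables rather than the dictionary-based counts used in formal readability software, so the results are approximations only, and the example sentence is hypothetical.

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels (approximation only).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    """Return approximate (FRE, FKG) scores for a block of text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    asl = len(words) / len(sentences)                           # average sentence length
    asw = sum(count_syllables(w) for w in words) / len(words)   # average syllables per word
    fre = 206.835 - (1.015 * asl) - (84.6 * asw)
    fkg = (0.39 * asl) + (11.8 * asw) - 15.59
    return fre, fkg

fre, fkg = readability("Your household has been randomly selected to complete "
                       "a very important national survey.")
print(f"FRE = {fre:.1f}, FKG = {fkg:.1f}")
```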

Based on the logic of the Flesch–Kincaid readability tests, a second analysis method, word count analysis, was selectively used as an informal, quick way to note the "volume" or

“space” a message takes up on a written page. For example, two messages coded as

40 I would like to acknowledge the contribution of Broderick Oliver, who calculated and verified the readability scores.

communicating survey sponsorship would both be coded the same. However, one could be a simple statement using 5 words, while the second could be a long sentence containing 25 words. Word count analysis can help assess the volume or space the messages take up in the ACS mail communication materials. This is useful because printed communications have a limited amount of space to communicate messages, so it is important to note whether certain messaging takes up an unreasonable amount of space compared to the impact the message can have on increasing response to the survey request.
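A minimal sketch of this kind of word count tally is shown below, assuming consensus-coded segments are available as (code, text) pairs; the segments themselves are hypothetical examples rather than actual ACS content.

```python
from collections import defaultdict

# Hypothetical consensus-coded segments from one mail piece: (code, text) pairs
coded_segments = [
    ("1.1.1", "U.S. Department of Commerce, U.S. Census Bureau"),
    ("4.1.1", "Your household has been randomly selected"),
    ("2.3.7", "to complete a very important national survey, the American Community Survey"),
]

# Sum the number of words each message code occupies on the page.
words_per_code = defaultdict(int)
for code, segment in coded_segments:
    words_per_code[code] += len(segment.split())

# Report the "volume" each message takes up, expressed in words.
for code, n_words in sorted(words_per_code.items()):
    print(f"{code}: {n_words} words")
```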

Results of the content analysis

The content analysis uncovered six main findings, described in the sections below:

• Messaging may be overwhelming
• Messaging is repetitious
• Messaging lacks strategic purpose
• Missed opportunities
• Sponsor information is not communicated consistently
• Graphics and format inconsistencies

Messaging may be overwhelming

To maximize comprehension and retention, written communications should contain a limited number of messages, as too many messages can overload the reader and make it difficult to understand, remember, and act upon the presented information (Gross 1964;

McCormack 2014; Poldre 2017). In total, the coding process identified 358 messages across all mail pieces, as shown in Table 11.

Table 11. Number of messages in each mailing

Mailing     Description                                           Number of Messages
Mailing 1   Package mailing comprised of multiple mail pieces     129 messages
Mailing 2   Pressure seal reminder mailer                         31 messages
Mailing 3   Package mailing comprised of multiple mail pieces     146 messages
Mailing 4   Reminder postcard                                     21 messages
Mailing 5   Pressure seal final reminder mailer                   31 messages

There is no rule to determine when a written communication induces information overload. However, if a single communication contains too many messages, and if those messages contain new, unfamiliar information, then the chances of overload increase greatly (Gross 1964). Because of this, an initial written communication to a potential survey respondent should be simple, make an introduction, establish a relationship, build trust, and not overwhelm the reader with details. By accomplishing this in an initial communication, subsequent communications can build on it and include more details.

These details are then more effective because the recipient has been primed to receive additional messaging after the introductory communication (see Cialdini 1984; Cialdini 2016).

The initial ACS mailing package contains an outgoing envelope, an introduction letter, an instruction card, a multilingual brochure, and an FAQ brochure, as shown in Figure 39.

Figure 39. Images of the five mail pieces in Mailing 1

These five mail pieces contain 129 messages using 39 of the 76 total messaging codes, as listed in Table 12.

Table 12. Frequency of messages in Mailing 1

Message                                                                   Frequency
Connection to a known sponsor                                             23
Way to verify authenticity (website, phone number)                        14
Required information                                                      11
Other information                                                         8
Messaging that meets a cultural expectation                               6
Providing a way for respondents to get help                               5
Mandatory legal obligation to complete the survey                         4
By law, the Census Bureau must protect your data                          4
The Census Bureau won’t release data in a way that identifies you         4
Mention or reference the Federal Cyber Security Act                       3
Secure website, encrypted browser, screening system transmit data         3
Data used for planning development                                        3
Highlighting survey topic or questions as interesting or entertaining     3
Appeal to patriotism                                                      3
Call to action                                                            3
Informational instruction                                                 2
Community-level benefits                                                  2
Provides community-level data on education, housing, employment, etc.     2
Highlighting survey importance to build intrigue                          2
Languages other than English*                                             2
Your response needed for data accuracy                                    2
Another response mode is available later                                  2
Sampling (i.e., mention that respondent is part of a random sample)       2
General confidentiality/data security statement                           1
Response benefits others in need                                          1
Used for emergency preparation                                            1
Used for data driven and well-informed decisions                          1
Provides national-level data on education, housing, employment, etc.      1
General national-level benefit statement                                  1
Personal/interpersonal-level benefits to survey response                  1
Establishing positive expectations                                        1
ACS is a continuous survey                                                1
Businesses use of ACS data                                                1
Providing a (vague, non-specific) deadline                                1
ACS data used to track changes over time                                  1
ACS is better than other sources of data                                  1
Cost and environmental savings to responding online                       1
Audience-based, single-conversation messaging                             1
Length of survey estimate                                                 1
Total                                                                     129
* Non-English messages in the Multilingual Brochure were excluded from this tally.

This volume of messaging may be too much for any single mailing communication, and especially for an introduction mailing. In addition to the first mailing package containing too many messages, some individual mail pieces may also contain too much messaging for a person to reasonably process. Table 13 presents the distribution of the total 358 codes assigned to the

English-language messaging elements, by mail piece.

Table 13. Number of messages and codes in each mailing item

Mailing   Mail Piece                    Number of messages   Number of codes used
1         Outgoing Envelope             9                    5
          Instruction Card (internet)   12                   9
          Letter                        39                   22
          FAQ Brochure                  42                   25
          Multilingual Brochure         27                   19
2         Pressure Seal Mailer          31                   14
3         Outgoing Envelope             10                   5
          Instruction Card (choice)     13                   9
          ACS Questionnaire             30                   13
          Letter                        42                   23
          Return Envelope               9                    5
          FAQ Brochure                  42                   25
4         Postcard                      21                   13
5         Pressure Seal Mailer          31                   18
TOTAL                                   358                  205

The Frequently Asked Questions (FAQ) Brochure (sent in both the first and third mailings) and the letter in the third mailing contain the most coded messages, with 42 each. As shown in Table 12, it is common for a message to be communicated multiple times in a single mailing or mail item. For example, in the first mailing, 23 different messages communicated that the Census Bureau is the sponsor of the ACS. The 42 messages in the FAQ brochure used

25 different codes, showing that about 60 percent of the messages are communicating

something unique to the respondent, while the remaining 40 percent reinforce messages already communicated elsewhere in the piece. ACS letters also had more total messages and more distinct codes than other mail items, indicating that letters are the primary communication device. It also suggests that the letters may lack focus, communicating many different kinds of messages at once.

Not only do ACS mail materials communicate a lot of messages, but the messages are written in a way that may not be understood by many recipients. The Plain Writing Act (2010) requires that federal agencies write in a way that the public can read and understand. Studies show that many people in the U.S. read at a low reading level. Over forty percent of adults living in the U.S. have basic or below basic reading skills (OECD 2013; Kutner, Greenberg, and

Baer 2005; OECD 2016). Moreover, most adults prefer reading content that is below their ability level (Nielsen 2005; Landry 2017). Recommendations for the appropriate reading level for a piece of written content vary widely depending on the audience. Some claim that messaging targeting average consumers should be written at the 7th grade level (Landry 2017). Others recommend communications written at the 8th grade level for the general public, and at the 6th grade level or lower if the target population is known to read at a lower level (Kimble 2012).

Still others say that the location of the text matters. For example, text on the homepage of a website should be at the 6th grade level, while other linked pages can be at higher levels up to the 8th grade (Nielsen 2005). There is not one agreed-upon grade level for the general population, but because a sizeable portion of the U.S. adult population reads at a low literacy level, it is likely useful to write communications at as low a grade level as possible to

maximize the likelihood that the message will be received as intended by a broad audience.

Table 14 presents readability scores and corresponding grade levels for select text-dense English-language ACS mail materials.41

Table 14. Flesch reading ease scores of ACS mail materials

Mailing   Mail Piece              FRE Score   Readability of text                                 Grade Level
1         Letter                  54.6        fairly difficult to read                            10th – 12th
          Multilingual Brochure   48.6        difficult to read                                   College
          FAQ Brochure            44.0        difficult to read                                   College
2         Pressure Seal Mailer    56.9        fairly difficult to read                            10th – 12th
3         Letter                  60.6        easily understood by 13- to 15-year-old students    8th – 9th
3         FAQ Brochure            44.0        difficult to read                                   College
4         Postcard                70.7        fairly easy to read                                 7th
5         Pressure Seal Mailer    62.7        easily understood by 13- to 15-year-old students    8th – 9th
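The interpretations in Table 14 follow directly from the bands in Table 10. As an illustration, a small helper (in Python; a sketch, not part of the Census Bureau analysis) can map an FRE score to its Table 10 band, applied here to a few scores taken from Table 14.

def interpret_fre(score):
    # Bands taken from Table 10: (minimum FRE score, grade level, interpretation).
    bands = [
        (90, "5th grade", "very easy to read"),
        (80, "6th grade", "easy to read"),
        (70, "7th grade", "fairly easy to read"),
        (60, "8th - 9th grade", "easily understood by 13- to 15-year-old students"),
        (50, "10th - 12th grade", "fairly difficult to read"),
        (30, "College", "difficult to read"),
        (0,  "College graduate", "very difficult to read"),
    ]
    for cutoff, grade, interpretation in bands:
        if score >= cutoff:
            return grade, interpretation
    return bands[-1][1:]

# Scores taken from Table 14.
for piece, fre in [("Mailing 1 letter", 54.6), ("Mailing 1 FAQ Brochure", 44.0),
                   ("Mailing 4 postcard", 70.7)]:
    grade, interpretation = interpret_fre(fre)
    print(f"{piece}: FRE {fre} -> {interpretation} ({grade})")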

The introductory mailing seems particularly problematic, with three mail pieces rated as

“fairly difficult to read” (10th to 12th grade level) or “difficult to read” (college level). The pressure seal mailer (PSM) in mailing 2 was also rated “fairly difficult to read” (10th to 12th grade level). Not only are these materials written well above the level of low-reading-ability audiences, they are also written above the level at which even conservative estimates say communications should be written for a general adult population (Kutner, Greenberg, and Baer 2005; Kimble 2014).

This readability analysis suggests that ACS materials could be improved by simplifying the messages, making them accessible to more people. I also recommend reducing the total number of messages and mail pieces. This is especially important in the initial communication.

Because more highly educated households respond early in the self-response data collection

41 I would like to acknowledge the work of Broderick Oliver of the U.S. Census Bureau for conducting this readability analysis, as well as Elizabeth Poehler for her contributions to this analysis. A version of this analysis will appear in a forthcoming Census Bureau report (Schreiner, Oliver, and Poehler forthcoming).

period, reducing the number of messages and writing in plain language may be increasingly important in later communications as the potential respondent universe includes a higher percentage of households with residents who read at lower levels.

Messaging is repetitious

In a multi-contact communication strategy, it may be necessary to repeat certain messages across mailings. However, if repetition is overused it can make mail materials appear too similar. Potential respondents may then skip reading a mailing because they may think they have read it before (Dillman, Smyth, and Christian 2014; Dillman 2016).

Repetition can occur in two ways. Repetition can be verbatim, where messages are copied and pasted identically in multiple mailings. Repetition can also occur through paraphrasing, where the exact wording of a message changes but the message communicated to the reader is the same. For example, in the first ACS letter, the following message is communicated:

“This survey collects critical information used to meet the needs of communities across the United States. For example, results from this survey decide where new schools, hospitals, and fire stations are needed.”

In the letter in the third mailing, the following message is communicated:

“Local communities depend on information from this survey to decide where schools, highways, hospitals, and other important services are needed.”

In this example, the Census Bureau uses different phrasing and sentence structure to communicate essentially the same message: that communities use these data to decide where to build infrastructure and locate services. The only difference is that the first message mentions “fire stations” while the second mentions “highways” instead. It may be possible that

this message is so important that it should be rephrased and repeated in multiple mailings. I do not think this is the case. Unless someone is more swayed by ACS data being used to build highways than by ACS data being used to build fire stations, it is unlikely that the second, paraphrased message will convince anyone new to respond to the ACS. This message does not provide a new appeal to respond to the ACS. Moreover, it takes up valuable, limited space on the letter that could be used to communicate a new appeal to respond.

To visually represent the repetition in the ACS, Figure 40 highlights in blue the messages from the ACS letters and postcard that are repeated in at least three of the seven most text-dense items (the four letters, the postcard, the multilingual brochure, and the FAQ Brochure).

Figure 40. Messages repeated verbatim or paraphrased in the ACS letters and postcard

As Figure 40 shows, the majority of the content in the ACS letters and postcard is repeated in other items. A word-count analysis shows that 78 percent of the body text (excluding header and footer text and logos) is repeated (either verbatim or paraphrased) three times or more across mail materials (see Table 15 for the percent of content repeated in each mailing).

Table 15. Repetition across letters and postcard

Mail piece           Repeated words   Total words   Repetition Rate
Mailing 1 letter     205              291           71%
Mailing 2 letter     122              145           84%
Mailing 3 letter     194              279           70%
Mailing 4 postcard   122              126           97%
Mailing 5 letter     156              184           85%
Total                799              1,025         78%
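The repetition rates in Table 15 are simply the repeated word counts divided by the total word counts. A minimal sketch of the calculation (in Python, using the word counts reported in Table 15) is shown below; per-row values may differ from the published table by a percentage point due to rounding.

# Repetition rate = repeated words / total body-text words, per mail piece.
# Word counts below are taken from Table 15.
pieces = {
    "Mailing 1 letter":   (205, 291),
    "Mailing 2 letter":   (122, 145),
    "Mailing 3 letter":   (194, 279),
    "Mailing 4 postcard": (122, 126),
    "Mailing 5 letter":   (156, 184),
}

repeated_total = sum(r for r, _ in pieces.values())
words_total = sum(t for _, t in pieces.values())

for name, (repeated, total) in pieces.items():
    print(f"{name}: {repeated / total:.0%} repeated")
print(f"Overall: {repeated_total / words_total:.0%} repeated")  # about 78 percent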

The purpose of some of this repetition may be to provide similar information in each mailing to ensure that critical messages are communicated to recipients who did not receive, read, or remember information from the previous mailings. Coding showed that each letter and postcard includes messages that communicated the following:

• The correspondence is from the Director of the Census Bureau
• Instructions for responding to the survey
• The survey is the American Community Survey
• A number to call for help
• A “thank you” message
• Responding to the ACS is required by law

These are important pieces of information to communicate and repeat across mailings.

However, there are additional instances of repetition that may be less logical.

One of the main sources of repetition across mail pieces is the communication of legal

obligation, data security, and confidentiality messages. For example, the following statement is included and repeated, verbatim, in multiple mail items:

“You are required by law to respond to this survey. The U.S. Census Bureau is required by law to keep your information confidential. The Census Bureau is not permitted to publicly release your response in a way that could identify you. Per the Federal Cybersecurity Enhancement Act of 2015, your data are protected from cybersecurity risks through screening of the systems that transmit your data.”

This message, and similar messages communicating legal obligation, data security, and confidentiality, take up a sizeable portion of the body text of some ACS mail items, as shown in

Table 16.

Table 16. Word count and repetition rate of legal obligation, confidentiality, and data security statements

Mail Material                               Legal Obligation, Confidentiality,   Total        Repetition
                                            and Data Security Word Count         Word Count   Rate
Mailing 1 letter                            82                                   291          28%
Multilingual Brochure in first mailing*     83                                   184          45%
FAQ Brochure in first and third mailing     141                                  413          34%
Instruction Card in first mailing*          0                                    56           0%
Instruction Card in third mailing*          0                                    70           0%
Mailing 2 letter                            9                                    145          6%
Mailing 3 letter                            82                                   279          29%
Mailing 4 postcard                          10                                   126          8%
Mailing 5 letter                            66                                   184          36%
*Only text written in English was included in this analysis.

In five of the seven mail pieces where the legal obligation, confidentiality, and data security messages are found, at least 28 percent of the text is devoted to these messages.

Confidentiality and data security statements are intended to build trust, but too strong of a

statement may be off-putting to respondents (Singer, von Thurn, and Miller 1995). The evidence on the impact of legal requirement statements is not conclusive. Some studies on informed consent statements have found that shorter consent statements can lead to higher levels of trust and compliance (Tait, et al. 2013; Das and Couper 2014; Perrault and Nazione

2016). Another recent study found that the length of survey initiation privacy and legal obligation statements did not impact response rates, but a limitation of this study was that it only examined such statements in the “fine print” area of survey materials, not within the body text of survey invitation letters (see Bucks and Couper 2018). This study also included a cash incentive, which is known to motivate response in a way that can overcome different barriers to response (Warriner, et al. 1996; Groves, Singer, and Corning 2000; Lesser, et al. 2002;

Birnholtz, et al. 2004). These studies echo work completed earlier at the Census Bureau that had a similar finding (Dillman, et al. 1996).

In numerous focus groups and interviews with respondents, data security and confidentiality messages rank among the lowest both in believability and in their ability to convince a household to reply to a survey request (Macro 2009; Hagedorn and Green

2014; Hagedorn, Green and Rosenblatt 2014). Cognitive research also suggests that simple statements that assure security, but do not overstate problems of data security or confidentiality breaches, may be more effective (Fobia, Holzberg, and Childs, 2017). Much as telling someone not to think about a pink elephant puts the idea firmly in their mind, mentioning “secure systems” and “data screening procedures” might make some people pause as they consider the many ways that their data might be unsafe if they respond. While evidence does not

unanimously recommend shorter or simpler data security or confidentiality statements, none of the reviewed studies found evidence that lengthier statements help ease fears, build trust, or boost response rates. Qualitative evidence seems to suggest that simple statements may be better.

The current statement that is used, and repeated, in the ACS materials is lengthy and takes up a lot of space in letters and brochures, and it is possible this statement raises more fears than it alleviates. The exact phrasing of this statement is dictated by policy guidelines.

While guidelines dictate that the Census Bureau inform respondents of these statements as written, it is not required that the ACS mail communication materials be the main vehicle for this message, or that the message be repeated verbatim across mailings or mail pieces. Yet the same data security and legal obligation messages are used in all five ACS mailings and multiple times in the initial mailing and third mailing across letters and brochures. Moreover, the message is communicated three times in the first mailing package, as shown in the space blocked out in yellow in Figure 41.

Figure 41. Space (in yellow) used to communicate legal obligation and data security messages in Mailing 1 mail items

The placement of these statements is hardly in the “fine print” of ACS mail communication materials. It is front, center, and repeated. Research suggests that an initial communication should be simple, straightforward, and establish a baseline of trust to build on in future messaging (see Cialdini 2016; Oliver, Heimel, and Schreiner 2017). There is no evidence that repeated mentions of the same confidentiality and data security message increase respondent trust that their data will be kept secure. The Census Bureau should consider placing this statement only once, and only where absolutely mandated by policy. In other mail materials it may be warranted and helpful to communicate a simple confidentiality statement, but detailed statements on legal obligation and data security may do more harm than good. If it is not required to repeat these detailed statements, they should not be repeated.

Aside from the legal obligation, confidentiality, and data security statements, repetition in the ACS supporting materials (the FAQ Brochure, Multilingual Brochure, and Instruction

Cards) is not as prevalent as in the letters and postcards. It appears that the

Census Bureau developed the letters and postcards to be the primary means of communicating critical messages to sampled household members. The supplemental mail pieces each have a specific purpose, which helps focus the messaging on that purpose. For example, the purpose of the multilingual brochure is to communicate necessary messages to recipients who need assistance in one of five non-English languages. The text is first communicated in English side-by-side with a Spanish translation. Inside the brochure, the text is presented in the four additional languages, as shown in Figure 42.

Figure 42. Multilingual Brochure

Figure 43 highlights phrases that are repeated verbatim or paraphrased from other letters, postcards and brochures in the Multilingual Brochure.

Figure 43. Repetitious content in the multilingual brochure (highlighted in yellow)

It makes sense that more than half of the English-language messages in the multilingual brochure are repetitious; the purpose of this mail piece is to communicate critical messages to non-

English speakers. It should also include important unique messages specific to the target audience of this mail piece, such as the statement “Because you are living in the United States, you are required by law to respond to this survey.” This message is strategic to this mail piece because it communicates that everyone living in the U.S., including non-citizen permanent residents and temporary residents, must complete this survey. This message may be particularly relevant to the segment of potential respondents that require help responding in a foreign

language.

Aside from specific messaging like this that targets non-English-speaking potential respondents, it would be reasonable for nearly all of the content in this mail piece to be repetitious.

Critical statements that provide the information needed to respond have been communicated in English throughout other mail pieces and are now communicated in this mail piece to non-English speakers. However, the content that was uniquely worded in this mail piece was a bit surprising.

While this message is communicated in other materials, the statement “In order to make well-informed decisions, a community needs accurate and reliable information. By responding to this survey, you are helping your community to get this kind of information” is uniquely worded in this brochure beyond simple paraphrasing. It is not worded specifically to target the population of this mail piece, and this phrasing could be beneficial in other mail items.

Figure 44 highlights phrases repeated verbatim or paraphrased in the FAQ Brochure.

Figure 44. Repetition in the Frequently Asked Questions (FAQ) brochure

Beyond the repeated legal obligation and data security statements, the FAQ Brochure contains

three sentences repeated or paraphrased from other mail materials. Most of the messages in the brochure are unique and seem to be written either to answer respondent questions or to provide more details than what is already communicated in the letters and postcards. The messaging in the FAQ seems to fit the purpose of the mail piece. Because I recommend removing the detailed cybersecurity statements from the initial letter (to reduce the number of messages, and to alleviate fears raised by these statements), the FAQ brochure, or a similar device, might be an appropriate place for these more detailed messages.

The Census Bureau also sends two “instruction cards” in the first and third ACS mailings.

Figure 45 highlights phrases repeated verbatim or paraphrased in the Instruction Cards.

Figure 45. Repetition in the first and third mailing instruction cards

The instruction cards, found in the first mailing and the third mailing, contain very little English-

language messaging. These mail pieces provide instructions for how to complete the survey on a hard, cardstock piece of paper with only limited additional messaging. The only new English-language message in the instruction cards is a statement indicating that information from the address label is needed to respond online. A recent study showed that removing the instruction card does not hurt self-response rates (Clark, et al. 2015a). This may be because these mail pieces are redundant with other mail pieces. Since some survey methodologists believe that too many inserts can potentially distract and confuse respondents (Dillman 2016), it may be time to remove one or both of these cards from the ACS mail contact strategy.

However, this is intertwined with decisions on envelope size. The large piece of cardstock paper allows the Census Bureau to send a mailing package in a larger, non-standard envelope. If it is determined that sending a mail package in this larger envelope helps response rates, it may be necessary to include an instruction card. In that case, I recommend utilizing the space on the card in a more efficient way to make the mail piece purposeful and less redundant. If the card is not needed for production-related purposes, then it should be removed.

Some repetition of critically important details is required. Each mailing should contain instructions on how to complete the survey. These messages should be paraphrased so that the mailings feel unique. However, when this logical and necessary use of repetition across mailings is combined with illogical repetition of highly similar benefit statements and unnecessarily repeated data security, confidentiality, and legal obligation statements, the result is highly repetitious mailings that may not use the space available in the most effective way to communicate messages that can increase response rates. Reducing repetition will help reduce the total number of messages in mailings and provide space to communicate new appeals to

induce new households to respond.

Messaging lacks strategic purpose

In the previous section I examined the repetition of messaging in the ACS mail contact materials. One finding was that the multilingual brochure was, as expected, repetitious. The analysis also revealed that this brochure contained a unique sentence, not communicated in any other mail item, that linked ACS response to the data that communities need to make informed decisions. This message provides a new motivation to respond but is misplaced in a mail piece that most people may not read. Most people do not need multilingual support and may ignore this brochure, missing this new and unique appeal to respond. In this section, I identify other examples of non-strategic usage or placement of messaging. This review uncovered that the messaging in the ACS communication materials does not seem to follow a guiding strategy to optimize the effectiveness of the messaging across mail pieces.

Rather, few new appeals are added after the initial mailing to convert non-respondents, some messaging is outdated, and some messaging lacks justification in the literature or is misplaced in the wrong mailing or mail piece.

Survey methodologists argue that different people may be motivated by different leverages (Groves, et al. 2006; Wenemark, et al. 2010). Based on the literature reviewed in this dissertation, I suggest that using a variety of appeals to convince non-respondents to participate is more effective than repeating the same appeals in follow-up communications and across mail pieces in the same mailing. As we saw in the previous section, the ACS mail materials are highly repetitious, especially in later mailings. As a consequence, repeated messages take up room in these mail materials that could carry new appeals. The content analysis found that the first

mailing used 39 of the 76 total codes for potential ACS messaging. By the end of the five mailings, a total of 44 messaging codes were used. This means only five new messaging codes are used in the ACS mail materials after the first mailing. The messages added after the first mailing are:

1. “If you have already replied, thank you…” – Fourth mailing letter (similar messages are also in the 2nd and 3rd mailings)
2. “Please complete the questionnaire and return it now OR go to https://respond.census.gov/acs to respond online” – Fourth mailing letter (similar messages are also in the 3rd and 5th mailings)
3. “Postage will be paid by the U.S. Census Bureau” – Third mailing letter
4. “If you do not respond, the Census Bureau may contact you by personal visit to complete the survey.” – Fifth mailing letter (similar messages are also in the 2nd, 3rd, and 4th mailings)
5. “We asked you to help us with this very important survey by completing it online. But we have not received your response yet.” – Third mailing letter

The first three new messaging codes from the list all represent messages that make sense to communicate after the first mailing. The first message, thanking respondents if they had already responded, can only be sent in a follow-up mailing. The second message, providing instructions to complete and return the paper survey questionnaire, can only be included when the paper questionnaire is sent in the third mailing. Similarly, the third mailing includes a message that the Census Bureau will pay for the return postage of the questionnaire, which also only makes sense in a follow-up mailing.

The fourth message, “If you do not respond, the Census Bureau may contact you by personal visit to complete the survey,” seems to be a strategic addition in later mailings. The first ACS mailing letter does not communicate this message. The second mailing contains a version of this message that states, “Responding promptly will prevent your receiving additional reminder mailings, phone calls, or personal visits from Census Bureau interviewers”.

By the fifth mailing, the implication that a Census Bureau worker may appear at your house is explicitly stated. It seems reasonable, and strategic, to phase in this messaging over the course of the five mailings. However, starting this message in the second mailing may be premature. At the time of the second mailing some respondents who want to reply cannot do so because they are waiting for the mailed paper questionnaire. While mentioning that a Census Bureau worker may come to your house for not responding is not a threat, it may be interpreted that way by some respondents. It might be better to hold this message back until later mailings, when all respondents have had every response mode available to them.

The fifth message on this list is a bit curious. Prior to August 2015, the ACS mail contact strategy included a prenotice that provided basic information about the ACS to the respondent and announced that a subsequent letter would be sent in about 4 days with instructions on how to complete the survey. Prior to 2013, this prenotice announced the paper questionnaire package, and from 2013 to 2015 it announced the coming of an internet web-push letter. Due to changes to the ACS mail contact strategy, this prenotice was removed from ACS production.

Removing the prenotice created two gaps in messaging that have not yet been fixed.

First, messaging in mail pieces that was once accurate is now outdated. For example, when the third ACS mailing letter states, “We asked you to help us with this very important survey by completing it online. But we have not received your response yet,” this appears to be a callback to a previous message that asked the respondents for help. This was arguably true when the prenotice included the phrase, “Thank you in advance for your help”.

This statement does not directly ask the sampled household for their help. There is a difference between asking someone to complete a task and asking for their help with a task. By

asking for help, the relationship between the asker and the asked communicates that equals are trying to accomplish a task together, rather than the asker treating the asked as a subordinate.

Avoiding subordinating language is an important aspect of promoting agreement with a request

(Comely 2006; Dillman, Smyth, and Christian 2014; Oliver, Heimel, and Schreiner 2017). While the prenotice statement did not directly ask for help, it at least thanked the household for their help in responding. This “help” message was removed when the ACS dropped the prenotice letter. In current ACS mail materials, there is no longer a direct or indirect help statement prior to the third mailing, yet the third mailing asserts that the Census Bureau previously asked for the household’s help. It did not.

This statement is inaccurate and out of date.

In the coding process, the statement about “previous help” was identified as a new message because the ACS mail materials had not previously asked respondents for help. Another gap in communication that resulted from the removal of the prenotice letter was not identified by the coding strategy but was noticed when reading the content of the mail materials. The multilingual brochure contained in the first mailing includes the statement, “In a few days you will receive an American Community Survey questionnaire in the mail.” This is not accurate. Households currently receive the multilingual brochure along with the invitation to take the ACS. If responding by internet, there is no need to wait. The paper questionnaire does not arrive

“in a few days” but rather 21 days after households receive the multilingual brochure in the first mailing. This statement is a holdover from when this mail piece was sent with the prenotice mailing. Because the content of this mail piece was not reviewed when it was moved from the prenotice to a different mailing, the statement became inaccurate.

It appears that ACS mail pieces were thought to be moveable, without edit, between

mailings. This is not the case. The messaging contained in a mail piece should be written for a specific purpose, designed to be communicated at that time and in that place. Because the ACS mail contact strategy has changed, the messaging is now inaccurate and less effective. When communicating with the public, information should be accurate and up-to-date. The mail pieces sent to potential respondents have limited space to communicate messages. Therefore, it is critically important that each message included has an express purpose and plays an integral role in inducing a survey self-response.

In addition, this content analysis uncovered messages communicated in the mail contact materials that are not justified by the literature reviewed in this dissertation.

Statistical agencies like the Census Bureau may think it necessary to include scientific messaging in the mail materials to promote the importance, utility, and accurate use of the data. However, the literature does not provide evidence that scientific messaging is influential in gaining survey cooperation. In multiple cognitive studies, this messaging was not found to rank highly in terms of convincing respondents (Hagedorn and Green 2014; Hagedorn, Green, and Rosenblatt 2014). Literature from survey methodology, communication, and marketing argues that emotional and personalized messaging can work better than factual statements

(Dillman, et al. 1999; Dillman, et al. 2002; Dillman, et al. 2007; Braveman 2008; Wit, Das, and

Vet 2008; Dillard and Shen 2012). The survey methodology literature does not address technical statements as either good or bad in terms of their impact on response. For this reason, scientific messages were coded under the category of “other” because there was no evidence that these statements increase trust, communicate a benefit of survey response, or reduce the cost or perceived burden of responding.

Despite this lack of evidence that these statements are effective, the ACS mail communication materials spend valuable space on multiple scientific messages. For example,

the FAQ Brochure contains four sentences that contain scientific messaging:

1) “The American Community Survey collects information about population and housing characteristics for the nation, states, cities, counties, metropolitan areas, and communities on a continuous basis.”
2) “Based on the ACS, the U.S. Census Bureau can provide up-to-date data about our rapidly changing country more often than once every 10 years when the census is conducted.”
3) “In order to make well-informed decisions, a community needs accurate and reliable information.”
4) “We may combine your answers with information that you gave to other agencies to enhance the statistical uses of these data.”

Including all four sentences in a single mail piece may overwhelm potential respondents, especially those at lower reading levels or levels of education.

Two additional scientific statements in the first mailing may actually reduce response propensity.

• “Your household has been randomly selected to complete a very important national survey, the American Community Survey.” – First mailing letter

• “The Census Bureau chose your address, not you personally, as part of a randomly selected sample.” – First mailing letter, Third mailing letter

The purpose of using the word “random” in these statements may have been to alleviate a recipient’s fear that his or her household was chosen on purpose or singled out by the Census

Bureau. However, these statements may make recipients feel that their personal participation

in the survey is not special and make the survey request feel less important. Psychology and marketing research suggest that the potential respondent should feel that their participation is special and that this survey represents a rare opportunity to make a difference (Cialdini 1984; Cialdini

2009; Oliver, Heimel, and Schreiner 2017). The appeal is rooted in the scarcity principle, which states that people are drawn to things that are exclusive and hard to come by (Levine 2003; Cialdini

2009). This issue was also raised at the 2016 National Academies of Sciences, Engineering, and

Medicine workshop on the ACS mail communication materials. Experts noted the dictionary definition of the term “randomly,” which suggests that the selection process was haphazard and not scientific. Because the general public is not aware of the statistical meaning of the word “random,” the Census Bureau may be communicating something unintended.

The Census Bureau must consider how households with members with low reading levels and low levels of education interpret all terms and phrases (National Academies of Sciences, Engineering, and Medicine 2016).

The PSMs contain an unnecessary element that the designers of the material may have thought was necessary or useful. Both PSMs feature a “security bar” that blocks the user ID from view when the PSM is folded, as shown in Figure 46. After the implementation of PSMs into the ACS mailing strategy, Census Bureau staff determined that the security bar was not required. This element takes up considerable room, does not increase the security of the mailing, and does not add a valuable message to boost response.

Figure 46. Security bar and user ID found on ACS PSMs

Moreover, the placement of the user ID printed on the PSM is a curious decision. The ID required to reply online is printed along with the address. The example in Figure 47 shows the user ID (in this case 77777-44659) printed on the mailing label.

Figure 47. Sample address label

The user ID is the reason that the ACS does not print addresses directly on envelopes, but instead shows the address through a cutout window, to prevent user IDs from being discarded when envelopes are inevitably thrown away. Instead, addresses (and user ID numbers) are printed on something more important, such as instruction cards, letters, and the survey form. When a respondent goes online to take the ACS, there are instructions to identify the user ID on these

materials. In a PSM, the address and user ID are printed on the same piece of paper (on the outside) as the letter (on the inside). The letter does not need to include a user ID, as respondents can follow the online instructions to find the user ID on the address label, in the same way they have to do with other mailings. Including the user ID here may be a holdover design feature from an older ACS letter that included the user ID in this way.

The use of space on the ACS letters is not always optimal. In addition, the ACS mail communication materials do not optimize benefit statements. Research reviewed in this dissertation suggests that benefit statements are best communicated at the community or local level (Macro 2009; Conrey, ZuWallack, and Locke 2012; Hagedorn and Green 2014;

Hagedorn, Green, and Rosenblatt 2014). These statements may do a better job motivating all types of respondents and can be particularly useful in converting locally minded respondents who have low opinions of the federal government. However, not all benefits messaging in the

ACS mail materials is framed this way.

There are multiple instances of national-level benefits messaging in the ACS mail materials. For example, the first ACS letter states, “The U.S. Census Bureau conducts this survey to give our country an up-to-date picture of how we live—our education, employment, housing, and more” (see Figure 5). Highlighting that the ACS is an important national survey may be useful for making the survey request appear important. However, the benefits of participating in the survey should be communicated at the community level. Another example in the first mailing states, “The Census Bureau is using the Internet to collect this information in an effort to conserve natural resources, save taxpayers’ money, and process your data more efficiently” (see Figure 5). This message is likely intended to promote internet response over

paper questionnaire response, as a means of conserving natural resources and saving taxpayer money. However, there is no evidence in the literature that suggests appealing to the conservation of natural resources or saving the federal government money increases response.

It is possible that mentioning “taxpayer dollars” and “saving natural resources” reminds potential respondents that the ACS has an environmental and financial cost. Both of these statements are framed at the national level, which cognitive testing suggests may not resonate as strongly with potential respondents as focusing on local impacts (Macro 2009).

Research also suggests that if one mentions a benefit to businesses, that benefit should reference a local or small business rather than a benefit to a large corporation (Orrison and Ney

2014). Across all mail materials, the ACS only communicates one business level benefit in the

FAQ brochure, which states, “The data also are used to decide where to locate new highways, schools, hospitals, and community centers; to show a large corporation that a community has the workforce the company needs; and in many other ways.” With limited space, it may not be optimal to communicate national and corporate-level benefits at the expense of community or local business-level benefits, or other useful messaging. Also, research suggests that one of the barriers to response from skeptical respondents is that they do not see the benefits promised by the federal government in their community. Mentioning that this survey helps a large corporation know if a town has the workforce a company needs can be interpreted negatively by communities that have experienced factories closing and moving away. The interpretation may be that the ACS can be used by companies in their decisions to close businesses as well.

The ACS should write clear, community level benefit statements that cannot be

misinterpreted or seen by some respondents as negative. By focusing more attention on the mindset of respondents and leveraging the advice that community-level benefits should be written at a personal level, I recommend that the ACS spend more space communicating actual, real examples of community-level benefits. Also, inaccurate messages should be removed. All mail materials should be designed specifically for the mailing in which they are intended to appear. Any mail piece that can be moved around without consideration of the entire communication design is likely not an effective mail piece. Mail pieces should acknowledge the audience as it changes throughout the mailing schedule, and messaging within those mailings should change as well. Most importantly, new messaging should be added throughout the mailings to convert non-respondents. As we have seen, the ACS mail communication materials are highly repetitious. This repetitious messaging is unlikely to convert respondents who need to be convinced by a new appeal. By recognizing the changing audience and leveraging messaging that the ACS has not communicated, the follow-up mail pieces can be more effective at increasing response rates.

Missed opportunities

The ACS mail communication materials communicate 44 of the 76 total potential codes for messaging. This means there are many messaging options that the ACS could potentially communicate in its mail communication materials but currently does not.

Each unused code represents an opportunity for ACS mail materials to communicate something new to convince more, or different, potential respondents to participate in the ACS. However, there is only limited space to communicate messages to potential respondents in the mail communication materials. The Census Bureau also needs to avoid information overload and be strategic in the messaging that is communicated in ACS mailings. Based on the list of unused

codes, and the literature reviewed in this dissertation, the following are potentially missed opportunities to communicate messages that may positively influence response rates to the

ACS. These are listed in priority order based on my assessment of the potential usefulness of each message.

1) Deadline: Providing a due date for a task, like a survey request, has been shown effective in

multiple studies, both qualitative and quantitative, is recommended by multiple fields, and

by experts in survey methodology (see Edwards, et al. 2009; Macro 2009; Martin 2009;

Stokes, et al. 2011; National Academies of Sciences, Engineering, and Medicine 2016;

Dillman 2016; Allen and Richardson 2019; Kephart, et al. forthcoming). ACS mail

communication materials should include a due date to motivate response, increase the

clarity of the task, and reduce potential confusion and burden on respondents.

2) Conformity: Some people can be motivated by a desire to conform and tend to follow the

lead of others similar to themselves (see Cialdini 2009; Edwards, et al. 2009; Fishbein and

Ajzen 2011; World Bank 2017). Over two million households each year participate in the

ACS, making the act of responding to the ACS a normal activity. ACS materials should frame

survey participation as conforming with the normal actions of others. Moreover,

highlighting the number of participating households and framing the ACS as a normal

activity may alleviate fears of responding to the ACS without communicating detailed

confidentiality statements.

3) Consistency: People can also be motivated by a desire to act consistently in a way that

matches their values, beliefs, and previous actions (see World Bank 2017; Fishbein and

Ajzen 2011; Cialdini 2009). Reminding people of previous actions similar to ACS

participation (such as responding to the decennial census) or with actions that are

consistent with their values and beliefs may trigger this motivation.

a. Civic-minded people may value civic responsibility and their identity as someone

who is civically engaged. Participation in civic activities gives some people a sense of

pride and empowerment (Gottlieb and Robinson 2002). ACS can be framed as a civic

duty to activate this motivation.

b. Reminding a respondent that the ACS is required by law can motivate a response by

reminding a potential respondent to act in a consistent, law-abiding way.

4) Tangible, local benefit statements: Survey communication materials should communicate

multiple benefits to survey participation to reach different types of respondents (Oliver,

Heimel, and Schreiner 2017). Currently, the ACS mailing materials communicate only a few,

similarly worded community level benefits, such as references to planning roads and

hospitals. Communicating that the survey helps communities in other ways, for example by

supporting small and local businesses and non-profit organizations, may also resonate with

respondents (Hagedorn, Green, and Rosenblatt 2014; Hagedorn and Green 2014; Orrison

and Ney 2014). Moreover, providing real, specific examples from a community that

benefited from ACS data or testimonials of how ACS data were used to improve a

community can provide powerful motivation to act (Dillard and Shen 2012; Artefact 2019).

5) Scarcity: Research has shown that people desire items and opportunities more if the item or

opportunity is scarce or perceived to be scarce (Levine 2003; Cialdini 2009). Because the

ACS samples only a small share of households each year (about 1 in 35), only a small percent of the

U.S. population is asked to participate. Participation in the ACS should be framed as a rare or

unique opportunity for a household to be the voice of their community.

6) Leverage authority: Some people are more likely to comply with a request if it comes from a

credible expert (Cialdini 2009). Communicating or projecting the expertise of the Census

Bureau may be a powerful way to gain survey compliance. In addition, communicating that

ACS data are used to distribute $675 billion of federal money may communicate that

the survey task is important as well as highlight the authority of the Census Bureau and the

ACS.

7) Unity: People are more likely to agree and comply with a request if they feel a sense of

unity with the requestor (Cialdini 2016). Messages should be written to make a potential

respondent feel included in the same group as the requestor and avoid subordinating language. One

way to do this is to ask for help, rather than assigning a task to the potential respondent.

This puts the Census Bureau in the subordinate role, needing the assistance of the survey

recipient. Also, asking for help puts both the survey sponsor and the recipient on the same team

solving a problem together (Dillman, Smyth, and Christian 2014).

8) Personalization: Personalizing a survey request is hypothesized to increase response rates

by making the request seem important and less generic (Dillman, et al. 1999; Dillman, et al.

2002; Dillman, Smyth, and Christian 2014). To personalize the ACS survey request, the

survey invitation letters should be sent from a real person with authority in the sending

organization, such as the director of the Census Bureau or the Chief of the ACS.42 Letters

should include the sender’s signature, and potential respondents should be able to verify

42 At the time of this review, the Census Bureau had an acting director, and the acting director’s name and signature were not added to the ACS mailings. Other Census Bureau surveys did add the acting director’s name and signature to their survey communications.

that the person sending the survey is a real person in the sending organization.

9) Commitment: Asking people to actively commit to completing a task can increase task

completion propensity (Cialdini 2009; World Bank 2017; Matthews 2017; Milkman, et al.

2011). Including a commitment message that asks potential respondents to write down

their plan for completing the ACS could help potential respondents commit and follow

through and complete the ACS.

To avoid information overload, these recommendations need to be communicated strategically across mailings. Some recommendations can be combined into a single message.

For example, a testimonial from a credible expert might simultaneously communicate Census

Bureau expertise as well as a tangible, real-world example of a community-level benefit. Other recommendations may confuse potential respondents if they are communicated in a single mailing. It is true that the ACS is completed by millions of households each year, making the ACS a normal activity. The ACS is also a rare opportunity to serve as the voice for one’s community. Both of these messages have evidence that they can convince some people to respond to the ACS request, but because they can perhaps be misinterpreted as contradicting each other, it may be best to communicate these messages in separate mailings.

Sponsor information is not communicated consistently

The recipient of a survey solicitation must trust that the survey is legitimate before responding.

The most prominent category of messaging in the ACS materials tries to establish trust that the survey request is legitimate, as shown in Table 17:

Table 17. Count of the assigned codes by messaging category

Messaging Category   Number of Codes   Percent of Total
Trust                140               39.1
Benefits             49                13.7
Burden Reduction     51                14.2
Other                118               33.0
TOTAL                358               100.0

The most frequently communicated trust-building message was messaging that communicated a connection between the ACS and the Census Bureau, comprising nearly one out of every six messages (16.5 percent) and using a variety of messaging elements, such as text, logos, web addresses, and postal addresses. People are more likely to trust a survey request from a survey that they have heard of or a survey conducted by an organization they know and trust

(Dillman, Smyth, and Christian 2014; Groves, et al. 2012; Heberlein and Baumgartner 1978;

Presser, Blair, and Triplett 1992). Lack of familiarity with the ACS is a main barrier to gaining survey cooperation. Only 11 percent of potential respondents have heard of the ACS

(Hagedorn, Green, and Rosenblatt 2014). Communicating that the survey is the “American

Community Survey,” is not enough to establish trust that the survey request is legitimate.

However, over 90 percent of potential ACS respondents are familiar with the Census Bureau

(Hagedorn, Green, and Rosenblatt 2014), and nearly half of potential ACS respondents reported a positive attitude toward the Census Bureau and its purposes (Conrey, ZuWallack, and Locke

2012). An annotated version of the letter in the first ACS mailing, shown in Figure 48, illustrates the frequency and placement of messaging that connects the ACS to the Census Bureau.

Figure 48. Census Bureau references (highlighted) in the initial mailing letter

It is difficult to say what frequency of messaging is required to make a connection between a survey and a survey sponsor. On first review, the ACS letters appear to do a good job of communicating that the Census Bureau is conducting this survey. However, to effectively connect the ACS to the Census Bureau, consistent messaging and design across multiple mailings may help establish trust that the ACS request is legitimate and clearly establish that each contact was sent from the same trusted sponsor organization

(Dillman, Smyth, and Christian 2014; Dillman 2016). However, current ACS mail communication materials do not consistently communicate the connection to the Census

Bureau clearly across mail pieces.

First, messaging that mentions multiple government entities may confuse respondents as to the sponsor of the ACS. In addition to the U.S. Census Bureau, each mail piece mentions additional organizations that may be interpreted as the survey sponsor, which may obscure the link between the ACS and the Census Bureau. As shown in Figure 49, “UNITED STATES

DEPARTMENT OF COMMERCE”, “Economics and Statistics Administration”, and “OFFICE OF THE

DIRECTOR”, are all referenced along with the “U.S. Census Bureau” in the letterhead used in

ACS invitation letters and postcards.

Figure 49. Letterhead used in ACS mailing letters

Communicating multiple sponsor organizations in this way may confuse potential respondents. One sample member lodged the following complaint to their local newspaper:

“I received a letter today from the director (unnamed and unsigned) of The United States Department of Commerce, Economics and Statistics Administration, U.S. Census Bureau, stating that my household has been randomly selected to complete the American Community Survey. Further, it must be done online at https://respond.census.gov/acts (but a paper questionnaire will be mailed if needed) and my response is required by law with a penalty imposed for not responding. Is this legitimate?” (Murray 2017).

This anecdote notes that a potential respondent noticed all of the listed federal entities associated with the ACS but still questioned whether the communication was legitimate.43 A recent eye-tracking study showed that the heading with multiple sponsors took more time for readers to process than a simpler letterhead with a single sponsor, the U.S. Census Bureau (Tuttle, et al.

2019). Including the Department of Commerce seal (rather than the Census Bureau logo) and placing the Census Bureau third on the letterhead (in a smaller font and not in all caps) may not clearly communicate that the Census Bureau is the agency that conducts this survey.

Evidence suggests that the Census Bureau is a known and trusted sponsor (see Brick and

Williams 2013; Schwede 2013) but there is inconsistent evidence on connecting surveys to other government agencies or directly to the federal government. While the Census Bureau is an agency within the federal government, it engenders a different level of trust than the federal government as a whole, which suggests that there are potential disparities of trust between government agencies. There is no supporting literature on how the additional government agencies mentioned in the ACS mail communications are interpreted by potential respondents.

Moreover, there is no evidence that referencing multiple agencies adds to legitimacy. Removing the additional government entities from the ACS mail communications may help communicate more clearly the connection between the ACS and the Census Bureau.

This multi-layered bureaucratic organization is communicated in all mailings, not just

43 This quote comes from an article titled, "American Community Survey from Census Bureau can creep you out: Money Matters," first published in The Plain Dealer, October 8, 2017. In the article, The Plain Dealer staff answered the reader's question as follows: "Most likely, what you received is from the U.S. Census Bureau. But that doesn't necessarily mean you want to respond to it. Props to you for being paranoid, especially when we are living in an era where there are scams, phishing everywhere, and when there are reasons to avoid being in too many databases with too much personal information" (Murray, 2017).

the letters. Table 18 shows how each ACS mail piece uses different combinations of logos and text references to communicate potential sponsorship.

Table 18. Application of sponsorship in ACS mail materials

Logos: U.S. Census Bureau; U.S. Department of Commerce. Text references: (U.S.) Census Bureau; U.S. Department of Commerce; Economics and Statistics Administration; Office of the Director.
Outgoing envelopes: ✓ ✓ ✓ ✓ ✓
Letters: ✓ ✓ ✓ ✓ ✓ ✓
FAQ brochure: ✓ ✓ ✓ ✓
Multilingual brochure: ✓ ✓ ✓ ✓
Paper survey: ✓ ✓ ✓ ✓
Instruction card: ✓ ✓ ✓
Postcard: ✓ ✓ ✓ ✓ ✓
"✓": Logo or sponsor is present

Text references to multiple government entities (either 2, 3, or 4) occur in each mail piece, mostly in the letterhead. Logo use is also inconsistent, with outgoing envelopes, brochures, and instruction cards containing just the U.S. Census Bureau logo, the paper questionnaire and postcard containing only the U.S. Department of Commerce logo, and letters containing both agency logos.

In addition to the inconsistency in which logos are used, there is also inconsistent and non-optimal placement of logos on mail pieces. This occurs immediately on the outside of the first envelope. One of the first barriers to gaining a survey response is that a survey invitation mailing needs to be opened before the messaging inside can be read (Dommeyer, Elganayan, and

Umans 1991; Dillman 2016; Lavrakas 2017a). The envelopes used to send respondents letters, brochures, and survey forms present an opportunity to connect the ACS with the Census

Bureau sponsor that may help overcome this first barrier and increase the chance a recipient opens their ACS mailing. When a person picks up an envelope, their eye first observes their own address in the middle of the envelope before tracking to the left and then up to the corner to read the return address and see who sent the piece of mail. Next, the reader may glance to the right to view the postage stamp. In total, this process takes about seven seconds and can determine whether the envelope is opened or ignored (Vögele as cited in Chewning 2019). As shown in Figure 50, the current ACS envelope does not place a Census Bureau logo in the natural vision pathway that most people follow when processing the content of an envelope.

Instead, the top left of the envelope contains a three-sponsor return address that lists the

Department of Commerce first (in larger font) and the U.S. Census Bureau third. Neither logo is used in this prominent location. The Census Bureau logo is located on the bottom left of the envelope.

Figure 50. Placement of Census Bureau logos on envelopes

Also, as shown in Figure 51, current ACS envelopes can contain a "callout box" to communicate that response to the ACS is required by law.

Figure 51. ACS production envelope callout box

This box presents an opportunity to connect the mailing to the Census Bureau, but instead names the lesser known ACS. Referencing that the envelope contains a Census Bureau survey may lead to more people opening the envelope.

Another opportunity to leverage authority and build trust is not taken advantage of on the envelope. Some methodologists recommend first-class mail because it offers several advantages over bulk-rate mail: first-class mail moves through the postal system faster, is forwarded rather than discarded like bulk-rate mail, and may communicate that the mailing is important (Salant and Dillman 1994; Dillman 2000; Daly, et al. 2011). The first-class postage used by the federal government is different from that used by private businesses and non-government entities, as shown in Figure 52.

Figure 52. ACS outgoing mailing first-class mail postage area

This first-class message clearly states that this mailing was paid for by the U.S. Census Bureau.

This message, if noticed, confirms that this mailing is not a scam, as a fraudulent sender claiming to be the Census Bureau could not use this first-class postage. Industry experts

recommend that when first-class postage is used, it be highlighted with additional messaging on the envelope to draw attention to the more expensive and important delivery postage (Huntsinger 2019). Drawing attention to this postage is an opportunity, not taken advantage of on the ACS envelopes, to connect the mailing to the Census Bureau and increase the chance the mailing is opened.

The less-than-optimal messaging that communicates that the Census Bureau conducts the ACS continues in the ACS letters. Readers process documents with heavy text that is evenly distributed and homogenous, like the text presented in the ACS letters, by starting in the top left of the page and then moving across and down the page (Lidwell, Holden, and Butler

2003). Based on this reading pattern, the top left of a text-dense page is a prime location to place a message or a graphic to be noticed by a reader. Rather than a Census Bureau logo, the top left corner of the letter contains a form ID used by the Census

Bureau’s National Processing Center when printing and assembling the mail materials. The logo is placed, like on the envelopes, on the bottom left corner, as shown in Figure 53.

Figure 53. Placement of Census Bureau logos and form ID on an ACS letter

Each mail piece must have a form ID for tracking, storage, and mail processing, but the location of this form ID is flexible. The form number, which is relevant to a handful of Census Bureau employees, is not relevant to respondents. Rather than take up the prime location on the page, the form ID should be relocated to a less prominent position. Doing so would allow for the placement of the Census Bureau logo in a more prominent position on the letter.

More puzzling than the misplaced logo on ACS mailing envelopes and letters is that the

Census Bureau logo does not appear on the ACS paper questionnaire. The third mailing sent to sampled households contains an ACS paper questionnaire to gain responses from households that either cannot or choose not to respond by the internet. While this mailing contains a

letter, instruction card, information brochure, and a pre-paid return envelope, research suggests that the only mail piece some ACS recipients look at in the mail package is the paper questionnaire (Schwede 2013). It may be necessary that the paper questionnaire connect the survey directly to the Census Bureau. Figure 54 highlights in yellow the times the Census Bureau is mentioned on the cover of the ACS paper survey.

Figure 54. Census Bureau references (highlighted in yellow) on the front of the paper survey

The term “U.S. Census Bureau” is mentioned once in small type in the top right corner below

“U.S. Department of Commerce” and “Economic and Statistics Administration” as part of a multi-entity letter head. The word “census,” and not the Census Bureau, is stated two more times, but both times within web addresses. No statement on the front of the survey form

communicates directly that the ACS is conducted by the Census Bureau, and the Census Bureau logo is not used. The top of the survey form prominently displays the Department of

Commerce logo and the “American Community Survey” name instead. The back of the questionnaire also does not draw a strong connection between the ACS and the Census Bureau.

Figure 55. Census Bureau references (highlighted in yellow) on the back of the paper survey

Similar to the front, the Census Bureau logo is not used on the back (shown in Figure 55) of the survey form and there is not a statement that directly communicates that the Census

Bureau conducts the ACS. There are two mentions of the Census Bureau contained in the fine print text in the lower right corner. Also, the U.S. Census Bureau is mentioned in the return addresses for people who discarded the enclosed pre-paid envelope and need to address their

own return envelope to mail back the ACS form.

There is no evidence that these messages are widely read by potential respondents. If read, one of the mentions of the Census Bureau on the back of the survey form may elicit confusion or distrust. Survey communication should provide recipients multiple ways to easily verify the authenticity of a survey request (Dillman, Smyth, and Christian 2014). For example, potential respondents may search online for the survey sponsor, the name of the person sending the survey, and the sponsor's physical location to verify that a survey request is real. Table 19 summarizes the four different addresses used in ACS mail items.

Table 19. Addresses used on ACS mail materials

Address: 201 E 10th Street, Jeffersonville, IN 47132-0001. Location communicated: return address on the outside of envelopes and the postcard. Purpose: postage requires a return address.
Address: P.O. Box 5240, Jeffersonville, IN 47199-5240. Location communicated: on the back of the paper questionnaire and the pre-paid return envelopes. Purpose: location where respondents mail paper questionnaires.
Address: 4600 Silver Hill Road, AMSD 3K138, Washington, D.C. 20233. Location communicated: on the back of the paper questionnaire. Purpose: for recipients to send comments about the survey.
Address: Washington, DC 20233-0001. Location communicated: in the letterhead. Purpose: unclear; perhaps communicating that all four federal entities are in Washington, DC.

Complex survey operations like the ACS may require the use of multiple facilities in different states. However, communicating multiple addresses, especially in different states, may cause confusion and distrust and obscure the connection of the ACS to the Census Bureau.

Someone claiming to be the Census Bureau could send letters on fake Census Bureau letterhead and ask scammed respondents to send their personal information to a non-Census

Bureau location. It is important that the purpose of each address is communicated clearly and that any address communicated is easily verifiable, especially if the address is in an unexpected location like Jeffersonville, IN. It may be problematic that none of the addresses listed in ACS materials communicates the actual, physical location of the Census Bureau headquarters at 4600 Silver Hill Road in Suitland, Maryland. Despite an address claiming otherwise, there is no Silver Hill Road in Washington, DC; it does not exist. It is possible that a DC address was selected for the Census Bureau because people may assume that the U.S. Government is located in Washington, DC, and not in Suitland, Maryland. This may have made sense prior to the internet.

Now, with the availability of search engines and the ability for people to verify information online, using this non-existent forwarding address may cause more confusion than it alleviates.

To simulate what a respondent might do to verify the address of the Census Bureau, I conducted multiple internet searches from different computers, browsers, and locations. I consistently found through these searches that the location of the Census Bureau is in

Jeffersonville, IN and Suitland, MD.44 Only one search returned a location in Washington, DC, but the address displayed online was different from those communicated in the ACS mailings, as shown through examples provided in Figure 56.

44 This process was exploratory. It was not systematic or scientific. Additional research on this issue is necessary.

Figure 56. Examples of locations of the Census Bureau from internet searches


It would be best to communicate only one consistent and verifiable address to potential respondents. All mailings should use the verifiable Census Bureau Headquarters address in

Suitland, MD. From there, mail can be automatically routed to the National Processing Center in

Jeffersonville, IN. If this is not possible, additional messaging is needed to clear up any confusion. For example, additional messaging could label the Silver Hill Road address as the location of the Census Bureau Headquarters, and the two Jeffersonville, IN addresses could be labeled as the location of the Census Bureau National Processing Center. If possible, the Census

Bureau should work with search engine companies to optimize the search experience so that the correct, verifiable address of the Census Bureau is displayed.

Graphic and format inconsistencies

Additional inconsistencies across mailings were found.

Color: The materials vary in their use of color. The multilingual brochure uses blue as the primary color, while the instruction cards, FAQ brochure and the paper survey use different

shades of green that do not match across mail pieces. The letters, postcards, envelopes, and

PSMs do not contain any color to connect them to the other mail items.

Graphics: The use of graphics is also sporadic. The multilingual brochure contains full-color pictures of an American flag, a school bus, the Statue of Liberty, a skyline, and cars in traffic, as shown in Figure 57.

Figure 57. Excerpt of ACS multilingual brochure of full-color graphics

How these pictures specifically relate to multilingual audiences is not made clear. This brochure is also the only mailing item to include a QR code (a square barcode read by some smartphones to direct people to websites).45 Near the QR code, the multilingual brochure has icons for social media platforms that have Census Bureau content. This also appears on the FAQ brochure, but without the QR code. The social media icons are not included on other materials. The FAQ

45 I did not include QR codes on my materials. Additional research is required to understand if and how this technology could be leveraged by the Census Bureau to increase response rates to the ACS.

brochure uses a large full-color American flag graphic on the outside, and a faded

American flag on the inside, as well as overlapping faded green images of people, houses, cars, and the American flag, as shown in Figure 58 and Figure 59.

Figure 58. ACS FAQ brochure (close-up of graphics).

Figure 59. Excerpt of FAQ brochure inside American flag graphic

These shadowed images are also used on the top of the instruction cards, as shown in Figure

60. The use of an image across multiple mail items can draw a connection between items and make them feel part of the same mail package. However, these background images with overlaid text are difficult to see and understand and may make the text difficult to read as well. They do not draw a clear connection, and the images are not used on other items. Moreover, the use of this image makes the FAQ and multilingual brochures appear very different, as if they were not designed to be part of the same mailing strategy.

Figure 60. Excerpt of the top of the ACS instruction card graphic

Two mail pieces use icons to communicate messages to respondents. First, the mailing 3 instruction card includes icons accompanying the response mode options to respond online or by paper survey form, as shown in Figure 61.

Figure 61. Excerpt of mailing 3 instruction card to highlight use of icons

The computer icon is outdated, and these icons do not appear in other mail items. Second, an icon for a landline telephone is used on the cover of the survey form, as shown in Figure 62.

This icon is also outdated. Younger ACS respondents may have never seen this type of landline phone. A phone icon does not appear in other mail items. All icons are used once, though using the same icon across mail items may be useful.

Figure 62. Excerpt of the survey form to show telephone icon

Another outdated, or unnecessary, use of graphics is found on the bottom of the PSM. The

security bar is meant to hide the login information when the PSM is folded, as shown in Figure 63. However, this bar is unnecessary for policy and practical reasons: there is no evidence that it communicates anything useful to respondents, and it takes up room that could be used to communicate other messages.46

Figure 63. Security bar from ACS PSMs

Another inconsistency is that the two PSM mailers (the second and fifth mailings) use a different font than the other letters, postcards and brochures. The survey form also uses a different font. Headers used across letters, survey form, and brochures use different fonts and

46 While the security bar still appears in production ACS materials at the time of this dissertation, the Census Bureau has decided to remove this security bar from future ACS PSMs.

font elements such as bold and all-caps text. The use of larger and bold font is also inconsistent across mail pieces. For example, the PSM letters in mailings two and five use bold font to highlight the name of the survey while the first and third mailing letters do not. Style elements, such as the use of call-out boxes in the ACS letters, also differ across letters.

It may not be likely that any respondent will consciously notice these discrepancies.

However, to communicate the official and important nature of the ACS response task, and to establish a clear connection across multiple mail pieces, all mail materials should be designed using the same color, graphics, logos, font, and style elements. This review shows the ACS mail materials do not do this.

Conclusion

The content analysis of the ACS mail communication materials revealed that the messaging contained in the mailout materials does not follow the tentative recommendations that I compiled from a review of relevant literature and research on survey communication.

First, literature across many fields suggests that communications with the general public should be written simply, to avoid overwhelming or overloading readers, especially readers with low literacy. However, ACS mail materials contain a large amount of messaging written at a higher-than-recommended grade level. This problem is most prominent in the first mailing, which contains five mail items, three of which are written at a college reading level. The Census Bureau should reduce the total number of messages and rewrite materials following plain language guidelines.
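As an illustration of how such a readability screen could be made routine, the following minimal sketch assumes each draft mail piece is saved as a plain-text file and that the third-party Python package textstat is installed; the folder name and the 8th-grade threshold are illustrative choices, not official Census Bureau standards.

import textstat  # third-party package: pip install textstat
from pathlib import Path

TARGET_GRADE = 8.0  # illustrative plain-language target, not an official standard

def screen_mail_pieces(folder: str) -> None:
    """Print the Flesch-Kincaid grade level of each draft mail piece in a folder."""
    for path in sorted(Path(folder).glob("*.txt")):
        text = path.read_text(encoding="utf-8")
        grade = textstat.flesch_kincaid_grade(text)
        flag = "REWRITE" if grade > TARGET_GRADE else "ok"
        print(f"{path.name}: grade {grade:.1f} [{flag}]")

if __name__ == "__main__":
    screen_mail_pieces("draft_mailing_1")  # hypothetical folder of draft letters and brochures

A check of this kind could be run each time a mail piece is revised, so that grade-level problems are caught before materials go to cognitive testing.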

Second, the messaging contained in the mail materials was also found to be highly repetitious, both in terms of verbatim and paraphrased repetition. Over 90% of the content in

the fourth and fifth mailings repeated, verbatim, messages from multiple previous mail items. I do not see how this repetition can convince a reluctant household to respond. Repetitious messaging only benefits a household that has skipped or missed previous communications.

New messaging, on the other hand, can benefit audiences whether or not they have read prior communications. I strongly recommend that the Census Bureau rethink its strategy toward repetitious mailings. Follow-up mailings do not add new messages to motivate respondents with different appeals to respond. While some repetition is required for a respondent who may be reading a follow-up mailing as a first contact, new messaging in each contact would benefit new readers as well as people who had read previous messages, while repetitious mailings only benefit the new readers.
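The verbatim share reported above could in principle be checked automatically. The sketch below assumes each mailing's full text is stored in a plain-text file; the file names are hypothetical and the sentence splitting is deliberately crude, so it should be read as an illustration of the measurement rather than the content-analysis procedure used in this dissertation.

import re

def sentences(text: str) -> list[str]:
    """Split text into lowercased, whitespace-normalized sentences for exact matching."""
    parts = re.split(r"[.!?]+", text.lower())
    return [" ".join(p.split()) for p in parts if p.strip()]

def verbatim_overlap(later: str, earlier: list[str]) -> float:
    """Share of sentences in `later` that appear verbatim in any earlier mailing."""
    seen = set()
    for doc in earlier:
        seen.update(sentences(doc))
    later_sents = sentences(later)
    if not later_sents:
        return 0.0
    return sum(1 for s in later_sents if s in seen) / len(later_sents)

# Hypothetical usage: mailing_1.txt ... mailing_5.txt hold the full text of each mailing.
# texts = [open(f"mailing_{i}.txt").read() for i in range(1, 6)]
# print(f"{verbatim_overlap(texts[4], texts[:4]):.0%} of mailing 5 repeats earlier mailings")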

Third, rather than repeat messages in each mailing, messaging should adapt as responders are removed from the mailable universe, to reflect this change in potential respondent mindsets and attitudes. Writing messages that communicate new appeals to this changing audience may be more effective at convincing non-respondents to participate in the ACS. This content analysis uncovered unused messages that the literature has found useful to motivate people toward an action. However, to add new messages, some messages will need to be removed. While reducing the amount of repetition can help, this review also found some messaging that lacked strategic purpose. All messages in the ACS mail contact materials should be reevaluated. Every message, and every bit of space, needs to be purposefully designed to maximize response rates. The Census Bureau should leverage this list and develop mail materials that strategically communicate the appropriate messages across mail pieces so that the total number of messages remains low while new appeals are added to convert non-respondents.

Fourth, and perhaps most importantly, the ACS mail items need to draw a clear and unquestioned connection to the Census Bureau as the sponsor of the survey. While one out of every six messages currently communicates this, the content analysis found that the linkage between the Census Bureau and the ACS is potentially confused by the inclusion of additional government sponsors, inconsistent use of logos and design, and insufficient or poorly placed messaging. Streamlining the communication of Census Bureau sponsorship can declutter the mail materials and draw more attention to the sponsoring organization that matters most. In

Chapter 6, I discuss specific ways to redesign ACS mail communications to meet these recommendations.

Lastly, a concerted effort should be made to redesign mail materials with a consistent use of graphics, logos, color, font, and style elements. In a society increasingly skeptical of survey requests, every effort possible must be made to make mail communication materials flawless, so that potential respondents do not pause to question the validity of a survey request. The current use of these messaging elements is inconsistent and sporadic. When used strategically and designed as a single set of mutually supportive materials, these elements can draw a connection between items, increase the level of trust in later contact attempts, and heighten the professionalism of the mailings.

This review suggests that the current ACS mail communications were originally written without the guidance of a strategic plan for messaging and design across mail items. The path-dependent, incremental style of innovation used by the Census Bureau to update the ACS mailout methodology has not created contact materials that are written to optimize response, reduce costs, and decrease bias. In Chapter 6, I propose a set of revised materials that follow these messaging recommendations.

Chapter VI. Proposing a new implementation strategy for the American Community Survey

In this concluding chapter, I propose a revised ACS implementation strategy for testing against the current implementation strategy, with the aim of achieving high response rates at lower cost. This proposal differs significantly from the piecemeal, incremental, and path-dependent innovation favored by the Census Bureau, which has kept most of the current implementation strategy intact while testing single ideas. The proposal in this chapter draws upon past theories as well as previous research reviewed in this dissertation to create entirely new ACS mail communication materials.

My recommendations in this chapter need to be viewed as an attempt at "breakthrough" research. It is aimed at finding out if a new package of messaging elements, unconstrained by current implementation procedures, can improve ACS response. My design leverages interconnectedness between mailings and mail pieces in a way not currently done by the Census Bureau. I propose that the complete proposed package of messaging elements be tested against current procedures. The kind of research proposed here, if successful in improving response to the ACS, will not be able to explain exactly which specific new elements are responsible for improvements. However, if the messaging strategy introduced here is found to be effective at improving response or reducing nonresponse bias, testing for causality would be a logical next step. The ideas detailed here are a starting point for research to improve materials, not the end.

My specific plan is designed to strategically communicate messaging recommendations from this dissertation effectively across multiple mailings and mail pieces. Following recommendations from previous research and theorizing, I try to avoid overloading each

contact with too much information by reducing the total number of messages communicated in each mailing. I also try to communicate new messages across the mailings to convert non-respondents who were not convinced by the preceding messages to comply with the ACS response request. As respondents are removed from the mailing universe between mailings, the audience for the ACS mailing changes. My plan acknowledges this change in audience and employs new messaging, rather than repetition, to communicate messages designed to increase response rates from specific audiences in each mail communication. While the messaging will differ across mailings, the mail contacts are designed to share a similar design and be written in a similar tone so that all communications feel like they are part of a single conversation from a single source.

All materials use consistent design elements and plain language techniques to communicate in a way that is accessible to respondents with lower reading and education levels. My materials are designed specifically to appeal to a new generation of respondents who are more accustomed to receiving and processing information in smaller packages, while also maintaining a look and feel expected in government communications. I hypothesize that materials following this plan will not only increase response rates, but also reduce nonresponse bias by increasing the diversity of respondents to better reflect the national population.

My mailing strategy (shown in Figure 64) is also designed to reduce cost by achieving greater response in the self-administered part of the implementation strategy. Increasing overall response during the self-response period saves money by lessening the number of cases that are sent to expensive nonresponse follow-up operations, including in-person contacts.

Second, my strategy is designed to maximize response rates from trusting and compliant

audiences earlier in the mail contact period. By pushing more respondents to the web and

obtaining response earlier, the Census Bureau will save money on follow-up mailing costs and

processing costs for paper survey forms. Third, my plan removes some repetitious enclosures

(one instruction card, both FAQ brochures, and the multilingual brochure) from some of the

individual mailings. It also adds a 6th mailing hypothesized to increase response rates enough to

justify the added cost of the additional mailing by reducing the nonresponse follow-up workload during in-person data collection. In total, the reduction of mail pieces, as well as

increases in response rates and early internet responses, will save the ACS program money.
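To make the proposed contact schedule easier to work with, the brief sketch below encodes the mailing-day offsets from the redesign (detailed in Figure 64 and Table 20 below) as a simple data structure and computes calendar mailout dates for a given monthly panel; the offsets and removal points reflect my proposed design, while the function and variable names are illustrative only.

from datetime import date, timedelta

# (mailing number, days after the panel start, respondents removed before this mailing?)
REDESIGN_SCHEDULE = [
    (1, 0, False),
    (2, 7, False),
    (3, 21, True),   # responding households removed before mailings 3 and 4
    (4, 25, False),
    (5, 31, True),   # responding households removed again before mailings 5 and 6
    (6, 38, False),
]

def mailout_dates(panel_start: date) -> dict[int, date]:
    """Return the calendar date on which each proposed mailing would be sent."""
    return {number: panel_start + timedelta(days=offset)
            for number, offset, _removed in REDESIGN_SCHEDULE}

# Example: a sample panel whose first mailing goes out January 1.
print(mailout_dates(date(2020, 1, 1)))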

Figure 64. Proposed redesigned ACS mailout methodology

The first two mailings are sent to the entire monthly sample and are geared towards gaining

response from internet-ready, trusting, and compliant populations. Respondents are removed

and all remaining households are sent mailings 3 and 4. Mailing 3 introduces the paper survey

while mailing 4 acts as a reminder to respond while also adding the option to respond by

phone. Similar to current ACS production, respondents are removed after the fourth mailing.

Departing from production, the third mailing universe now has two mailings (5 and 6) that

incorporate specific messaging and design features to increase response from distrusting

populations to reduce nonresponse bias and reduce costs of NRFU operations. Table 20 shows

a comparison of my design to current ACS production.

Table 20. Mail strategy comparison: ACS production vs proposed redesign

ACS Production:
Mailing 1 (day 0): Instruction card (internet), Introduction letter, Multilingual brochure, FAQ brochure, Outgoing envelope
Mailing 2 (day 7): Bi-fold PSM letter
Remove respondents
Mailing 3 (day 21): ACS questionnaire, Instruction card (choice), Letter, FAQ brochure, Pre-paid return envelope, Outgoing envelope
Mailing 4 (day 25): Reminder postcard
Remove respondents
Mailing 5 (day 39): Bi-fold PSM letter
Mailing 6: NA

Dissertation Redesign:
Mailing 1 (day 0): PSM introduction letter
Mailing 2 (day 7): Letter, Language assistance card, Large outgoing envelope
Remove respondents
Mailing 3 (day 21): ACS questionnaire, Letter, Pre-paid return envelope, Outgoing envelope
Mailing 4 (day 25): Reminder postcard
Remove respondents
Mailing 5 (day 31): Graphic letter
Mailing 6 (day 38): Bi-fold PSM letter, #10 outgoing envelope

The First Mailing

The primary goal of this mailing is to obtain response from easy-to-reach households and to establish trust so that future communications are believed by all audiences. The current first mailing for the ACS contains 5 mail pieces (envelope, letter, multilingual brochure, FAQ brochure, and a cardstock instruction card) containing 129 total messages. This amount of content may overwhelm potential respondents and cause them to delay responding. Recent evidence suggests that informational brochures do not work in the first recruitment mailing

(Dirksz, et al. 2018). My initial survey communication is a single Pressure Seal Mailer (PSM) containing 24 English language messages. The first barrier to receiving a survey response is that

the initial communication needs to be opened in order for any messages inside to be read

(Dillman 2016). This makes the envelope, or in this case the outside of the PSM, extremely important because the messaging on this portion of the mail piece needs to convince a potential respondent that the mailing is real, important, and should be opened. This process can happen, on average, in seven seconds or less, so the outside of the mail package needs to be designed to be comprehended quickly and clearly (Vögele as cited in Chewning, 2019).

PSM’s are single pieces of paper that act as both an envelope and a letter. Printed on the outside of the paper is the mailing information, just like an envelope, including sending address, return address, stamp and other messages. On the inside of this piece of paper is printed an introduction letter. When folded, a machine binds and seals the edges of the paper.

To open the PSM, perforated edges of the paper need to be ripped, revealing the letter on the inside. This is a relatively new technology in mailing, and these mailers are often used to mail important information (such as rebate checks, school grades, legal notices, etc.), which may make them more likely to be noticed and opened (Risley, et al. 2017). The only potential drawback of a PSM compared to an envelope is that it can only contain a single side of a piece of paper to communicate messages. PSMs cannot contain inserts, survey forms, or double-sided letters. However, because the piece of paper containing the letter is also the envelope, they require lower assembly costs and are cheaper to mail than a letter inside an envelope. Because it is critically important that the first mailing is noticed and opened, and because I want to limit the number of messages in the initial contact, a highly visible and important-looking PSM with a single letter on the inside is the ideal choice for an initial survey communication. The outside of the PSM I developed is shown in Figure 65:

Figure 65. Mailing 1: Redesigned PSM (outside)

The red lines on Figure 65 represent where the PSM is folded. The middle portion of this page is the "front envelope" that is visible when the tri-fold PSM is folded for mailing. This page features a single Census Bureau logo in the top left, a simplified, verifiable return address to

Suitland, MD, and a callout box that says, "YOUR RESPONSE IS REQUIRED BY U.S. LAW". I removed the text from the currently used call-out box that said, "The American Community

Survey: Your Response is Required by law". Mentioning the American Community Survey on the envelope is unlikely to increase the chance that the envelope is opened because most people have not heard of the ACS.

I also removed two statements currently on the ACS production envelope. First, a statement that read "equal opportunity employer" was on the envelope due to an old policy that required this statement. However, this is no longer required. As mentioned, on envelopes the Census Bureau only has seven seconds to communicate with a potential respondent

(Vögele as cited in Chewning, 2019), and this message was unlikely to convince a household to open the letter. Instead, I added a new message that states “Official U.S. Government Mail” to the top of the mailing. The placement of this message is strategic. Some experts recommend placing a message (for example “FIRST CLASS MAIL” in red type and capitalized letters) below or near the stamp area, to highlight that the envelope uses a more expensive and important delivery postage than bulk rate mail (Huntsinger 2019). My message, “Official U.S. Government

Mail,” was a bit too long to place under the stamp area. In my design, I placed this message along the sight path commonly used to read envelopes to draw the eye towards the official first-class mailing stamp, as shown in Figure 66 (Vögele as cited in Cogan, 2019).

Figure 66. Sight pathway commonly used to read envelopes

As the respondent's eye tracks along the envelope from the middle (their address, to confirm the letter was sent to them) to the top left (the return address, to see who sent the letter), the eye then tracks to the stamp area. This message is placed so that it is noticed and draws attention to the postage area. The postage clearly communicates that this mailing could only be mailed if it came from the Census Bureau. This stamp confirms authenticity, if it is noticed. It is a powerful message that may be seen more often with the added message.

Second, I removed a statement that said, “Official Business Use, penalty for private use

$300”. It is possible that this message may make an envelope appear official, but research on this is lacking. This statement is meant to communicate that private persons cannot reuse the pre-paid first-class envelope to send personal mail. However, PSMs are not reusable. This message existing on the PSM is a design hold-over from a statement that may be needed on an envelope. Another holdover from ACS production exists on the PSM that is not necessary. For

ACS production purposes, all mail pieces require a form ID. The current ACS production PSMs have this form ID on both the inside and the outside of the mailer, placed in prominent locations below the return address on the cover of the PSM and on the top left of the letter, as shown in Figure 67.

Figure 67. Current ACS PSM outside and letter with form ID circled in red

Because the PSM is a single piece of paper, this form ID only needs to be printed in one location, on either the inside or the outside, not both. In my design, the form ID is on the back of the outside of the PSM page. When the Census Bureau designed PSMs for the ACS, it based design decisions on previous envelopes and imported messages and designs that are not useful and do not make sense on a PSM. My envelope and PSMs remove extraneous messaging to draw attention to the messages that communicate that the mailing is coming from the Census

Bureau and that something inside is required by law.

The inside of the PSM contains an invitation letter to take the ACS online. It is extremely difficult to communicate with many audiences in a single communication (Munodawafa 2008).

This mailing targets trusting and compliant respondents with a simple, straightforward message that states 1) that the household is in the ACS sample, 2) that responding is required by law, and 3) how to respond online. My initial contact resembles the length of a prenotice communication but includes instructions so that households can immediately respond. By communicating only necessary and basic information, this mailing can effectively convince compliant potential respondents to participate without distraction. Too much information, especially at the start of a multi-point communication, can overwhelm a potential respondent (McCormack 2014; Poldre 2017). Tech-savvy, educated, busy respondents are used to receiving information in small, digestible chunks (Wilmen, Sherman, and Chein 2017) and are accustomed to skim reading (Wolf 2018). My materials focus on communicating only critical information needed to reply. Convincing these potentially compliant respondents to respond on the web immediately is a primary goal of this mailing.

This simple mailing serves as a "foot-in-the-door" introduction that establishes an initial connection with households that may respond to later mailings. When making an introduction to set up a future sale, or ask, of a person, the initial contact should be simple, establish trust, and not get bogged down in details (Cialdini 2009;

Cialdini 2016). While some households will not act immediately based on this mailing (as they require additional motivation to comply), the first mailing with simple, limited messaging will still establish the initial connection between the household and the Census Bureau, increasing the chance that future communications can be seen and trusted. The recipient of a survey

solicitation must believe that the survey, and the survey sponsor, are legitimate. Most people have not heard of the ACS, and this lack of familiarity with the ACS is a barrier to gaining survey cooperation (Hagedorn, Green, and Rosenblatt 2014; Walker 2015). Conveying that the ACS is from the Census Bureau, a well-known and respected federal agency, will help establish the legitimacy of the ACS. Features in this mailing, and throughout all mail pieces, are designed to make the connection between the ACS and the Census Bureau clear. An example of my design is provided in Figure 68.

Figure 68. Mailing 1: Redesigned PSM (inside letter)


Figure 68 shows the inside contents of the PSM with the red lines indicating the folds.

Features of this letter include reduced text that removes extraneous messaging and focuses on establishing the survey as legitimate and real. A more personalized greeting ("Dear Resident") is used, replacing "From the Director of the U.S. Census Bureau"; the signature line instead identifies the signer of the letter as the director of the Census Bureau. Each line in the letter that follows is strategic. The first line of the current ACS letter states that the household is part of a random sample, which may confuse potential respondents or make people feel the survey request is unimportant. My introduction sentence is a factual statement that the Census Bureau contacts households every year to conduct a survey; it removes the random-sample phrasing, which can be misunderstood and make the ACS request appear less important.

The second sentence communicates that the ACS is used to distribute billions of dollars in funding and is so important that it is required by law, citing the laws that mandate response.

Noting that responding to the ACS is required by law is the most critical message in this communication, as shown in numerous Census Bureau studies (Dillman 1996; Phipps 2014;

Oliver, et al. 2017; Barth, et al. 2016; Oliver, Risley, and Roberts 2016). Placing this message in a prominent location above the first fold and in bold text was strategic. However, statements about being required by law are most effective when they also communicate why something is required by law (Dillman 2016). Current ACS letters do not frame the “required by law” messaging with a justification. The messaging in this mailing provides justification by first linking the required by law statement to the ACS being used to distribute 675 billion dollars in federal funding to communities. Second, it cites the laws that mandate response. For many compliant

audiences, this might be enough to garner a response. Citing the specific law that requires response may be useful for convincing less-trusting respondents who still value following the law. The detailed legal information was previously placed in the FAQ brochure, which may go unseen. Citing the law in the first letter adds credibility to the statement that the survey is required. According to plain language guidelines, this sentence may be too complex for some audiences, but the target audience of this mailing is specifically compliant internet users with a higher level of education.

On the top half of the page is a shaded call-out box that highlights that people can go online to respond. In previous ACS communication, the website was presented as

“https://respond.census.gov/acs”. The “https://” is unnecessary and I found no evidence that including this in a printed communication increases response or trust in an internet survey request. The “s” in “https” does signify that the website is on a secure server. While this was once rare, today many websites use such servers. Any limited benefit this may have has a drawback. The additional text takes up space and a less tech-savvy person who wants to respond online may think these characters are required to type into get to the website.

Coupled with the message to respond by smartphone, these extraneous characters may make typing the website address more difficult.

Another new element added to this letter is the use of color in the call-out box.

The current ACS mail communication materials incorporate color inconsistently across mailings.

In my design, shades of green (which is the color of the ACS paper survey instrument) are used strategically across mailings to build a connection between mail pieces. Immediately after the call-out box is a sentence noting that respondents can take the survey on their computer or

smartphone. This added language about smartphones is designed to get people into the survey instrument and started on a more convenient device. Respondents may not even be aware that they can take the survey from their smartphone. This sentence also includes a three-word statement about data security when it asks people to log into "our secure website". Telling people to log into a secure, government website to take a mandatory survey from the U.S.

Census Bureau may be enough to gain response.

Removed from the current ACS production letter is the detailed data security and cybersecurity message. Census Bureau policy dictates that all respondents are provided an opportunity to see these statements prior to responding. Currently, residents of a sampled housing unit address have only two ways to respond: self-response over the internet and assisted response by phone. Phone response is not yet communicated as an option, but it is possible someone calling for help is converted to respond by phone. Rather than place the lengthy and potentially off-putting data security statements in this letter, the data security statements will be provided at the mode of response. The internet instrument will display the statements so anyone replying online has a chance to see them. Very few people complete the survey over the phone early in the response period, so households that call prior to the third mailing will be read the confidentiality statements. This small amount of burden placed on a few telephone calls will reduce the amount of text sent to over 3 million households each year.

This new strategy meets the policy obligation to communicate data security and confidentiality without adding alarming statements in the first introduction mailing.

The following sentence communicates that a paper questionnaire is coming for those who want to respond in that way. Next, a bold sentence that states, “Your response matters”

was added. This phrase will be included as a "tag line" across mailings to build a connection, and trust, across mail pieces. The next line adds a reason for why the survey response matters and also provides a simple benefit statement without going into specific details. The placement of these messages below the response option is strategic. The most compliant potential respondents may have already stopped reading. Above this sentence, I strategically placed only messages that are required to respond (the name of the survey, that it is required by law, and how to respond). If someone was not convinced, they can continue reading and see these simple additional statements. Remember, every sampled household in my mailing strategy receives both mailings 1 and 2. Because my mailings are designed to work together as a single conversation, I designed these two mailings to act as a "one-two punch" rather than as two separate mailings. This allows me to strategically place messaging across two mailings knowing that all households receive both mailings. I do not need to include every potentially useful message in the first mailing. Moreover, doing so would add clutter and reduce the impact of those statements. Mailing one is deliberately brief, while mailing two will add additional messages with specific benefit statements for potential respondents who need those appeals to respond.

The bottom third of the mailing contains a “thank you” statement and the signature of the director of the U.S. Census Bureau, and for the purposes of this dissertation, a pseudonym,

“Lawrence Smith,” was used, not the name or signature of the real Census Bureau director.

Current ACS letters all contain the same “thank you” statement at the bottom of each letter. I am writing these letters to feel like a conversation with a real person and repeating the same thank you statement five times is disingenuous. In my design, each letter will contain a different

“Thank you” statement. The statement in this letter, “we appreciate your help, and thank you

for your time" is written to build a sense of unity with the respondents that will be continued in follow-up communications with specific messaging. This statement uses a larger word, "appreciate," than subsequent statements. This phrase was used in this mailing because the audience is compliant, educated, and internet-ready households. This word, which may be too difficult for some audiences, will not be used in follow-up mailings.

The letter also contains a message in Spanish that serves two purposes. First, Spanish speaking households may see this message and call the number to respond. Second, non-

Spanish speaking households may see this and find the inclusion of Spanish text makes this letter seem official, governmental, and important because it is common, and perhaps even expected, to see Spanish-language messaging on mailings from the government. The current

ACS production mailing package contains a fold-out multilingual brochure with messaging in six languages. This one line replaces that brochure with the language most likely needed by non-

English speaking respondents. Putting this message on the letter, rather than in a brochure, also makes it more likely to be seen, maximizing any benefit. Additional language support will be provided in mailing two.

As required by Census Bureau policy, the bottom of the page contains the “census.gov” website in small font. The heading on the top has also been streamlined. A single Census

Bureau logo is now placed in the most prominent position on the top left of the page (moved from the bottom right location). I removed the lengthy, problematic header and replaced it with a simple header with the return address of the Census Bureau’s main office location in

Suitland, MD. This address is verifiable and matches the one used on the outside of the PSM,

which removes a discrepancy from previous mailings that had different addresses on the outside and inside of mail packages.

The Second Mailing

The main purpose of this mailing, which is sent one week after the first mailing, is to communicate benefits to compliant and trusting recipients of the ACS response request. It will be sent in a large envelope along with a letter from the director and an insert that is essential for mailing and that provides the identification number for the respondent's address. All sampled members receive both the first and second mailing. The first mailing contained limited messaging to communicate necessary messages to trusting and compliant audiences. This mailing works in conjunction with the first mailing to provide more reasons for a trusting and compliant respondent to participate. People are more likely to perform a behavior when they feel the action will have consequences that benefit them in some way (Fishbein and Ajzen

2011; World Bank 2017; Artefact 2019). Perceiving that there are social or community benefits to survey participation may motivate some individuals to respond. Framing the benefits of survey participation as a community level benefit works across all types of respondents (Macro

2009; Hagedorn, Green, and Rosenblatt 2014). Specifically, community-level benefits messaging can be instrumental in recruiting local-minded potential respondents who distrust or are apathetic toward the federal government but are deeply involved in their local communities.

Mailing one contained one broad, nonspecific statement of community benefit. This mailing expands on that statement to communicate real, tangible benefits that ACS data have provided to communities in the United States. By communicating how the ACS has been used to help other communities, it may help respondents believe that the ACS can help their community as well.

The envelope for this mailing follows all of the same recommendations outlined for the outer PSM package and is shown in Figure 69.

Figure 69. Mailing 2: Redesigned outgoing envelope (front and back)

Using larger envelopes increases the chance that a mailing will get opened (Tarnai, et al. 2012).

This mailing arrives one week after the tri-fold PSM, so designing these mailings with considerable size and format contrast is strategic. The differences in size and format will signal to respondents that this mailing contains different information.

There are four differences on the outside front of this envelope compared to the outside of the first mailing PSM. First, I changed the call out box to say, “RESPONSE TO THIS

CENSUS SURVEY IS REQUIRED BY LAW”. The larger envelope has more space and allows room to add this direct connection to the Census Bureau in this location. The added text provides a stronger appeal to open the mailing, as people have heard of the Census Bureau and the call out box clearly says what is required by law – a response to a Census Bureau survey. Second,

the statement, "Official Business Penalty for Private Use $300" was added, as is required by

Census Bureau policy on envelopes. Third, for processing and assembly purposes, the envelope has a cut-out window through which an address sprayed onto the enclosed materials can be seen. This sprayed-on address also contains a login ID that respondents need to take the survey online. The Census Bureau cannot print this directly on an envelope, as envelopes can be discarded immediately when mailings are opened. PSMs do not have this issue, as their exterior is the same piece of paper as their interior letter, making it impossible to throw away the form ID.

The back of the envelope (shown in Figure 70) is completely different from the back of the redesigned PSM in mailing 1 and also from all envelopes used in current ACS production. First, unlike my redesigned PSM in mailing one, the envelope has a flap, which includes a Census Bureau logo so that the logo will appear on both sides of the envelope; PSMs did not have a flap. The larger difference is that I added text to the back of the envelope. In cognitive testing, text on the back of the envelope caused more research participants to claim they would save and open the envelope later. I added a lengthier mandatory message, "Your response to this U.S. Census Bureau survey is required by law," as well as language support statements to the back of the envelope. The language support statements on the envelope are strategic, as someone who does not read English may not open an English-language envelope. With statements in multiple languages on the back, the envelope does not need to be opened by sample members who speak these languages. The envelopes used in ACS cognitive testing also had language support but were shown to English monolingual audiences and still produced more positive feedback than blank envelopes. English-language speakers may find government mailings that include language support official because the mailing was important enough to be translated.

Figure 70. Mailing 2: Redesigned envelope (back)

With the address sprayed through a window on the front of the envelope, this mailing must contain a piece of paper on which the address and form ID are printed. To send a mailing in the large envelope, Census Bureau processing requires that the mailing contain a cardstock insert slightly smaller than the envelope. The current ACS methodology uses such a card, but my analysis found that this card provided little useful information. Because the card is required for shipping, I designed a cardstock insert that communicates new messages, giving the mail piece an additional purpose, as shown in Figure 71.

Figure 71. Mailing 2: Redesigned language card (front)

The front of the cardstock has a large white space for the address to be printed. The card also includes the website to respond online, a message that highlights the location of the user ID needed to respond online, and the tag line "Your response matters". The graphics on this card are also simplified compared to ACS production, which includes a shadowed image of people in the background. That image was only used in one place and did not connect with other graphic elements in the mailout procedure. Instead, I simply use the color green to connect this mail piece to other mail materials.

The current ACS production materials contain a multilingual brochure to communicate messages to non-English speaking households. Because this mail item did not increase response rates (see Joshipura 2010) but the information is important for representation of the ACS respondent population, I propose moving this important content from the multilingual brochure to the back side of the cardstock insert. The first benefit of this is that the information can be seen on the card without unfolding or opening a brochure. This card will devote more

space to the largest non-English-speaking population in the United States. I rebranded the back of the card to say the name of the American Community Survey in Spanish, "La Encuesta sobre la Comunidad Estadounidense," as well as the tagline "Your response matters" as "Su Respuesta Importa". The Census Bureau provides the ACS survey instrument online and on paper in Spanish, so I include detailed instructions in Spanish for these response modes. Speakers of all other languages need to call the Census Bureau help line to take the survey over the phone. For these languages, I reduced the text from the multilingual brochure to two simple sentences:

“Because you are currently in the United States, you are required by law to complete this survey. Please call [1-866-xxx-xxxx] to complete the survey with someone who speaks [insert language]” as shown in Figure 72.47

Figure 72. Mailing 2: Redesigned backside of the Identification Card

47 The Spanish-language text on this card is not professionally translated.

This language-support card retains the useful, unique messaging from the multilingual brochure and places the text on something that, unlike a brochure, does not need to be opened. It is contained in an envelope that also has language support, meaning this envelope may be opened by non-English speaking households. It combines two mail pieces into one, reducing the total number of messages and mail pieces.

The letter that accompanies this mailing is shown in Figure 73.

Figure 73. Mailing 2: Redesigned Letter (front)

This letter uses the same shades of green as mailing 1 and the instruction card to connect the designs of the mail pieces together. It also contains the same design features used in the first mailing, including the same header, logo, a date mailed, and a signature from the Census Bureau director. The first sentence acknowledges the previous mailing and thanks respondents if they have already responded. The purpose of this message is twofold. First, it is good practice to thank people who have responded. Second, for non-respondents, it thanks them for something they haven't done, which may activate feelings of reciprocity. A statement immediately follows asking them to respond if they have not yet done so, and the letter provides a similar green call-out box with the response website in bold, larger font.

The following text uses green headers and an F-pattern design that organizes content into smaller chunks of text and white space, which can ease readability, guide the reader through the text, and make the text appear less dense.48 By phrasing the headings as questions, the letter implies a conversation with the director and builds a sense of unity between the Census Bureau and potential respondents. The content of this letter is paraphrased from the first mailing, because this content is critically important. By placing it in a new format, different respondents may see and act upon the content. The first bullet adds new information that the ACS is used to distribute 675 billion dollars in federal funding, a new appeal that highlights the importance and authority of the survey. The first bullet also has directions to look at the back of the letter for more information.

The backside of this letter is used to communicate tangible benefits of ACS response.

Current ACS letters do not contain print on the back. By using both sides of the letter, this mailing maximizes the available space without adding more expensive additional pieces. An example of this is contained in Figure 74.

48 The idea for the F pattern design and use of color in the heading words was first proposed by Broderick Oliver of the U.S. Census Bureau. My letter and messaging is adapted from his original design.

Figure 74. Mailing 2: Redesigned Letter (back)

The back of this letter is designed to reach audiences with a new set of appeals and design elements. It is important to communicate the benefits of ACS response to local communities, but it is impossible to link the ACS to a benefit in every community. By communicating real examples of the benefits of the ACS through testimonials on how the ACS has been used to help communities across the United States, potential respondents may believe that the ACS can help their community too.49 On this mail item, I used narrative messaging to create an emotional and personal connection. This type of messaging can be more influential than fact-based or statistical messaging (Reinhart 2006; Reinhart and Feeley 2007; Dillard and Shen 2012; Wit, Das, and Vet 2008; Kreuter, et al. 2010; Ricketts, et al. 2010; Artefact 2019). This is especially true among audiences with low investment in an issue or topic (Braveman 2008), which is likely the case among randomly sampled survey participants. The testimonials provide narrative evidence that reinforces the fact-based message on the front that states the ACS is used to allocate 675 billion dollars of funding. The combination of a narrative message within a non-narrative communication can be persuasive (Gibson and Zillmann 1994; Zillman and Brosius 2000). Two of the testimonials are from “expert” users of ACS data. Such expert statements may add the appeal of authority to the communication (Cialdini 2009). Other testimonials rely more on an emotional appeal, for example, by mentioning disaster preparation and helping families. Because evidence suggests that narrative messages work better in audio and video delivery modes than in writing (Braveman 2008), my design uses pictures to add a visual element to the emotional testimonials to increase their potential impact.

49 Tangible benefit examples and testimonials used in this mailing were compiled by myself, Broderick Oliver, and Dorothy Barth of the U.S. Census Bureau.

The target audience for this mailing is still an educated, internet-ready audience, but this mailing extends the messaging to try to specifically convince local-minded respondents to participate. The language may be difficult for some potential respondents, but because the target audience is still more educated, internet-ready households, the phrasing should meet the expected reading level. The use of pictures may help communicate with lower-literacy audiences. After this mailing, all respondents will be removed from the mailing list. Follow-up mailings must acknowledge this shift in audience, and it will be increasingly important to write all remaining communications in plain language.

As noted in my analysis, the Census Bureau has done a poor job incorporating graphics into the ACS mailings. The current ACS mail contact materials contain an inconsistent mixture of graphics and colors. When new graphic elements, such as graphic flyers, data slides, or envelopes, were tested, the graphics were included in a single mail piece, were too extreme, and were not integrated into the mail strategy in a meaningful way. In my plan, consistent use of design elements draws a connection between all mail pieces. The bottom of this letter contains the same “tag line” statement used in mailing one, “Your Response Matters,” which will also be used in future mailings. This connection is also made through the use of color. Mailing 1 contained only a slight use of color. The letter in mailing 2 will be folded so that the front of the letter, which contains a little more color, is seen first. The back of this letter contains more use of green and two simple pictures that are only seen after trust in the mailing has been established. This use of color matches the instruction card, further building the connection between mail pieces. In my design, graphics and color have been integrated into the overall design in a more intentional way. This will continue throughout the mailings.

Third Mailing

Prior to mailing 3, which occurs fourteen days after the first mailing, all respondents are removed from the mailing list. The purpose of the third mailing is to speak directly to these non-respondents to reduce the burden they may feel toward responding. This is accomplished in two ways: first, by providing a paper survey form for households that do not want to, or cannot, respond online; second, by using targeted messaging to reduce the perception of burden for an audience that has not responded to the streamlined initial contact or to the mail package with detailed information on community benefits. The mailing format is now a large, approximately 9 x 11.5 inch envelope. During the writing of this dissertation, the Census Bureau decided to change the size of the ACS survey form to an 8.5 x 11 inch booklet to accommodate changes to the wording of some of the ACS questions.50 This survey booklet cannot be folded, so the mailing package must use a larger, approximately 9 x 11.5 inch envelope. This thicker survey may be perceived as longer and more burdensome than the previous ACS booklet. This procedural change heightens the need for messaging that can reduce the perception of burden people have towards taking the ACS.
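To make the sequencing concrete, the sketch below lays out the mail schedule and the respondent "cut" as a small data structure. It is a minimal illustration only: the offsets for mailings not specified in this chapter, the field names, and the function itself are assumptions, not actual Census Bureau processing systems.

```python
from datetime import date, timedelta

# Minimal sketch of the proposed mail sequence and respondent "cut."
# Offsets marked "assumed" are illustrative; only the 14-day and 4-day
# intervals are specified in this chapter.
MAIL_SCHEDULE = [
    # (mailing number, days after first mailing, cut respondents first?)
    (1, 0, False),
    (2, 7, False),   # assumed offset
    (3, 14, True),   # paper survey package; respondents removed before mailing
    (4, 18, False),  # reminder postcard, four days after the survey package
    (5, 25, True),   # assumed offset; respondents removed again before mailing
    (6, 32, False),  # assumed offset; final PSM before NRFU
]

def build_mailout(sample, responded_ids, mailing_no, first_mail_date):
    """Return the send date and mail universe for one mailing in the sequence."""
    _, offset, cut_first = next(m for m in MAIL_SCHEDULE if m[0] == mailing_no)
    send_date = first_mail_date + timedelta(days=offset)
    if cut_first:
        # Drop households that have already responded so later, more targeted
        # mailings go only to remaining non-respondents.
        sample = [hh for hh in sample if hh["id"] not in responded_ids]
    return send_date, sample

# Example: the third mailing goes out 14 days after the first, to non-respondents only.
sample = [{"id": 101}, {"id": 102}, {"id": 103}]
when, universe = build_mailout(sample, responded_ids={102}, mailing_no=3,
                               first_mail_date=date(2019, 9, 2))
print(when, [hh["id"] for hh in universe])  # 2019-09-16 [101, 103]
```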

All mailings in my design have a different appearance, which may help each mailing get noticed and opened. In the current ACS production, the first and third mailings use the same envelope, and the second and fifth mailings use the same PSM format. Though the Census

Bureau did not notice an impact from changing the size of PSMs (see Heimel, et al. 2019), this mailing format strategy is repetitive and goes against recommendations from survey

50 Changes were to accommodate a new Race question which now includes detailed reporting write-in lines for respondents who identify as White or Black or African American.

methodology literature (Dillman, Smyth, and Christian 2014). Inside the new envelope is a paper survey form, a letter, and a prepaid return envelope, removing the instruction card and FAQ brochure used in the current ACS production survey package.

The redesigned outgoing envelope is shown in Figure 75.

Figure 75. Mailing 3: Redesigned outgoing envelope (front)

The design elements on the front of the envelope mirror those in other mailings with a few minor changes. First, the “required by law” call-out box now also says “U.S. Census Survey Form Enclosed” to indicate that the form is now inside the envelope. The current ACS mailings use this space to say “American Community Survey Form Enclosed,” which is a weaker appeal to open the envelope, as people do not know the ACS. Second, I added a tagline to the bottom of the envelope to match a phrase used in other mailings. Similar messaging on envelopes was shown effective for other survey operations (Dommeyer, Elganaya, and Umans 1991; Edwards, et al. 2009; Ridolfo, et al. 2019). Repeating the tagline here may also help draw a connection between mailings and communicate that the respondent is important. The back of the envelope includes the Census Bureau logo on the flap and also adds two new symbols, one indicating that the mailing was printed in the United States, and the other, a universal symbol, indicating that the mailing is printed on recycled paper, as shown in Figure 76. These icons are useful in this mailing because it contains a 48-page survey form. I did not include other messaging on the back of this envelope for a few reasons. The second envelope had messaging on the back, and I did not want to repeat the same messages. Also, this envelope is slightly larger than a standard sheet of 8 ½ x 11 inch paper. It will likely be noticed by households with or without messaging on the back.

Figure 76. Mailing 3: Redesigned outgoing envelope (back)

The envelope contains the ACS paper survey form. The content of the survey form is out of scope for this dissertation, but the front and back covers of the form contain space for messaging that can influence a potential respondent's decision to participate. The front cover of the redesigned survey form is shown in Figure 77.

Figure 77. Mailing 3: Redesigned paper survey form (front)

In contrast to the current ACS survey form cover, this survey form clearly connects the ACS to the Census Bureau. A Census Bureau logo appears on the top left, replacing the Department of Commerce logo, and the multi-government-entity header was removed. Because research suggests that when a mailing contains a large survey form, other mail pieces in the package go ignored (Schwede 2013), the survey cover itself presents the option to respond online or by paper, now with accompanying icons that may help draw attention to the response options. Second, a sentence clearly states that the survey is conducted by the U.S. Census Bureau, removing any doubt about the survey sponsor. This is followed by a simple two-sentence statement that communicates that response is required by law, and in turn, the Census Bureau is required by law to protect respondent information. This phrasing follows research suggesting that communicating the Census Bureau's legal obligation to data security can help build trust even with some distrusting audiences (Hagedorn, Green, and Rosenblatt 2014). The legal codes are added to lend legitimacy to this statement, and the phrasing builds a sense of unity by saying that both the respondent's and the Census Bureau's actions are regulated by laws.

An outlined box on the bottom left contains information for select audiences: a phone number for those who need help, a phone number for those who need TDD assistance, and a phone number for those who need help in Spanish. Because all three statements involve telephone assistance, a phone icon was added. This icon is a more modern update to the one used on the current ACS mail survey, shown in Figure 78. Younger recipients of the ACS may have never seen this style of landline telephone. Although landline phones are outdated, the handheld phone receiver icon I used is still the universal call icon on smartphones, making the symbol recognizable to a wider audience.

Figure 78. Outdated telephone icon used in current ACS production

The right side of the survey form contains all questions currently on the front of the ACS survey. Because the content of the survey was not in scope for this dissertation, I left these questions as-is on the cover. However, during a full survey redesign, the decision to include these specific questions on the cover of the survey can be revisited to create room for other instructions or messaging. Removed from the survey form is a statement that provided a website to visit for more information. This web address was different from the response website and is a distraction from responding. To streamline the process, the only website provided on the survey form is the link that leads to the ACS response page. From this page, a potential respondent can click other links to find information. The ACS information page provided on the current survey does not provide an easy-to-find link to respond.

The redesigned back of the survey, shown in Figure 79, first adds a more prominent “thank you” message at the top of the page. Second, I added an area for respondents to provide feedback on their experience with the ACS. This is important to communicate a sense of reciprocity to respondents; after nearly 50 pages of providing information the Census Bureau wants, respondents should have an opportunity to say what they want. I simplified the mailing instructions and used the same Suitland, MD address communicated in other mail materials.

The bottom third of the survey form contains mandatory statements that I did not edit.

Figure 79. Mailing 3: Redesigned paper survey form (back)

A pre-paid return envelope is enclosed for respondents to mail back their survey form, as shown in Figure 80. All mail materials use one address, the location of Census Bureau Headquarters in Suitland, MD. A mailing that claims to be from the Census Bureau (which has been established at one location) but includes a pre-paid envelope that sends a respondent's personal information to a different location in Jeffersonville, IN, may appear to some to be a scam. This envelope can use the Suitland, MD address and can be programmed to be automatically routed to Jeffersonville, IN, for processing. There is no need to communicate the Jeffersonville, IN, address.

Figure 80. Mailing 3: Redesigned pre-paid return envelope

The third mailing package also includes a letter, as shown in Figure 81.

Figure 81. Mailing 3: Redesigned letter

This letter contains design elements and colors similar to the other mail pieces previously sent and would match those on the ACS paper survey form. Because the audience that responds by paper tends to be older and more likely from rural communities, a conversational, respondent-friendly tone is used throughout this formal letter. A respondent-friendly tone has been shown useful in previous ACS testing (Raglin, et al. 2004). There are now two ways to respond, making the callout box less necessary. Instead, the response options are accompanied by icons that will match those on the paper survey form and, later, in other mail pieces. In current ACS production, the internet option to respond is always listed first, as the goal is always to push people to the web. However, in the phrasing of this letter, where the director acknowledges fulfilling a promise to send the paper survey form, listing the paper response mode first made more sense. The option to respond by the internet instrument is still listed and is accompanied by an icon so that someone skimming the letter may take notice. Both response options ask respondents to respond by a specific date. Now that the paper survey form has been provided, all households have a way to respond in the mode they prefer, which makes adding the “respond by” date possible.

Under the response callout box, three headlines that group the accompanying text are used, similar to those in previous letters. The first headline repeats the tag line used in other mail pieces and on the envelope of this mailing. The first sentence communicates a “thank you” message to those who have responded. The second sentence adds a new message that communicates a benefit of ACS response worded to make the potential respondent feel empowered. The last line, “Together, we can help prepare for a better future,” is designed to build a sense of unity between the Census Bureau and the potential respondent.

The second heading presents multiple new appeals to respond. The first sentence uses a new phrase to introduce the required-by-law sentence, comparing the ACS to other civic duties like paying taxes and reporting for jury duty. This message may motivate some to respond to the survey because doing so would be similar to other actions they have taken in the past. It also appeals to a respondent's sense of civic responsibility, which can also be a powerful motivator. After stating that the survey is required by law, a statement communicates that thousands of households have already responded this month, framing the ACS as a normal behavior that conforms to the expectations and actions of others. The purpose of these messages is to reduce a sense of burden on respondents by framing the ACS as normal, both consistent with previous civic actions and conforming to the actions of others. The third heading has messaging to communicate the help phone number and instructs the reader to see the back of the letter for more information, which is shown in Figure 82.

Figure 82. Mailing 3: Redesigned letter (back)

As previously stated, the Census Bureau is required to communicate certain messages to respondents prior to their responding. Up to this point, these messages were communicated either online (with the web instrument) or over the phone (if someone called for help and took the survey over the phone). Now, with the option to respond by paper questionnaire, these messages must be communicated in print in this mailing. Rather than include them on the front of the letter and repeat them in an FAQ brochure and across multiple mailings, as is done in current production, I include these statements a single time on the back of the third mailing letter.51

The purpose of the back of the letter is to communicate required information as well as some additional information that some respondents may need to feel comfortable enough to respond. The first three bullets contain added and potentially useful information, such as hours of operation for the ACS call center, an added explanation that the “.gov” web address confirms authenticity, and details about the sampling procedures for the ACS and the number of sampled households, which can make response feel more normal. The fourth and fifth headers contain the required information, much of which was previously communicated in multiple letters and the FAQ brochure. The placement here serves two purposes. These statements are not up for review, so they cannot be written to follow plain language guidelines.52 Because these

51 The design and phrasing for the back of the third letter was a collaborative effort with Census Bureau staff. 52 The policies that mandate this language can change, but the process to do so takes a long time and involves numerous stakeholders. I recommend that the Census Bureau revise the language that is mandated on survey communication materials. Because the complexity of these statements is high, I decided to hide these statements within other text, an idea originally proposed by Dorothy Barth of the U.S. Census Bureau. If the complexity and detail of the statements can be reduced, I would rather write new statements in a way that helps people. Because of policy regulations, this was not possible for this dissertation. Reducing the text, the number of messages, and the complexity of the message may help communicate messages that bolster confidence in response and do not raise fears.

statements may raise more concerns than they alleviate, surrounding these statements with other messaging may make the messages less prominent and less likely to be read. The text is still provided for those who are motivated to read it and may find it helpful, but it will likely not stick out or be read by anyone not deliberately looking for it.

On the bottom of the letter-back is additional Spanish-language support. I do not speak Spanish, and I am not familiar with the customs, norms, and phrasings used across this heterogeneous community. However, members of this community may be able to recommend better messaging than I have provided here, which is repeated from the mailing 2 language support card. Spanish-language respondents are important to the success of the ACS and this country, and the Census Bureau should devote appropriate resources to researching and varying Spanish-language messages across mail pieces as I have done in English.

Fourth Mailing

My plan for the fourth mailing is to send a reminder postcard four days after the paper survey package. The use of a postcard and the timing are both purposeful. Postcards remove the first barrier to response (opening the mailing) because the content of a postcard is immediately viewable, making postcards the perfect format for a reminder. The mailing is timed to arrive four days after the survey form so that the postcard arrives while the household still has the paper survey form.

Figure 83 shows the postcard with the tagline used in previous mailings, “Your response matters”. The first line of the postcard announces that response to the survey is required by law. Second, it communicates a message that respondents can respond in the way easiest for them, a new phrasing to introduce the choice of modes. The internet is still communicated as the first option. The paper survey form, which arrived just a few days prior, is presented as the second option. While respondents could have responded by phone by calling the help line, this postcard is the first time that this is presented as a response option equal to responding online or by paper survey. All icons on this postcard have been used in previous mail materials.

Figure 83. Mailing 4: Redesigned postcard (front)

Communicating the option to respond by phone in the fourth mailing may help drive response from people who need additional assistance. Converting as many households as possible to self-response is still important, but in the fourth mailing, I am equally concerned with getting response in whatever way possible to avoid the cost of in-person visits. Officially offering the option to respond by phone can reduce costs in follow-up operations. Providing this option on the postcard is also strategic. By placing it on a postcard, all remaining households can see this option without having to open any mail. Lastly, survey requests should incorporate language that indicates respect for the respondent's time and effort, so the postcard includes a statement to thank respondents for their time. A due date on the outside of a mail package was reported by cognitive testing participants to increase their likelihood of opening a mailing (Kephart, et al. forthcoming). While postcards do not need to be opened, and because the content side of the postcard already communicates that response is required by law, I placed a due date in the call-out box on the mailing side of the postcard to emphasize the date and increase the chance that the postcard is noticed and read, as shown in Figure 84.

Figure 84. Mailing 4: Redesigned postcard (back)

Fifth Mailing

After the fourth mailing, all respondents are removed from the monthly mailing sample.

Current ACS production uses a repetitive (in content and mailing package) PSM as a last attempt to gain self-response prior to expensive NRFU operations. Because converting these households to respond would significantly reduce the number of expensive personal visits, this mailing is critically important. The fifth mailing presents a unique opportunity to communicate new messages to a new audience of the most reluctant households. Rather than repeat messages and formats, my plan attempts something new in the fifth mailing, as shown in Figure 85.

Figure 85. Mailing 5: New graphic letter

The format of this mailing is a graphic letter hypothesized to attract respondents who did not respond to the previous four mailings. This mailing highlights the due date and links the due date directly to the survey being required by law. The message “there is still time to respond” is also worded to add a sense of urgency to the response, to indicate that time to respond is available but running out. The phrase “how will you respond to the American Community Survey” is used to seek commitment from respondents in a way not previously tried. Three response options are provided with icons to ease understanding, and Spanish-language support is added.

The bottom of the letter has a new element. Inside a green callout box, a message states “We want to help” and repeats the tag line from earlier mailings, “Your response matters,” before adding a new message: “If we do not hear from you soon, someone from the Census Bureau may visit your home to help you”. This plain-language sentence is accompanied by a picture of Census Bureau staff at someone's home to conduct the ACS. This picture and accompanying message communicate a critical new appeal to respond. People may not want to have someone from the Census Bureau come to their home. The use of the picture is meant to grab the attention of someone who may not be reading the text closely. The audiences for this mailing are the distrusting and cynical households that have not responded to four mailings from the Census Bureau. This audience, more than others, may want to avoid a visit from someone from the government. While they may not want to take the ACS, they may want even more to avoid a personal visit at their door.

The problem here is that for this message to be received, someone must open the envelope to see it. Distrusting households may not open mailings from the Census Bureau. Following successful cognitive testing that was never implemented in an ACS field experiment (see Reingold 2014a; Reingold 2014b), and evidence that messaging and pictures on envelopes can motivate potential respondents to open a mailing (Ridolfo, et al. 2019), I designed an envelope that places this same message and a picture on the outside back, as shown in Figure 86.

Figure 86. Mailing 5: New graphic mailing envelope (front and back)

The back of the envelope communicates a few critically important messages in addition to the message that a Census Bureau interviewer may visit a non-respondent's home. First, it states that this mailing is from the Census Bureau. Second, it communicates a due date on the outside of the mailing package, something that has been shown in cognitive testing to increase the chance that a mail package is opened (Kephart, et al. forthcoming). The tagline “Your Response Matters” is also included on the envelope and will also appear on the back of the letter.

My plan for the back of the letter is to develop a simple and sleek infographic that communicates benefits of the ACS. I provide a rough example in Figure 87.

Figure 87. Mailing 5: New graphic letter (back)

A professionally designed infographic that includes plain-language messaging with simple graphics to communicate how the ACS is used to help communities across the country may help skeptical respondents see responding to the ACS as a positive activity. Combined with the desire to avoid a visit from the Census Bureau, this may convince some reluctant households to respond.

Sixth Mailing

The current ACS mail strategy sends five mailings. Additional contacts have been shown to increase response rates and reduce costs for surveys in general (Dillman, Smyth, and Christian 2014) and specifically for the ACS (Chesnut 2010). Because the NRFU operation used by the ACS is so expensive, a sixth and final attempt to boost self-response is warranted. An increase in self-response rates will offset any costs of producing and mailing this final contact.

As opposed to ACS mailing contacts in the past, which could be moved, removed, or added with little consideration for other mailings in the mail contact strategy, this contact is specifically designed to follow the graphic-heavy fifth contact. I hypothesize that a mailing with a picture of a Census Bureau employee going to households would improve response from reluctant households. However, graphic letters may not be seen as official by all audiences. The sixth and final mailing is a specific appeal to an audience that expects a more traditional mailing. The format is a bi-fold PSM. This mailing format has not yet been used, again making it stand out to respondents, and it has been over a month since the first mailing, a tri-fold PSM, which may have gone unread or unnoticed. Sending a second PSM, in a new size, is strategic for the last mailing to induce response prior to the expensive NRFU operation. The outside of the mailing package contains a new message to encourage sampled households to open the mailing and respond, as shown in Figure 88.

Figure 88. Mailing 6: Redesigned PSM (outside)

The call-out box, now in red, states “Past Due: Response Required by law”. The use of red ink and the “Past Due” phrase tied to the “required by law” message will add a sense of urgency for people to open the mailing and to respond. The inside of the PSM, shown in Figure 89, echoes this message.

Figure 89. Mailing 6: Redesigned PSM (inside)

The “past due” message is repeated in red ink at the top of the letter above the tag line.

The letter then communicates a message that acknowledges the multiple attempts to gain response from the household. This is necessary because there is a chance someone could have missed the previous five mailings. If this mailing is opened first, it needs to communicate the previous contacts to justify the ACS response being past due. The second sentence links the “past due” message with wording that states the survey is still required by law, despite being past due. Next, a statement communicates that the response to the ACS is important enough for the Census Bureau to send someone to the household to complete the survey in person. The following section presents the response options in the style of a “commitment device” that other work has shown useful in gaining compliance with a requested task (Cialdini 2009; Milkman, et al. 2011; Matthews 2015). The response mode choices are no longer accompanied by icons but instead by bullet points that look like check boxes. This is to signal to respondents that they must choose one of the modes of response. The rest of the content is similar to other letters (such as a signature from the director, Census Bureau logos, and a phone number to call for help). With a limited amount of space in this mailing, I wanted to avoid writing too many messages. I would rather potential respondents focus on the important messages: that the response is past due, that someone may come to their home, that the ACS is required by law, and how to respond.

Summary and Conclusion

As discussed at the beginning of this dissertation, costs of survey operations and distrust in survey requests are on the rise. The Census Bureau is taking active steps to keep the tide of declining response rates that has impacted other survey operations from impacting the ACS, the premier source of demographic information for the United States. Finding new ways for survey operations like the ACS to improve response rates and reduce costs is increasingly important. The current Census Bureau strategy to recruit participants into the ACS produces, on average, a 63% response rate via self-response modes of data collection (Baumgardner 2018). If this response rate can be improved to reduce the costs of in-person follow-up, or if response can be pushed to the internet earlier in the self-response period even by a marginal amount, the cost savings to the Census Bureau could be in the millions of dollars. Moreover, increasing the ability of the ACS mail communications to recruit difficult-to-survey populations may increase data quality by reducing non-response bias.
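To give a rough sense of the scale involved, the back-of-envelope sketch below computes the follow-up savings from a one percentage point gain in self-response. The per-case NRFU cost, the assumption that every non-respondent receives follow-up, and the rounded sample size are illustrative assumptions, not published Census Bureau figures.

```python
# Hypothetical back-of-envelope savings calculation; all inputs except the
# 63% self-response rate (Baumgardner 2018) are illustrative assumptions.
annual_sample = 3_500_000      # approximate annual ACS sample of addresses (assumed round figure)
self_response_rate = 0.63      # average self-response rate
cost_per_nrfu_case = 50.0      # assumed cost of one in-person follow-up case, in dollars

def nrfu_cost(rate):
    """Cost of in-person follow-up if every non-respondent were followed up."""
    return annual_sample * (1 - rate) * cost_per_nrfu_case

savings = nrfu_cost(self_response_rate) - nrfu_cost(self_response_rate + 0.01)
print(f"Assumed savings from a one-point gain in self-response: ${savings:,.0f}")
# Under these assumptions, roughly $1.75 million per year.
```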

In this dissertation, I focused on the messaging in the ACS mail communication materials as a way to make practical, actionable changes to the ACS that can increase response rates, reduce costs, and improve data quality by reducing nonresponse bias. As a topic relevant to sociology and of great importance to social science researchers, demographers, policy analysts, business leaders, and community organizers, improving the quality of the ACS is an important task requiring an independent and thorough investigation. To build the argument that messaging in the ACS mail communications was insufficient, I presented a history of ACS research and innovation to show that the Census Bureau has a history of incremental and path-dependent innovation leading to some improvements, but the improvements have been slow, and the process of innovation has not paid sufficient attention to messaging.

A single theory on survey messaging does not exist, so I combined literature from fields including communications, marketing, sociology, and survey methodology, along with tests from the Census Bureau, to develop a list of recommendations for survey communications. I then used this list to conduct an analysis of the current ACS mail communication materials. If this analysis had shown that the ACS materials followed most of the best-practice recommendations, it would have been possible to improve ACS response rates by making minor edits to existing materials. However, my review showed that the process of innovation of ACS materials has led to materials that have gaps and inconsistencies across mail pieces and that many of the recommendations found in relevant literature are not implemented in the current ACS materials. Rather than continuing to make minor alterations to existing materials, this review suggests that a complete redesign of these contact materials is required.

In this chapter, I presented how I would operationalize the recommendations from the previous chapters and the lessons learned from my analysis of current ACS materials to create a series of mutually supportive mail communication materials designed to increase response rates and reduce nonresponse bias. By linking broad recommendations to specific mail pieces across six mailings, the Census Bureau can use these ideas to conduct cognitive and field tests against current ACS production materials. By incorporating recommendations from relevant literature and lessons learned from the comprehensive review of current ACS materials, this series of recommended designs makes numerous unique contributions to the ACS methodology. These recommendations could only have come from the process used in this dissertation, which took a broad look at the ACS methodology and recommended changes not based on current materials and previous design decisions.

My design reduces the total number of mail pieces sent to respondents in each mailing.

Reducing the number of mail pieces in a single mailing is critically important to reduce the potential for information overload. My initial mailing reduces the number of messages by over seventy-five percent compared to the current ACS initial mailing. My review found that the Census Bureau originally sent a pre-notice notification of a length similar to the first mailing I propose in my design. However, due to changes in ACS methodology that removed and combined mailings and moved mail pieces, the current ACS introduction mailing contains five mail pieces and far too many messages. Also, the individual mail pieces in the same mailing are highly repetitive because each mail piece was designed independently and was originally sent in a different mailing. Almost by happenstance, these mail pieces all ended up in the initial mailing, creating a bloated and overwhelming package of information for potential respondents. By comparison, my introduction mailing is much more approachable and is designed specifically to drive response rates earlier in the self-response period from compliant and trusting audiences. This audience is predisposed to respond but tends to have higher rates of education and is used to receiving information in smaller, manageable chunks. My initial mailing is designed to speak to this audience and drive up response immediately by removing unnecessary messaging and barriers to response.

I leveraged knowledge of the changing ACS audience throughout my design to write mail communication materials following plain language guidelines and with messaging that is specifically designed to increase response. The first mailing was designed to establish trust and to gain response from the most compliant respondents. The second mailing continues the conversation with respondents and adds messages that communicate tangible, real examples of community-level benefits of ACS response. Community-level benefits are consistently ranked as more influential than other messages (Conrey, ZuWallack, and Locke 2012; Hagedorn and Green 2014; Hagedorn, Green, and Rosenblatt 2014), and communicating benefits through real examples may be a more effective way to motivate people to comply with a requested action (Dillard and Shen 2012; Artefact 2019). These messages directly speak to a specific mindset of potential respondents, the local-minded, who value actions that directly benefit their community more than actions that may abstractly benefit their nation. Current ACS mailings only mention vague community-level benefits, such as building roads, hospitals, and schools, and then repeat this same message, nearly verbatim, across mailings.

Follow-up mailings in a survey operation are often referred to as reminder notifications.

However, the high degree of repetition in the ACS mailings implies that the follow-up mailings aren't reminders; they are replacements. If a respondent wasn't convinced the first time, why would the same message sway them the second time? It appears the logic in the current ACS operations isn't to remind, or to provide new information in follow-up mailings, but to repeat and replace an old mailing, assuming perhaps that the original mailings were missed or unread. The logic of repetitive mailings assumes that respondents are not reading previous mailings. If a mailing is seen as a replacement, its purpose is to communicate the same messages because a household might be seeing any given mailing as the first mailing.

While it is helpful for each mailing to contain certain messages (such as how to reply to the ACS), the logic of repeating messages across mailings and mail pieces is limiting. For example, imagine for a moment two households. The first household was on vacation when the first four mailings were sent, and they open the fifth ACS mailing as the first mail contact from the ACS. To this household, it would not matter what was communicated in the previous mailings. Any message in this mailing would be new. Anything could help induce a response.

However, if a different household received all five ACS mailings, and each was opened and at least skimmed, then how does a repetitive mailing help this household? The previous messages haven't resulted in a response, but new messaging would at least present the possibility of a new motivation to elicit a response. In short, non-repetitive mailings speak to all households, while repetitive mailings speak only to households that have missed previous mailings. My design applies a completely different logic and approach to the mail contact materials. To maximize potential response, each mailing should communicate messages useful to someone who has not read previous mailings, but also not be repetitive, so that it maximizes impact on someone who has read previous mailings.

The presence of a reminder contact is, alone, a powerful tool to induce a response because simply receiving anything in the mail can remind someone to respond. By incorporating new messaging in each contact, my designs maximize potential response by communicating new messages that speak to all households, not just those that have ignored previous Census Bureau communications. Many surveys look to the Census Bureau and the ACS for guidance on how to conduct their survey operations. Many surveys use similar repetitive messaging across mail communications. By changing the ACS and applying an entirely new logic to the design of materials, the designs proposed in this dissertation can provide an example to other survey operations to reinvent their mail contact strategies to be less repetitive and more purposeful.

Messages are added in the third mailing to reduce the perception of burden of responding by framing ACS response as a normal civic duty similar to respondents' previous actions and the actions of those in their community. Additional messages are added to communicate that thousands of households each month have already responded to this request. This message can reduce the sense of burden on respondents worried about providing information to the Census Bureau, because thousands of others have done so as well. The ACS is a normal activity of civic duty. Previous Census Bureau messaging has not communicated this to respondents. Adding a due date in the third mailing further demystifies the task of response by communicating a clear expectation for when the task is due. At work, I would never assign a task to a colleague without informing them when the task is due. That would make the task seem unimportant and potentially confusing. The third mailing adds the option to respond by paper for those who were previously unwilling, or unable, to respond online.

To further reduce burden on some respondents, the fourth mailing does something that the current ACS mail materials do not do: it explicitly communicates that sampled households can respond by calling the Census Bureau by phone, as a response mode equally valid as responding online or by paper. By highlighting the option to respond by phone, this mailing may help gain response from people who require or desire assistance. By introducing this in the fourth mailing, households have more time to call the ACS to complete the survey before the far more expensive personal visit operation begins.

Similar to current ACS procedures, all respondents are removed from the mailing sample after the fourth mailing. After this cut, the Census Bureau sends one final reminder to respond, a bi-fold PSM. This mailing format has already been used (in the second mailing), and about 90% of its content is repeated from multiple previous mailings. The logic used by the Census Bureau seems to be to send one final communication to induce response, but there does not seem to be much effort in this attempt. This is surprising, because the final mailing is critically important to increase response rates prior to expensive personal-visit NRFU operations. This is odd because the Census Bureau devotes resources each month to conduct the “cut” of the sample universe so that this mailing is not sent to households that have already responded. While this process respects responding households, the procedures are already in place to make this mailing something unique that speaks to this new universe of the most reluctant non-respondents. It is a great opportunity to send something new, with targeted, specific messaging. This opportunity is not currently taken advantage of by the Census Bureau.

In my design, I replace this single, repetitive mailing with two new mailings. The first is something completely new: a full graphic letter. At this point, the remaining households have not responded to four mailings. A saying often attributed to Albert Einstein puts it well: “the definition of insanity is doing the same thing over and over again but expecting different results.” Rather than a repetitive mailing, I apply a new approach that may help gain response from remaining households. We know at this point that these households are more rural, less educated, and more distrusting. A professionally designed graphic letter containing simple language and graphics may catch the attention of this audience.

Specifically, I propose that the letter and envelope in this mailing contain a picture of a Census Bureau employee at someone's house conducting the ACS. This would communicate to potential respondents, without words, that someone from the Census Bureau may come to their home if they do not respond. I pair this picture with a message that communicates that the Census Bureau does this because they want to help, and because the respondent is important. It also communicates something skeptical households may want to avoid: a personal visit from the Census Bureau to their home. The use of a picture is strategic for communicating with potential respondents who may only skim the text of letters. This picture can communicate without words that someone from the Census Bureau may be coming to their home, and this may motivate response from both compliant and distrusting audiences that do not want such a visit. Adding this picture to the outside of the envelope may also catch the attention of households wanting to avoid a visit.

The letter inside would also include a full-color infographic on the back to communicate benefits of survey response in a way approachable to people with low literacy skills. This use of design and color would match the use of design and color in other mailings. Throughout my design, I incorporated the color green, the color of the ACS mail survey instrument, to tie mailings together. Though the use of graphic elements has not been shown to increase response rates to the ACS in the past, I have noted the flaws in the designs and tests of previous graphic elements. In short, the Census Bureau has done a poor job incorporating new graphic elements into the ACS mail contact strategy. The strategy currently uses multiple graphic mail pieces (two instruction cards, an FAQ brochure, and a multilingual brochure) that use inconsistent color and design elements. The ACS then tested adding new graphic materials to this already confusing set of graphics, and unsurprisingly, those attempts failed to boost response. I propose testing a series of mail pieces that purposefully incorporate varying levels of graphic elements to see if specific, integrated use of graphics can increase overall response rates or response rates from less trusting populations.

My review of the current ACS materials pointed out perhaps the most egregious flaw: the ACS materials may not draw a strong connection between the ACS and the Census Bureau. In my designs, I focus messaging to highlight the strength of the Census Bureau brand. Envelopes, return addresses, and headers were all simplified to clearly communicate that the Census Bureau, a known and trusted government agency, conducts the ACS. Mentions and logos from extraneous government entities were removed. The Census Bureau logo was placed in a more prominent position, and incomprehensible Form IDs were moved to less prominent positions.

The sixth mailing uses a new PSM format to, once again, vary the size and look of the mailing. All six mailings in my design use different-sized envelopes so that each mailing is unique and more likely to be noticed. A critical message is also added in the last mailing: that a response to the legally mandated ACS is past due. My design provides a “respond by” date in mailing three. This is strategic both to drive as much response as possible prior to the fifth mailing (reducing the follow-up costs of mailings 5 and 6) and to drive response that avoids in-person visits. Following this, my design escalates the language to communicate a “response due” date on the front of the fourth mailing postcard and on the fifth mailing graphic letter to further push response prior to the costly in-person follow-up operation. In the final step of this staged messaging across mailings, my plan communicates a “past due” message in the sixth and final communication. The past due message is a critical complement to communicating a due date for the ACS.

The Census Bureau accepts ACS self-responses months past the initial self-response month. Most notably, some households choose to self-respond during the NRFU period, either because they were simply late getting to the survey task or because someone from the Census Bureau came to their home and convinced them to self-respond rather than respond via in-person interview. I do not want to lose these late responders, and communicating a due date could lead to a drop-off in response past this due date, despite responses still being accepted by the Census Bureau. Something else needs to be communicated later so that households know that they can still respond past this due date. The combination of a “due date” message with a later “past due” message may drive higher rates of response prior to the due date while also motivating other households to respond past the due date. Incorporating “respond by,” “due by,” and “past due” messages would be a new feature.53
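For clarity, the staged escalation could be expressed as a simple mapping from mailing number to due-date language, as in the sketch below. The exact wording and the example date are illustrative placeholders, not the tested mail text.

```python
# Illustrative mapping of the staged due-date language across mailings;
# the wording and example date are placeholders, not the production text.
DUE_DATE_LANGUAGE = {
    3: "Please respond by {due}.",                    # paper package: "respond by"
    4: "Response due by {due}.",                      # postcard: escalated phrasing
    5: "Response due by {due}. There is still time.", # graphic letter
    6: "Past due: response required by law.",         # final PSM, red call-out box
}

def due_date_message(mailing_no, due="October 15"):
    """Return the escalating due-date phrasing for a mailing, or None if it has none."""
    template = DUE_DATE_LANGUAGE.get(mailing_no)
    return template.format(due=due) if template else None

for n in range(1, 7):
    print(n, due_date_message(n))  # mailings 1 and 2 carry no due-date message
```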

Even after adding more costly graphic and color letters and a sixth mailing, my mailing strategy is designed to reduce costs. First, my design removes repetitive and unnecessary mail pieces. The materials that my plan removes, the two color brochures and the cardstock color instruction card, are some of the more expensive mail pieces to print and ship. I use the direct savings from dropping these materials to produce the new full-color graphic letter in the fifth mailing and to add the sixth mailing, increasing the total number of contacts. If my plan produces the intended results, increases response rates, and pushes more households to respond prior to in-person interviewing, this mailing strategy can significantly reduce costs to the ACS program.

My designs are part of a larger process currently underway to test new ideas to improve the messaging of Census Bureau surveys and to stay ahead of the curve of declining response rates. This work presents unique contributions to the effort to redesign ACS mail communication materials. However, all work is collaborative. While this dissertation's purpose is to break the cycle of incremental, path-dependent innovation, it must also still fit within current Census Bureau practices. One limitation of large survey operations within the federal government is that simplifying something is often very difficult. It is likely that the designs in this dissertation will not be tested as the package that I propose here. However, these designs can serve as an example of how I would operationalize the concepts and recommendations

53 After the start of this project, the Census Bureau planned to test a due date in the ACS mailout process. However, the test only communicated a due date in the fifth and final mailing (Kephart, et al. forthcoming). The test does not follow the plan in this dissertation to stage the messaging of the due date across multiple mailings with language that escalates the message from “respond by,” to “response due by,” to “past due.”

developed in this project and can be combined with other ideas and revisions from Census Bureau staff. These ideas can then undergo the necessary cognitive and usability testing that can inform redesigns of materials prior to field testing. In the end, the results of field testing, and evidence on whether these design features do in fact boost response, reduce cost, or reduce nonresponse bias, will dictate whether any of these ideas are incorporated into ACS production. One unique contribution of this work is that, by removing the design process from the incremental and path-dependent process used by the ACS, this project has produced a set of testable materials at a much faster rate than could have been accomplished within the current system of design and innovation. Testing any of these concepts may take a year, and likely more. The strength of this dissertation, thinking outside the current Census Bureau box, will also be a limitation to having these ideas tested and implemented as designed in this dissertation.

This dissertation focused directly on the ACS and the unique challenges facing a large, nationally representative, required-by-law federal survey. This can limit some of the specific recommendations made in this dissertation, for example, on how to best communicate that responding to the ACS is required by law. Few surveys have a legal mandate for responding.

However, the challenges of declining response and increasing costs are universal to all survey operations, and many of the recommendations in this dissertation can benefit other federal and nonfederal survey operations. While the analysis of current materials is designed specifically for the ACS materials, the logic of the analysis can be adapted to analyze mail communications for any survey operation. Similarly, the recommendations developed here are specifically for the ACS, but most can also be broadly applied in other survey contexts, for example, the guidance to produce less repetitive mailings with fewer, more intentional messages.

This dissertation marks the conclusion of a large, multi-year theoretical project with practical implications for the largest and most important survey in the United States. However, in many ways, this dissertation is just the beginning of the work required to reinvent survey communications for the ACS and other survey operations. The challenges facing survey operations are not going away, but the need for representative, unbiased data may be more important than ever. This dissertation is a call for all survey operations to pay more attention to the least-studied aspect of their operations: their mail communication materials and messaging. Together, sociologists, survey methodologists, and survey professionals can develop new methodologies to increase response rates, reduce costs, and create critically important unbiased data sets for our nation and communities across the country.

REFERENCES

Allen, Theodore and Tack Richardson. 2019. “Speech Acts: A Framework to Improve Clarity, Accountability, and Intent.” Strategic Communications Presentation, April 30, U.S. Census Bureau, Suitland, MD.

Amaya, Ashley, Felicia Leclere, Kari Carris, and Youlian Liao. 2015. “Where to Start: An Evaluation of Primary Data-Collection Modes in an Address-Based Sampling Design.” Public Opinion Quarterly, 79(2), 420-442.

Artefact Group. 2019. “Behavior Change Strategy Cards.” Retrieved from: www.artefactgroup.com/resources/behavior-change-strategy-cards/

Armstrong, J. Scott and Edward J. Lusk. 1987. “Return Postage in Mail Surveys: A Meta-Analysis.” Public Opinion Quarterly, 51(2):233–248.

Auerbach, Carl and Louise Silverstein. 2003. Qualitative Data: An Introduction to Coding and Analysis. New York: New York University Press.

Avdeyeva, Olga A. and Richard E. Matland. 2013. “An Experimental Test of Mail Surveys as a Tool for Social Inquiry in Russia.” International Journal of Public Opinion Research, 25(2): 173–194.

Bandilla, Wolfgang, Mick Couper, and Lars Kaczmirek. 2014. “The Effectiveness of Mailed Invitations for Web Surveys and the Representativeness of Mixed-Mode versus Internet-Only Samples.” Survey Practice, 7(4), 1-12.

Barth, Dorothy, Mary-Frances Zelenak, Mark Asiala, Edward Castro, and Andrew Roberts. 2016. “2015 Envelope Mandatory Messaging Test.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2016/acs/2016_Barth_01.html

Bates, Nancy and Yuling Pan. 2010. “Motivating Non-English-Speaking Populations for Census and Survey Participation.” Presented at the Federal Committee on Statistical Methodology, November 2-4, Washington, DC. Retrieved from: www.census.gov/ content/dam/Census/library/working-papers/2010/adrm/ssm2010-08.pdf

Baumgardner, Stephanie. 2013. “Tracking American Community Survey Mail Response during the 2010 Census.” Retrieved from U.S. Census Bureau: www.census.gov/acs/www/Downloads/library/2013/2013_Baumgardner_01.p

Baumgardner, Stephanie, Deborah H. Griffin, and David A. Raglin. 2014. “The Effects of Adding an Internet Response Option to the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2014/acs/ 2014_Baumgardner_04.html

Baumgardner, Stephanie. 2017. “Who Are We Most Likely to Reach with Digital Advertising?” Presented at the annual meeting of the American Association for Public Opinion Research, May 18-21, New Orleans, LA.

Baumgardner, Stephanie. 2018. “When is the Best Time to Field Your Survey? Trends in American Community Survey Response Rate.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, Denver, CO.

Berkley, Jenny. 2018. “ACS Respondent Demographics.” Presented at the U.S. Census Bureau, July 24, 2018, Suitland, MD.

Birnholtz, Jeremy P., Daniel B. Horn, Thomas A. Finholt, and Sung Joo Bae. 2004. “The Effects of Cash, Electronic, and Paper Gift Certificates as Respondent Incentives for a Web-based Survey of Technologically Sophisticated Respondents.” Social Science Computer Review, 22(3): 355–362.

Blau, Peter. 1964. Exchange and Power in Social Life. New York, NY: Wiley.

Blumberg, Stephen J. and Julian V. Luke. 2016. “Wireless Substitution: Early Release of Estimates from the National Health Interview Survey, January-June 2016.” Retrieved from Centers for Disease Control and Prevention: www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201612.pdf

Blumberg, Stephen J. and Julian V. Luke. 2019. “Wireless Substitution: Early Release of Estimates From the National Health Interview Survey, July-December 2018.” Retrieved from Centers for Disease Control and Prevention: www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201906.pdf

Bogen, Karen. 1996. “The Effect of Questionnaire Length on Response Rates—A Review of the Literature.” Proceedings of the Survey Research Methods Section of the American Statistical Association, 1020-2015.

Bowling, Ann. 2005. “Mode of Questionnaire Administration Can Have Serious Effects on Data Quality.” Journal of Public Health, 29(3):281–291.

Bradburn, Norman. 1978. “Respondent burden.” Proceedings of the Survey Research Methods Section of the American Statistical Association, 35-40.

Braverman, Julia. 2008. “Testimonials versus Informational Persuasive Messages: The Moderating Effect of Delivery Mode and Personal Involvement.” Communication Research, 35: 666-694.

Breton, Charles, Fred Cutler, Sarah Lachance, and Alex Mierke-Zatwarnicki. 2017. “Telephone Versus Online Survey Modes for Election Studies: Comparing Canadian Public Opinion and Vote Choice in the 2015 Federal Election.” Canadian Journal of Political Science, 50(4), 1005-1036.

Brick, J. Michael, Douglas Williams, and Jill M. Montaquila. 2011. “Address Based Sampling for Subpopulation Surveys.” Public Opinion Quarterly, 75(3):409-428.

Brick, J. Michael and Douglas Williams. 2012. “Explaining Rising Nonresponse Rates in Cross-Sectional Surveys.” The ANNALS of the American Academy of Political and Social Science, 645(1): 36-59.

Brick, J. Michael, W.R. Andrews, Pat D. Brick, Howard King, Nancy A. Mathiowetz, Lynne Stokes. 2012. “Methods for Improving Response Rates in Two-phase Mail Surveys.” Survey Practice, 5(3).

Bucks, Brian and Mick Couper. 2018. “The Fine Print: The Effect of Legal/Regulatory Language on Mail Survey Response.” Survey Practice, 11(2).

Chesnut, John. 2010. “Testing an Additional Mailing Piece in the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2010/acs/ 2010_Chesnut_01.html

Chesnut, John and Mary Davis. 2011. “Evaluation of the ACS Mail Materials and Mailing Strategy During the 2010 Census.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2011/acs/2011_Chesnut_01.html

Christian, Leah. 2007. How mixed-mode surveys are transforming social research: The influence of survey mode on measurement in web and telephone surveys. Unpublished doctoral dissertation. Washington State University, Pullman, WA.

Christian, Leah and Don A. Dillman. 2004. “The Influence of Symbolic and Graphical Language Manipulations on Answers to Paper Self-Administered Questionnaires.” Public Opinion Quarterly, 68(1):57-80.

Church, Allan H. 1993. “Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis.” Public Opinion Quarterly, 57(1): 62-79.

Cialdini, Robert. 1984. Influence: The New Psychology of Modern Persuasion. New York, NY: Quill.

Cialdini, Robert. 2009. Influence: Science and Practice. Boston, MA: Pearson Education.

Cialdini, Robert. 2016. Pre-Suasion: A Revolutionary Way to Influence and Persuade. New York, NY: Simon & Schuster.

Clark, Sandra L., Andrew Roberts, Jennifer Tancreto, and David Raglin. 2015a. “2015 Replacement Mail Questionnaire Package Test.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2015/acs/2015_Clark_02.html

Clark, Sandra L., Lauren DiFiglia, Jennifer Tancreto and David Raglin. 2015b. “2015 Mail Contact Strategy Modification Test.” Retrieved from U.S. Census Bureau: www.census.gov/ content/dam/Census/library/working-papers/2015/acs/2015_Clark_03.pdf

Clark, Sandra L. and Andrew Roberts. 2016. “Evaluation of August 2015 ACS Mail Contact Strategy Modification.” Retrieved from U.S. Census Bureau: www.census.gov/ content/dam/Census/library/working-papers/2016/acs/2016_Clark_01.pdf

Clark, Sandra L., R. Chase Sawyer, Amanda Kilmek, Ellen Wilson, Christopher Mazur, and William Chapin. 2018. “Reducing Burden and Improving Data Quality: Can the American Community Survey Accomplish this with Administrative Records?” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, Denver, CO.

Comley, Pete. 2006. “The Games We Play: A Psychoanalysis of the Relationship Between Panel Owners and Panel Participants.” Proceedings from the ESOMAR World Research Conference. 317: 123–132.

Cook, Colleen, Fred Heath, and Russell L. Thompson. 2000. “A Meta-Analysis of Response Rates in Web- or Internet-Based Surveys.” Educational and Psychological Measurement, 60(6): 821-36.

Conrey, Frederica, Randal ZuWallack, Robynne Locke. 2012. “Census Barriers, Attitudes, and Motivators Survey II Final Report.” Retrieved from U.S. Census Bureau: www.census.gov/library/publications/2012/dec/2010_cpex_205.html

Creswell, John. 2013. Qualitative Inquiry and Research Design: Choosing Among Five Approaches (3rd ed.). Los Angeles, CA: Sage Publications.

Czajka, J. and Beyler, A. 2016. “Background Paper: Declining Response Rates in Federal Surveys: Trends and Implications.” Retrieved from U.S. Department of Health and Human Services: aspe.hhs.gov/system/files/pdf/255531/Decliningresponserates.pdf

Daly, Jeanette M., Julie K. Jones, Patricia L. Gereau, and Barcey T. Levy. 2011. “Nonresponse Error in Mail Surveys: Top Ten Problems.” Nursing Research and Practice.

Das, Marcel and Mick P. Couper. 2014. “Optimizing Opt-out Consent for Record Linkage.” Journal of Official Statistics, 30(3): 479–497.

de Leeuw, Edith D. 2005. “To Mix or Not to Mix: Data Collection Modes in Surveys.” Journal of Official Statistics, 21(2): 233-255.

de Leeuw, Edith D. 2008. “Choosing the Method of Data Collection.” in International Handbook of Survey Methodology, ed. Edith D. de Leeuw, Joop J. Hox and Don A. Dillman. Oxford, United Kingdom: Taylor and Francis.

de Leeuw, Edith D., Joop J. Hox, and Don A. Dillman. 2008. “The Cornerstones of Survey Research.” in International Handbook of Survey Methodology, ed. Edith D. de Leeuw, Joop J. Hox and Don A. Dillman. Oxford, United Kingdom: Taylor and Francis. 1-17.

de Wit, John B. F., Enny Das, and Raymond Vet. 2008. “What Works Best: Objective Statistics or a Personal Testimonial? An Assessment of the Persuasive Effects of Different Types of Message Evidence on Risk Perception.” Health Psychology, 27: 110–115.

Deci, Edward L. and Richard M. Ryan. 1985. Intrinsic Motivation and Self-Determination in Human Behavior. New York, NY: Plenum Press.

Dillard, James and Lijiang Shen. 2012. The SAGE Handbook of Persuasion: Development in Theory and Practice, Second Edition. Thousand Oaks, CA: Sage Publications Inc.

Dillman, Don A. 1978. Mail and Telephone Surveys: The Total Design Method. Hoboken, NJ: Wiley.

Dillman, Don A. 1991. “The Design and Administration of Mail Surveys.” Annual Review of Sociology, 17:225-249.

Dillman, Don A. 1996. "Why Innovation is Difficult in Government Surveys." Journal of Official Statistics, 12(2): 113-124. (Earlier versions in 1994 Annual Research Conference Proceedings, U.S. Bureau of the Census, Washington, DC. 213-223 and 1995 Survey Research, 26(1-2).)

Dillman, Don A. 2000. Mail and Internet Surveys: The Tailored Design, Second Edition. John Wiley: Hoboken, NJ.

Dillman, Don A. 2007. Mail and Internet Surveys: The Tailored Design, Second Edition – 2007 Update. John Wiley: Hoboken, NJ.

Dillman, Don A. 2014. “Review of Proposed Materials for Improving Response Rates and Survey Awareness of the American Community Survey.” Memo to the U.S. Census Bureau, August 13, 2014.

Dillman, Don A. 2016. “Part 2. What Theory Can Tell Us About Why People Do or Do Not Answer Survey Requests” Presented at the Summer at Census program, July 11, Suitland, MD.

Dillman, Don A. 2017a. “The Promise and Challenge of Pushing Respondents to the Web in Mixed-mode Surveys.” Survey Methodology, Statistics Canada, Catalogue No. 12-001-X, 43(1).

Dillman, Don A. 2017b. “Web-Push Surveys and the Worldwide Challenge of Being Neither Too Far Ahead nor Behind our Respondents.” Presented at the United Nations Workshop on Statistical Data Collection, October 10, Ottawa, Ontario.

Dillman, Don A. 2019a. “Why Web-Push Methods are Needed and the Research Efforts Making Worldwide use Feasible.” Presented at the annual meeting of the American Association for Public Opinion Research, May 18, 2019, Toronto, Ontario.

Dillman, Don A. 2019b. “Towards Survey Response Theories That No Longer Pass Like Strangers in the Night.” Presented at the European Survey Research Association Biennial Conference, July 16, Zagreb, Croatia.

Dillman, Don A., Jon R. Clark, and Michael D. Sinclair. 1995. “How Prenotice Letters, Stamped Return Envelopes and Reminder Postcards Affect Mailback Response Rates for Census Questionnaires.” Survey Methodology, Statistics Canada, 21(2): 159–166.

Dillman, Don A., Eleanor Singer, Jon R. Clark, James B. Treat. 1996. “Effects of Benefits Appeals, Mandatory Appeals, and Variations in Statements of Confidentiality on Completion Rates for Census Questionnaires.” Public Opinion Quarterly, 60(3):376-389.

Dillman, Don A., Cleo Jenkins, Betsy Martin, and Theresa DeMaio. 1996. “Cognitive and Motivational Properties of Three Proposed Decennial Census Forms.” Retrieved from U.S. Census Bureau: www.census.gov/srd/papers/pdf/ssm2006-04.pdf

Dillman, Don A., Virginia Lesser, John Carlson, Robert G. Mason, Fern Willits, and Arrick Jackson. 1999. “Personalization of Mail Questionnaires Revisited.” Presented at the annual meeting of the Rural Sociology Society, Chicago, IL.

Dillman, Don A., Virginia Lesser, Robert G. Mason, Fern Willits, John Carlson, and Rob Robertson. 2002. "Personalization of Mail Surveys on General Public and Other Populations: Results from Nine Experiments." Proceedings of the Section on Survey Methods, The American Statistical Association.

Dillman, Don A. and Cleo D. Redline. 2004. "Testing Paper Self-Administered Questionnaires: Cognitive Interview and Field Test Comparisons." In Presser, Stanley, et al. (eds.), Methods for Testing and Evaluating Survey Questionnaires. New York: Wiley-Interscience. 299-317.

Dillman, Don A., Virginia Lesser, Robert G. Mason, John Carlson, Fern Willits, Rob Robertson, Bryan Burke. 2007. “Personalization of Mail Surveys for General Public and Populations with a Group Identity: Results from Nine Studies.” Rural Sociology, 72(4): 632-646

Dillman, Don A., Jolene D. Smyth, and Leah M. Christian. 2009. Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method, 3rd edition. John Wiley: Hoboken, NJ

Dillman, Don A., Jolene D. Smyth, and Leah M. Christian. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th edition. Hoboken, NJ: Wiley.

Dillman, Don A. and Pearce Greenberg. 2017. “Making Mail Communications More Effective: A test of Presuasion vs. Social Exchange Theories.” Presented to the 7th Conference of the European Survey Research Association, July 21, Lisbon, Portugal.

Dillon, Michaela, Jessica Majercik, Bonnie Moore, Kevin Rinz, Quentin Brummet, and David Sheppard. 2018. “Preliminary Research Investigating the Use of Administrative Records in the American Community Survey.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, Denver, CO.

Dirksz, Gerry, Lisa Lusskin, Beth Ponce, Paul Felstead, Josephine Leonard, and Paul J. Lavrakas. 2018. “Testing the Inclusion of an Informational Brochure in the First Recruitment Mailing to an ACS Sample in a Mixed Mode Survey.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, Denver, CO.

Dommeyer, Curt J., Doris Elganayan, and Cliff Umans. 1991. “Miscellany: Increasing Mail survey Response with an Envelope Tester.” International Journal of , 33(2):1-5.

Dubay, William H. 2008. “Working with Plain Language: A Training Manual.” Impact Information. Retrieved from: www.impact-information.com/Resources/working.pdf

Dutwin, David and Paul J. Lavrakas. 2016. “Trends in Telephone Outcomes, 2008-2015.” Survey Practice, 9(2)1-9.

Edwards, Phil, I. Roberts, M.J. Clarke, and Carolyn DiGuiseppi. 2009. “Methods to Increase Response to Postal and Electronic Questionnaires.” Cochrane Database of Systematic Reviews, 3(3).

Edwards, Michelle L., Don A. Dillman and Jolene D. Smyth. 2014. “An Experimental Test of the Effects of Survey Sponsorship on Internet and Mail Survey Response.” Public Opinion Quarterly, 78 (3):734-750.

Envelope.org. 2019. The EMA Guide to Envelopes and Mailing. Retrieved on October 13, 2019 from: www.envelope.org/guide

Erdman, Chandra and Nancy Bates. 2017. “The Low Response Score (LRS): A metric to Locate, Predict, and Manage Hard-to-Survey Populations.” Public Opinion Quarterly, 81(1): 144- 156.

Evans, Sarah, Jenna Levy, Jennifer Miller-Gonzalez, Monica Vines, Anna Sandoval Girón, Gina Walejko, Nancy Bates, and Yazmin García Trejo. 2019. “2020 Census Barriers, Attitudes, and Motivators Study Focus Group Final Report A New Design for the 21st Century” Retrieved from U.S. Census Bureau: www2.census.gov/programs-surveys/decennial/ 2020/program-management/final-analysis-reports/2020-report-cbams-focus-group.pdf

Esri. 2018. “Tapestry Segmentation.” Retrieved from: esri.com/tapestry August 9, 2018.

Federal Plain Language Guidelines. 2011. “Improving Communication from the Federal Government to the Public.” Retrieved from: www.plainlanguage.gov/index.cfm

Fernandez, Leticia, Rachel Shattuck, and James Noon. 2018. “The Use of Administrative Records and the American Community Survey to Study the Characteristics of Undercounted Young Children in the 2010 Census.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, Denver, CO.

Festinger, Leon. 1957. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press.

Feygina, I., Foster, L., and Hopkins, D. 2015. “Re: Potential Pilot Interventions to Increase Response Rates to Census Surveys.” Email from Social Behavioral Science Team established in 2015 to help federal agencies integrate behavioral insights into their policies and programs (Executive Order No.13707, 2015). Memo to the U.S. Census Bureau, September 10, 2015.

Fishbein, Martin and Icek Ajzen. 2011. Predicting and Changing Behavior: The Reasoned Action Approach. Oxford, United Kingdom: Taylor & Francis.

Flesch, Rudolf. 1949. The Art of Readable Writing. New York: Harper Collins.

Flesch, Rudolf. 1979. How to Write Plain English: A Book for Lawyers and Consumers. New York, NY: Harper Collins.

Fobia, Aleia C., Jessica Holzberg, and Jenny H. Childs. 2017. “Communicating Data and Privacy Use.” Presented at the Association of Public Data Users Annual Conference, August 13, Arlington, VA.

Fobia, Aleia C. and Jessica Holzberg. 2016. “Communicating Data Use and Privacy: In-person Versus Web Based Methods for Message Testing.” Presented at the annual meeting of the American Association for Public Opinion Research, May 12-15, Austin, TX.

Fox, Richard J., Melvin R. Crask, and Jonghoon Kim. 1988. “Mail Survey Response Rate: A Meta-Analysis of Selected Techniques for Inducing Response.” The Public Opinion Quarterly, 52(4): 467-491.

Fricker, Scott, Jeff Gonzalez, and Lucilla Tan. 2011. “Are you burdened? Let’s find out.” Presented at the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ.

Fricker, Scott, Craig Kreisler, and Lucilla Tan. 2012. “An exploration of the application of PLS path modeling approach to creating a summary index of respondent burden.” Proceedings of the Survey Research Methods Section of the American Statistical Association, 4141-55. Retrieved from Bureau of Labor Statistics: www.bls.gov/osmr/research-papers/2012/pdf/st120050.pdf.

Fricker, Scott, Ting Yan, and Shirley Tsai. 2014. “Response Burden: What Predicts It and Who is Burdened Out?” Presented at the annual meeting of the American Association for Public Opinion Research, Anaheim, CA.

Fulton, Jenna, Gerson D. Morales, and Jenny H. Childs. 2016. “Effectiveness of Messaging to Encourage Response to the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/srd/papers/pdf/rsm2016-04.pdf

Galesic, Mirta. 2006. “Dropouts on the Web: Effects of Interest and Burden Experiences During an Online Survey.” Journal of Official Statistics, 22(2):313-28.

Galesic, Mirta and Michael Bosnjak. 2009. “Effects of Questionnaire Length on Participation and Indicators of Response Quality in a Web Survey.” Public Opinion Quarterly, 73(2): 349-60.

Gentry, Robin. 2008. “Offering Respondents a Choice of Survey Mode.” Presented at the Council for Marketing and Opinion Research Respondent Cooperation Workshop, Las Vegas, NV.

Gentry, Robin J. and Cindy D. Good. 2008. “Offering Respondents a Choice of Survey Mode: Use Patterns of an Internet Response Option in a Mail Survey”. Presented at the annual meeting of the American Association for Public Opinion Research, May 5, New Orleans, LA.

Gerber, Eleanor and Gianna Dusch. 2007. “Comparing Navigation in Two Formats for Demographic Items in the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2007/adrm/rsm2007-36.html

Gibbs, Graham. 2007. Analyzing Qualitative Data (1st ed.). Thousand Oaks, CA: Sage Publications Inc.

Gibson, Rhonda and Dolf Zillmann. 1994. “Exaggerated Versus Representative Exemplification in News Reports: Perceptions of Issues and Personal Consequences.” Communication Research, 21: 603-624.

Gottlieb, Karla L. and Gail Robinson (Eds.). 2002. A practical guide for integrating civic responsibility into the curriculum. Washington, DC: Community College Press.

Gouldner, Alvin Ward. 1960. “The Norm of Reciprocity: A Preliminary Statement.” American Sociological Review, 25(2): 161-178.

Griffin, Deborah H. 2013. “Effect of Changing Call Parameters in the American Community Survey’s Computer Assisted Telephone Interviewing Operation.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2013/ acs/2013_Griffin_03.pdf.

Griffin, Deborah H. 2014. “Reducing Respondent Burden in the American Community Survey’s Computer Assisted Personal Visit Interviewing Operation – Phase 2 Results.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working- papers/2014/acs/2014_Griffin_01.pdf.

Griffin, Deborah H., Fischer, D. P., and Morgan, M.T. 2001. “Testing an Internet Response Option for the American Community Survey.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17-20, Montreal, Quebec.

Griffin, Deborah H., Joan K. Broadwater, Theresa F. Leslie, Pamela D. McGovern, and David Raglin. 2004. “Meeting 21st Century Demographic Data Needs – Implementing the American Community Survey Report 11: Testing Voluntary Methods – Additional Results.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/ library/working-papers/ 2004/acs/ 2004_Griffin_02.pdf

Griffin, Deborah H. and Todd R. Hughes. 2013. “Analysis of Alternative Call Parameters in the American Community Survey’s Computer Assisted Telephone Interviewing.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working- papers/2013/acs/2013_Griffin_02.pdf.

Griffin, Deborah H. and Dawn V. Nelson. 2014. “Reducing Respondent Burden in the American Community Survey’s Computer Assisted Personal Visit Interviewing Operation – Phase 1 Results (Part 2).” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/ Census/library/working-papers/2014/acs/2014_Griffin_02.pdf.

Griffin, Deborah H., Eric V. Slud, and Chandra Erdman. 2015. “Reducing Respondent Burden in the American Community Survey’s Computer Assisted Personal Visit Interviewing Operation – Phase 3 Results.” Retrieved from U.S. Census Bureau: www.census.gov/ content/dam/Census/library/working-papers/2015/acs/2015_Griffin_01.pdf

Gross, Bertram M. 1964. The Managing of Organizations. Thousand Oaks, CA: Sage.

Groves, Robert M. 2006. “Nonresponse Rates and Nonresponse Bias in Household Surveys.” Public Opinion Quarterly, 70(5): 646-75.

Groves, Robert M. and Robert L. Kahn. 1979. Surveys by telephone: A national comparison with personal interviews. Ann Arbor, Michigan: Institute for Social Research.

Groves, Robert M., Robert Cialdini, and Mick P. Couper. 1992. “Understanding the Decision to Participate in a Survey.” Public Opinion Quarterly, 56(4): 475–495.

Groves, Robert M. and Mick P. Couper. 1998. Nonresponse in Household Surveys. New York: Wiley and Sons.

Groves, Robert M., Eleanor Singer, and Amy Corning. 2000. “Leverage-Saliency Theory of Survey Participation.” Public Opinion Quarterly, 64(3): 299–308.

Groves, Robert M., Stanley Presser, and Sarah Dipko. 2004. “The Role of Topic Interest in Survey Participation Decisions.” Public Opinion Quarterly, 68 (1): 2-31.

Groves, Robert M., Mick P. Couper, Stanley Presser, Eleanor Singer, Roger Tourangeau, Giorgina Piani Acosta, and Lindsay Nelson. 2006. “Experiments in Producing Nonresponse Bias.” Public Opinion Quarterly, 70(5): 720-736.

Groves, Robert M., Stanley Presser, Roger Tourangeau, Brady T. West, Mick P. Couper, Eleanor Singer, and Christopher Toppe. 2012. “Support for the Survey Sponsor and Nonresponse Bias.” Public Opinion Quarterly, 76(3): 512–524.

Hagedorn, Sam and Robert Green. 2014. “ACS Messaging Research: Refinement Survey.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2014/acs/ 2014_Hagedorn_03.html

Hagedorn, Sam, Robert Green, and Adam Rosenblatt. 2014. “ACS Messaging Research: Benchmark Survey.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2014/acs/2014_Hagedorn_01.html

Hagedorn, Sam, Michael Panek, and Robert Green. 2014. “American Community Survey Mail Package Research: Online Visual Testing.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2014/acs/ 2014_Hagedorn_04.pdf

Hallgren, Kevin A. 2012. “Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial.” Tutorials in Quantitative Methods for Psychology, 8(1): 23-34. Retrieved from National Institutes of Health: www.ncbi.nlm.nih.gov/pmc/articles/PMC3402032

Hallsworth, Michael, John A. List, Robert D. Metcalfe, and Ivo Vlaev. 2014. “The Behavioralist as Tax Collector: Using Natural Field Experiments to Enhance Tax Compliance.” Retrieved from National Bureau of Economic Research: www.nber.org/papers/w20007

Haraldsen, Gustav. 2004. “Identifying and Reducing Response Burdens in Internet Business Surveys.” Journal of Official Statistics, 20(2): 393-410.

Harter, Rachel, Michael P. Battaglia, Trent D. Buskirk, Don A. Dillman, Ned English, Mansour Fahimi, Martin R. Frankel, Timothy Kennel, Joseph P. McMichael, Cameron B. McPhee, Jill M. Montaquila, Tracie Yancey, and Andrew L. Zukerberg. 2016. “Address-based Sampling.” American Association for Public Opinion Research Task Force Report. Retrieved from: www.aapor.org/getattachment/Education-Resources/Reports/ AAPOR_Report_1_7_16_CLEAN-COPY-FINAL-(2).pdf.aspx

Heberlein, Thomas A. and Robert Baumgartner. 1978. “Factors Affecting Response Rates to Mailed Questionnaires: A Quantitative Analysis of the Published Literature.” American Sociological Review, 43(4): 447-62.

Hedlin, Dan, Trine Dale, Gustav Haraldsen, and Jacqui Jones. 2005. “Developing Methods for Assessing Perceived Response Burden.” Retrieved from Eurostat: ec.europa.eu/eurostat/documents/64157/4374310/10-DEVELOPING-METHODS-FOR- ASSESSING-PERCEIVED-RESPONSE-BURDEN.pdf

Heerwegh Dirk. 2009. “Mode Differences Between Face-to-Face and Web Surveys: An Experimental Investigation of Data Quality and Social Desirability Effects.” International Journal of Public Opinion Research, 21: 111-121.

Heerwegh, Dirk and Geert Loosveldt. 2008. “Face-to-Face versus Web Surveying in a High-Internet-Coverage Population.” Public Opinion Quarterly, 72(5): 836-846.

Heimel, Sarah, Dorothy Barth, and Megan Rabe. 2016. “Why We Ask: Mail Package Insert Test.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/ library/working-papers/ 2016/acs/2016_Heimel_01.pdf

Heimel, Sarah, Dorothy Barth, Jenny Berkley, and Michael Risley. 2019. “Effects of Mailing a Data Slide to American Community Survey Respondents.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, Toronto, Ontario.

Hembroff, Larry A., Debra Rusz, Ann Rafferty, Harry McGee, and Nathaniel Ehrlich. 2005. “The Cost Effectiveness of Alternative Advance Mailings in a Telephone Survey.” Public Opinion Quarterly, 69(2):232–245.

Henley, James R. Jr. 1976. "Response rate to mail questionnaires with a return deadline.” Public Opinion Quarterly, 40:374-375.

Hoffman, Donald D. 2000. Visual Intelligence: How We Create What We See. New York, NY: Norton.

Holzberg, Jessica, Jonathan Katz, Gerson Morales and Mary Davis. 2018. “Assessing Respondents’ Perceptions of Burden in the American Community Survey” Presented to the Federal Committee on Statistical Methodology (CSM) Research Conference, March 9, 2018, Washington DC.

Homans, George. 1961. Social Behavior: Its Elementary Forms. New York: Harcourt Brace & World.

Hotchkiss, Marisa and Jessica Phelan. 2017. “Uses of Census Bureau Data in Federal Funds Distribution: A New Design for the 21st Century.” Retrieved from U.S. Census Bureau: www2.census.gov/programs-surveys/decennial/2020/program-management/working- papers/Uses-of-Census-Bureau-Data-in-Federal-Funds-Distribution.pdf.

Hughes, Todd, Eric Slud, Robert Ashmead, and Rachael Walsh. 2016. “Results of a Field Pilot to Reduce Respondent Contact Burden in the American Community Survey’s Computer Assisted Personal Interviewing Operation.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2016/acs/ 2016_Hughes_01.pdf.

Huntsinger, Jerry. 2019. “Tutorial 55: How to get your Outer Envelope Ripped Open!” Retrieved from the Showcase of Fundraising Innovation and Inspiration (sofii.org): http://sofii.org/article/tutorial-55-how-to-get-your-outer-envelope-ripped-open

Holm, Sture. 1979. “A Simple Sequentially Rejective Multiple Test Procedure,” Scandinavian Journal of Statistics, 6(2):65-70.

James, Jeannine M. and Richard Bolstein. 1992. “Large Money Incentives and their Effect on Mail Survey Response Rates.” Public Opinion Quarterly, 56(4): 442–453.

Jones, Jacqui. 2012. “Response Burden: Introductory Overview Lecture.” Presented at the Fourth Annual International Conference on Establishment Surveys, Montreal, Quebec.

Joshipura, Megha. 2008. “2005 American Community Survey Respondent Characteristics Evaluation.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/ Census/library/working-papers/2008/acs/2008_Joshipura_01.pdf

Joshipura, Megha. 2010. “Evaluating the Effects of a Multi-Lingual Brochure in the American Community Survey,” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2010/acs/2010_Joshipura_01.html

Kaplan, Robin and Scott Fricker. 2017. “Objective and Subjective Burden Measures: Which Survey Features and Respondent Characteristics Contribute to Both?” Presented at the annual meeting of the American Association for Public Opinion Research, New Orleans, Louisiana.

Keller, Andrew. 2018. “How Does Using Administrative Records for Characteristics Imputation Improve Survey Estimates?” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, Denver, CO.

Kephart, Kathleen, Jonathan Katz, Jasmine Luck and Mary Davis. Forthcoming. “Cognitive Testing of the 2019 ACS Due Date Test Materials.” Under review.

Kennedy, Courtney and Hannah Hartig. 2019. “Response Rates in Telephone Surveys have Resumed their Decline” Retrieved from Fact Tank: www.pewresearch.org/fact-tank/2019/02/27/response-rates-in-telephone-surveys- have-resumed-their-decline/

Kimble, Joseph. 2014. Writing for Dollars, Writing to Please: The Case for Plain Language in Business, Government, and Law. Durham, North Carolina: Carolina Academic Press.

Kincaid, Peter J., Robert P. Fishburne Jr., Richard L. Rogers, and Brad S. Chissom. 1975. “Derivation of New Readability Formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy Enlisted Personnel.” Retrieved from The Institute for Simulation and Training: https://stars.library.ucf.edu/istlibrary/56/

Klausch, Thomas, Joop J. Hox, and Barry Schouten. 2013. “Measurement Effects of Survey Mode on Equivalence of Attitudinal Rating Scale Questions.” Sociological Methods & Research, 52(3): 227-263.

Kovacs, Dan, Sarah Thorne, and Felix Wolfinger. 2014. “ACS Messaging Research: Mental Models Research Project.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2014/acs/2014_Kovacs_01.html

Kreuter, Frauke, Stanley Presser, and Roger Tourangeau. 2008. “Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity.” Public Opinion Quarterly, 72(5), 847-865.

Kreuter, Matthew W., Kathleen Holmes, Kassandra Alcaraz, Bindu Kalesan, Suchitra Rath, Melissa Richert, Amy McQueen, Nikki Caito, Lou Robinson, and Eddie M. Clark. 2010. “Comparing Narrative and Informational Videos to Increase Mammography in Low- income African American Women.” Patient Education and Counseling, 81:6-14.

Kutner, Mark, Elizabeth Greenberg, and Justin Baer. 2005. “National Assessment of Adult Literacy: A First Look at the Literacy of America’s Adults in the 21st Century.” Retrieved from National Center for Education Statistics: nces.ed.gov/naal/pdf/2006470.pdf

Landreth, Ashley. 2003. “Results and Recommendations from Cognitive Interviews with Selected Materials Accompanying the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2003/adrm/ ssm2003-10.pdf

Landry, Kim. 2017. “Write to the Right Reading Level.” Retrieved from Hollister Creative: www.hollistercreative.com/blog/write-to-the-right-reading-level/

Lavrakas, Paul J., Benjamin Skalland, Christopher Ward, Can Geng, Welch, Jr., Jenny Jeyarajah, and Cynthia Knighton. 2017. “Testing the Effects of Envelope Features on Survey Response in a Telephone Survey Advance Letter Mailing Experiment.” Journal of Survey Statistics and Methodology, 6(2):262-283.

Lavrakas, Paul J., Grant Benson, Stephen Blumberg, Trent Buskirk, Ismael F. Cervantes, Leah Christian, David Dutwin, Mansour Fahimi, Howard Fienberg, Tom Guterbock, Scott Keeter, Jenny Kelly, Courtney Kennedy, Andy Peytchev, Linda Piekarski, and Chuck Shuttles. 2017b. “The Future of U.S. General Population Telephone Survey Research.” Report From the AAPOR Task Force. Retrieved from: www.aapor.org/Education- Resources/Reports/The-Future-Of-U-S-General-Population-Telephone-Sur.aspx

Lepkowski, James, Clyde Tucker, J. Michael Brick, Edith de Leeuw, Lilli Japec, Paul J. Lavrakas, Michael W. Link, Roberta L. Sangster. 2007. Advances in Telephone Survey Methodology. New York: Wiley.

Leslie, T. F. 1996. National Content Survey Results. U.S. Census Bureau Internal Memorandum.

Leslie, T.F. 1997. “Comparing two approaches to Questionnaire Design: Official Government Versus Public Information Design.” Presented to the American Statistical Association annual meeting, Anaheim California.

Lesser, Virginia, Don A. Dillman, Robert G. Mason, Fern Willits, and Frederick Lorenz. 2002. "Quantifying the Influence of Financial Incentives on Mail Survey Response Rates and Their Effects on Nonresponse Error." Proceedings of the Section on Survey Methods, The American Statistical Association.

Letourneau, Earl. 2012. “Mail Response/Return Rates Assessment.” Retrieved from U.S. Census Bureau: census.gov/content/dam/Census/library/publications/2012/dec/ 2010_cpex_198.pdf

Levine, Robert V. 2003. The Power of Persuasion. Hoboken, NJ: Wiley.

Lichtman, Marilyn. 2012. Qualitative Research in Education: A User’s Guide (3rd ed.). Thousand Oaks, CA: Sage Publications Inc.

Lidwell, William, Kritina Holden, and Jill Butler. 2003. Universal Principles of Design, Revised and Updated (2nd ed.). Beverly, MA: Rockport Publishers

Link, Michael W., Michael P. Battaglia, Martin R. Frankel, Larry Osborn, and Ali H. Mokdad. 2008. “A Comparison of Address-Based Sampling (ABS) Versus Random-Digit Dialing (RDD) for General Population Surveys.” Public Opinion Quarterly, 72(1): 6-27.

Lohr, Sharon L. 2008. “Coverage and Sampling.” in International Handbook of Survey Methodology, ed. Edith D. de Leeuw, Joop J. Hox and Don A. Dillman. Oxford, United Kingdom: Taylor and Francis. 97-112.

Longsine, Lindsay and Michael Risley. 2019. “2017 Adaptive Strategy Test.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working- papers/2019/acs/2019_Longsine_01.pdf

Lovejoy, Jennette, Brendan R. Watson, Stephen Lacy, and Daniel Riffe. 2016. “Three Decades of Reliability in Communication Content Analyses: Reporting of Reliability Statistics and Coefficient Levels in Three Top Journals.” Journalism and Mass Communication Quarterly, 93(4):1135–1159.

Lynn, Peter. 2008. “The problem of nonresponse.” in International Handbook of Survey Methodology, ed. Edith D. de Leeuw, Joop J. Hox and Don A. Dillman. Oxford, United Kingdom: Taylor and Francis. 97 – 112.

Macro International Inc. 2009. “Census Barriers, Attitudes, and Motivators Survey: Analytic Report” Retrieved from U.S. Census Bureau: www2.census.gov/programs- surveys/decennial/2010/partners/pdf/C2POMemoNo11.pdf?#

Martin, Elizabeth. 2009. “Can a Deadline and Compressed Mailing Schedule Improve Mail Response in the Decennial Census?” Public Opinion Quarterly, 73(2), 361-367.

Matthews, Gail. 2015. “The Effectiveness of Four Coaching Techniques in Enhancing Goal Achievement: Writing Goals, Formulating Action Steps, Making a Commitment, and Accountability.” Presented to the 9th Annual International Symposium on Psychology, May 25-28, Athens, Greece.

Matthews, Brenna, Mary C. Davis, Jennifer G. Tancreto, Mary Frances Zelenak, and Michelle Ruiter. 2012. “2011 American Community Survey Internet Test: Results from Second Test in November 2011.” Retrieved from U.S. Census Bureau: www.census.gov/content/ dam/Census/library/working-papers/2012/acs/2012_Matthews_01.pdf

McCormack, J. 2014. Brief: Make a Bigger Impact by Saying Less. New York, NY: Wiley.

Medway, Rebecca L. and Jenna Fulton. 2012. “When More Gets You Less: A Meta-analysis of the Effect of Concurrent Web Options on Mail Survey Response Rates.” Public Opinion Quarterly, 76: 733-746.

Messer, Benjamin L. 2012. Pushing Households to the Web: Experiments of a “Web+Mail” Methodology for Conducting General Public Surveys. Unpublished doctoral dissertation, Washington State University, Pullman.

Messer, Benjamin L. and Don A. Dillman. 2011. “Surveying the General Public Over the Internet Using Address-Based Sampling and Mail Contact Procedures.” Public Opinion Quarterly, 75(3): 429–457.

Miles, Matthew B., A. Michael Huberman and Johnny Saldaña. 2019. Qualitative Data Analysis: A Methods Sourcebook (4th ed.). Thousand Oaks, CA: Sage Publications.

Milkman, Katherine L., John Beshears, James J. Choi, David Laibson, and Brigitte C. Madrian. 2011. “Using Implementation Intentions Prompts to Enhance Influenza Vaccination Rates.” Proceedings of the National Academy of Sciences, 108(26): 10415–10420.

Millar, Morgan M. and Don A. Dillman. 2011. “Improving Response to Web and Mixed-Mode Surveys.” Public Opinion Quarterly, 75(2):249–269.

Mills, Gregory. 2016a. “Evaluation of Transitioning Telephone Number Sources for the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2016/acs/2016_Mills_01.html

Mills, Gregory. 2016b. “Simulated Effects of Changing Calling Parameters and Workload Size on Computer Assisted Telephone Interview Productivity in the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/library/working- papers/2016/acs/2016_Mills_02.html

Mills, Greg. 2017. “Investigating the Use of Cellular Telephone Numbers for the Computer- Assisted Telephone Interview Operation in the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working- papers/2017/acs/2017_Mills_01.pdf

Mills, Gregory, Sarah Heimel, and Angie Buchanan. 2019. “Response Consistency Between Detailed Race and Origin Questions from the 2016 ACS Content Test and the Ancestry Question.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2019/acs/2019_Mills_01.html

Munodawafa, Davison. 2008. “Communications: Concepts, Practice and Challenges.” Health Education Research, 23(3): 369–370.

Murphy, Padraic and Andrew Roberts. 2014. “2014 Pre-Notice Test.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working- papers/2015/acs/2015_Murphy_01.pdf

Murray, Teresa D. 2017. “American Community Survey from Census Bureau can creep you out: Money Matters.” The Plain Dealer, October 8, 2017. Retrieved from: www..com/business/2017/10/american_community_survey_from.html

National Academies of Sciences, Engineering, and Medicine. 2016. Reducing Response Burden in the American Community Survey: Proceedings of a Workshop. Washington, DC: The National Academies Press.

National Academies of Sciences, Engineering, and Medicine. 2019. Improving the American Community Survey: Proceedings of a Workshop. Washington, DC: The National Academies Press.

National Research Council. 2013a. Benefits, Burdens, and Prospects of the American Community Survey: Summary of a Workshop. D.L. Cork, Rapporteur. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

National Research Council. 2013b. Nonresponse in Social Science Surveys. Panel on a Research Agenda for the Future of Social Science Data Collection. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

Nichols, Elizabeth. 2012. “Why Do Survey Participants Choose to Report by Web, Paper or Not at All? Results From an American Community Survey Qualitative Study.” Presented at the annual meeting of the American Association for Public Opinion Research, May 18, Orlando, Florida.

Nichols, Elizabeth, Rachel Horwitz, and Jennifer G. Tancreto. 2013. “Response Mode Choice and the Hard-to-Interview in the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2013/adrm/ rsm2013-01.pdf

Nichols, Elizabeth, Rachel Horwitz, and Jennifer G. Tancreto. 2015. “An Examination of Self-Response for Hard-to-Interview Groups when Offered an Internet Reporting Option for the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov /content/dam/Census/library/working-papers/2015/acs/2015_Nichols_01.pdf

Nielsen, Jakob. 2005. “Lower-literacy Users: Writing for a Broad Consumer Audience.” The Nielsen Norman Group. Retrieved from: www.nngroup.com/articles/writing-for-lower- literacy-users/

OECD. 2013. “OECD Skills Outlook 2013: First Results from the Survey of Adult Skills.” OECD Skills Studies, Programme for the International Assessment of Adult Competencies. OECD Publishing. Paris, France. Retrieved from: dx.doi.org/10.1787/9789264204256-e

OECD. 2016. “Skills Matter: Further Results from the Survey of Adult Skills.” OECD Skills Studies, Programme for the International Assessment of Adult Competencies. OECD Publishing, Paris, France. Retrieved from: dx.doi.org/10.1787/9789264258051-en

Olesen, Virginia, Nellie Droes, Diane Hatton, Nan Chico, and Leonard Schatzman. 1994. “Analyzing Together: Recollections of a Team Approach.” In Alan Bryman and Robert G. Burgess (Eds.). Analyzing Qualitative Data. 111-128. London, UK: Routledge.

Oliver, Broderick, Michael Risley, and Andrew Roberts. 2016. “2015 Summer Mandatory Messaging Test.” Retrieved from U.S. Census Bureau: www.census.gov/library/working- papers/2016/acs/2016_Oliver_01.html

Oliver, Broderick, Sarah Heimel, and Jonathan Schreiner. 2017. “Strategic Framework for Messaging in the American Community Survey Mail Materials.” Retrieved from U.S. Census Bureau: census.gov/content/dam/Census/library/working-papers/2017/acs/ 2017_Oliver_01.pdf

Oliver, Broderick, Lindsay Longsine, Jonathan Schreiner, and Megan Rabe. 2018. “2017 American Community Survey Mail Design Test.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2018/acs/ 2018_Oliver_01.pdf

Olson, Kristen, Jolene D. Smyth, and Heather M. Wood. 2012. “Does Giving People their Preferred Survey Mode Actually Increase Survey Participation Rates? An Experimental Examination.” Public Opinion Quarterly, 76(4): 611–635.

Olson, Kristen, Jolene D. Smyth, Rachel Horwitz, Scott Keeter, Virginia Lesser, Stephanie Marken, Nancy Mathiowetz, Jaki McCarthy, Eileen O’Brien, Jean Opsomer, Darby Steiger, David Sterrett, Jennifer Su, Z. Tuba Suzer-Gurtekin, Chintan Turakhia, and James Wagner. 2019. “Report of the AAPOR Task Force on Transitions from Telephone Surveys to Self-Administered and Mixed-Mode Surveys.” American Association for Public Opinion Research Task Force Report. Retrieved from: www.aapor.org/getattachment/Education- Resources/Reports/Report-of-the-Task-Force-on-Transitions-from-Telephone-Surveys- FULL-REPORT-FINAL.pdf.aspx

Olson, Curtis A. 2014. “Survey Burden, Response Rates, and the Tragedy of the Commons.” Journal of Continuing Education in the Health Professions, 34(2): 93-95.

Onwuegbuzie, Anthony J., Marla H. Mallette, Eujin Hwang, and John R. Slate. 2013. “Evidence- based Guidelines for Avoiding Poor Readability in Manuscripts Submitted to Journals for Review for Publication.” Research in the Schools, 20(1): 1-11.

Orrison, Gregory and Joseph Ney. 2014. “ACS Messaging Research: Deliberative Focus Groups with Stakeholders Who Are Distrustful of the Government.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2014/acs/ 2014_Orrison_01.pdf

Palmer, Stephen E. 1999. Vision Science: Photons to Phenomenology. Cambridge, MA: The MIT Press.

Pan, Yuling, Marjorie Hinsdale, Alisu Schoua-Glusberg, and Hyunjoo Park. 2008. "Cognitive Testing of ACS Multilingual Brochures in Multiple Languages." Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2008/adrm/rsm2008-06.html

Perrault, Evan K., and Samantha A. Nazione. 2016. “Informed Consent—Uninformed Participants: Shortcomings of Online Social Science Consent Forms and Recommendations for Improvement.” Journal of Empirical Research on Human Research Ethics, 11(3): 274–280.

PEW Research Center. 2019. “Public Trust in Government: 1958 – 2019.” Retrieved from: www.people-press.org/2019/04/11/public-trust-in-government-1958-2019/

Phipps, Polly. 2014. “Defining and Measuring Respondent Burden in Establishment Surveys.” Presented at the Federal Committee on Statistical Methodology Statistical Policy Seminar, December 16, Washington, DC.

Plain Writing Act of 2010, Pub. L. No. 111-274, Stat 2861, 2862, and 2863. Retrieved from: www.govinfo.gov/content/pkg/PLAW-111publ274/pdf/PLAW-111publ274.pdf

Poldre, Tom. 2017. “Communicating to the Power of Three.” Best Practices (blog). Retrieved from: www.cision.ca/best-practices/communicating-to-the-power-of-three/

Preisendorfer, Peter and Felix Wolter. 2014. “Who is Telling the Truth? A Validation Study on Determinants of Response Behavior in Surveys.” Public Opinion Quarterly, 78(1): 126-146.

Presser, Stanley, Johnny Blair, and Timothy Triplett. 1992. “Survey Sponsorship, Response Rates, and Response Effects.” Social Science Quarterly, 73(3): 699–702.

Presser, Stanley and Susan McCulloch. 2011. “The growth of Survey Research in the United States: Government-sponsored surveys, 1984-2004.” Social Science Research, 40: 1019- 1024.

Raglin, David, T. F. Leslie, and Debbie Griffin. 2004. “How is the Propensity to Respond for Different Data Collection Modes Affected by a Mailing Package and Mandatory/Voluntary Status?” Internal report, U.S. Census Bureau, Washington, DC.

Raglin, David. 2014. “American Community Survey Fiscal Year 2014 Content Review Interviewer Survey Results,” Retrieved from U.S. Census Bureau: www2.census.gov/programs- surveys/acs/operations_admin/2014_content_review/methods_results_report/Intervie wer_Survey_Results_Report.pdf

Reingold, Inc. 2014a. “American Community Survey Messaging and Mail Package Assessment Research: Cumulative Research Findings”. Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2014/acs/ 2014_Walker_02.pdf

Reingold, Inc. 2014b. “ACS Messaging Research: Online Visual Testing Findings.” Presented to the U.S. Census Bureau, October 28, 2014.

Reinhart, Amber Marie. 2006. “Comparing the persuasive effects of narrative versus statistical messages: A meta-analytical review.” Unpublished dissertation, State University of New York at Buffalo. Retrieved from ProQuest Dissertations Publishing, 2006. 3213634

Reinhart, Amber Marie and Thomas H. Feeley. 2007. “Comparing the persuasive effects of narrative versus statistical messages: A meta-analytical review.” Presented at the annual meeting of the National Communication Association, November 16, Chicago, IL.

Ricketts, Mitch, James Shanteau, Breeanna McSpadden, and Kristen Fernandez-Medina. 2010. “Using Stories to Battle Unintentional Injuries: Narratives in Safety and Health Communication.” Social Science and Medicine, 70: 1441-1449.

Ridolfo, Heather, Kenneth M. Pick, Andrew J. Dau, Alison Black, and Julie Weber. 2019. “Motivating Respondents to Open the Envelope: Does Messaging Matter?” Presented to the American Association for Public Opinion Research Annual Conference, May 17, 2019, Toronto, Ontario.

Risley, Michael, Dorothy Barth, Kathryn Cheza, and Megan Rabe. 2018. “2017 Pressure Seal Mailing Materials Test.” Retrieved from U.S. Census Bureau: www.census.gov/ content/dam/Census/ library/ working-papers/2018/acs/2018_Risley_01.pdf

Robins, Cynthia, Darby Steiger, Jasmine Folz, Karen Stein, and Martha Stapleton. 2016. “2016 American Community Survey Respondent Burden Testing.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2016/acs/ 2016_Westat_01.pdf.

Rolstad, Sindre, John Adler, and Anna Ryden. 2011. “Response Burden and Questionnaire Length: Is Shorter Better? A Review and Meta-analysis.” Value in Health, 14: 1101-08.

Roose, Henk, John Lievens, Hans Waege. 2007. “The Joint Effect of Topic Interest and Follow up Procedures on the Response in a Mail Questionnaire: An Empirical Test of the Leverage- Salience Theory in Audience Research.” Sociological Methods & Research, 35(3): 410- 428.

Ruiter, Michelle, Mary Frances Zelenak, Jennifer G. Tancreto, and Mary Davis. 2012. “Methods for incorporating an Internet Response Mode into American Community Survey Mailings: A Comparison of Approaches.” Retrieved from U.S. Census Bureau.

Ryan, Richard M. and Edward L. Deci. 2000. “Self-Determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-being.” American Psychologist, 55: 68-78.

Saldaña, Johnny. 2015. The Coding Manual for Qualitative Researchers (3rd ed.). Thousand Oaks, CA: Sage Publications.

Schaeffer, Nora Cate and Stanley Presser. 2003. “The Science of Asking Questions.” Annual Review of Sociology, 29(1): 65-88.

Schreiner, Jonathan, Broderick Oliver, and Sarah Heimel. 2017. “Rethinking the Conversation: A Strategic Framework for American Community Survey Mail Contact Messaging.” Panel on Census Bureau: Innovations and Measurement, Presented at the Southern Demographic Association Annual Meeting, October 25, Morgantown, WV.

Schreiner, Jonathan. 2018a. “Applying a Strategic Framework to Reinvent American Community Survey Mail Contact Materials.” Panel on Developing Holistic Approaches to Survey Messaging for Multi-Mode Surveys. Presented at the annual meeting of the American Association for Public Opinion Research, May 18, Denver, CO.

Schreiner, Jonathan. 2018b. “Review of How Current ACS Mail Materials Mesh with the ACS Strategic Framework, and Next Steps” Presentation for the Steering Committee for the Workshop on Improving the American Community Survey, National Academy of Sciences, September 27, Washington D.C.

Schreiner, Jonathan. 2018c. “Enhancing Respondent Mail Materials for the American Community Survey: Current Mail Materials Analysis.” Presented at the Southern Demographic Association Annual Meeting, October 12, Durham, NC.

Schreiner, Jonathan. 2019. “Redesigning the American Community Survey Recruitment Messaging – From analysis to design.” Presented at the annual meeting of the American Association for Public Opinion Research, May 16, Toronto, Ontario.

Schreiner, Jonathan, Broderick Oliver, and Elizabeth Poehler. Forthcoming. “An Assessment of the American Community Survey Mail Communication Materials.” Under review.

Schwartz, Lisa K., Lisbeth Goble, and Edward M. English. 2006. “Counterbalancing Topic Interest with Cell Quotas and Incentives: Examining Leverage-Salience Theory in the Context of the Poetry in America Survey.” Presented at the annual meeting of the American Association for Public Opinion Research, Montreal, Canada.

Schwede, Laurie. 2008. “‘Carrot’ or ‘Stick’ Approach to Reminder Cards: What Do Cognitive Respondents Think?” Proceedings of the Survey Research Methods Section of the American Statistical Association: www.asasrms.org/Proceedings/y2008/Files/schwede.pdf

Schwede, Laurie. 2013. “Cognitive Testing of Modified American Community Survey Envelopes and Letters for use during the 2010 Census: ACS Messaging Project Final Report on Phases 1 and 2.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/ Census/ library/working-papers/2013/acs/2013_Schwede_01.pdf.

Schuman, Howard and Stanley Presser. 1996. Questions and Answers in Attitude Surveys: Experiments on Question Form, Wording and Context. Thousand Oaks, CA: Sage Publications.

Scott, Christopher. 1961. “Research on Mail Surveys.” Journal of the Royal Statistical Society, 124: 142-205.

Scott, William A. 1955. “Reliability of Content Analysis: The Case of Nominal Scale Coding”. Public Opinion Quarterly, 19(3):321-325.

Shephard, D. and Bowers, J. 2016. “Suggestions for ACS 2016 Softened Revised Design (v1).” Email from Social Behavioral Science Team established in 2015 to help federal agencies integrate behavioral insights into their policies and programs (Executive Order No.13707, 2015). Memo to the U.S. Census Bureau, September 10, 2015.

Singer, Eleanor. 1978. “Informed Consent: Consequences for Response Rate and Response Quality in Social Surveys.” American Sociological Review, 43(2): 144-162.

Singer, Eleanor. 2006. “Introduction: nonresponse bias in household surveys.” Public Opinion Quarterly, 70(5): 637–645, 2006.

Singer, Eleanor, and Martin R. Frankel. 1982. "Informed Consent in Telephone Interviews," American Sociological Review, 47: 116-126.

Singer, Eleanor, Hans-Jurgen Hippler, and Norbert Schwarz. 1992. “Confidentiality Assurances in Surveys: Reassurance or Threat?” International Journal of Public Opinion Research, 4(3): 256–68.

Singer, Eleanor, Dawn R. von Thurn, and Esther R. Miller. 1995. “Confidentiality assurances and response: A quantitative review of the experimental literature.” Public Opinion Quarterly, 59(1), 66-77.

Singer, Eleanor, John Van Hoewyk, Nancy Gebler, Trivellore Raghunathan, and Katherine McGonagle. 1999. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics, 15(2): 217–230.

Singer, Eleanor and R. M. Bossarte. 2006. “Incentives for Survey Participation – When Are They ‘Coercive’?” American Journal of Preventive Medicine, 31: 411-418.

Singer, Eleanor and Cong Ye. 2013. “The Use and Effects of Incentives in Surveys.” Annals of the American Academy of Political and Social Science, 645(1): 112–141.

Smyth, Jolene D., Don A. Dillman, Leah M. Christian, and Allison C. O’Neill. 2010. “Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century.” American Behavioral Scientist, 53(9): 1423-1448.

Steeh, Charlotte. 2008. “Telephone Surveys.” in International Handbook of Survey Methodology, ed. Edith D. de Leeuw, Joop J. Hox and Don A. Dillman. Oxford, United Kingdom: Taylor and Francis. 221 – 238.

Stepler, Renee, Daniel Eklund, Margaret Behlen, and Elizabeth Sinclair. 2019. “Implementing Non-Monetary Incentives in the National Sample Survey of Registered Nurses.” Presented at the Federal Computer Assisted Survey Information Collection Annual Workshop, April 17, Washington, DC.

Stokes, Samantha, Courtney Reiser, Michael Bentley, Joan Hill, and Alfred Meier. 2011. “2010 Census Deadline Messaging and Compressed Mailing Schedule Experiment.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/publications/ 2012/dec/2010_cpex_166.pdf

Tait, Alan R., Terri Voepel-Lewis, Vijayan N. Nair, Naveen Narisetty, and Angela Fagerlin. 2013. “Informing the Uninformed: Optimizing the Consent Message Using a Fractional Factorial Design.” Journal of the American Medical Association: JAMA Pediatrics, 167(7): 640–646.

Tancreto, Jennifer, Mary C. Davis, and Mary Frances Zelenak. 2012a. “Design of the American Community Survey Internet Instrument.” Retrieved from U.S. Census Bureau: www.census.gov/library/working-papers/2012/acs/2012_Tancreto_02.html

Tancreto, Jennifer G., Mary Frances Zelenak, Mary C. Davis, Michelle Ruiter, and Brenna Matthews. 2012b. “2011 American Community Survey Internet Tests: Results from the First Test in April 2011.” Retrieved from U.S. Census Bureau: www.census.gov/content/ dam/ Census/library/working-papers/2012/acs/2012_Tancreto_01.pdf

Tancreto, Jennifer G. and Elizabeth Poehler. 2015. “Evolution of ACS Respondent Contact Materials.” Internal U.S. Census Bureau document. Requested November 12, 2018.

Tarnai, John, David Schultz, David Solet, and Lori Pfingst. 2012. “Response Rate Effects in an ABS Survey for Stamped vs. Business Reply Return Envelopes with and without Incentives, and Medium vs. Standard Size Outgoing Envelopes.” Presented at the annual meeting of the American Association for Public Opinion Research, May 18, 2012, Orlando, FL.

Terry, Rodney, Darby Steiger, Angie Buchanan, and Mary C. Davis. 2019. “Cognitive Testing of Race, Hispanic Origin, and Ancestry Questions to Investigate Respondent Burden.” Presented at the annual meeting of the American Association for Public Opinion Research, May 19, 2019, Toronto, Ontario.

Thibaut, John W. and Harold H. Kelley. 1959. The Social Psychology of Groups. New York, NY: Wiley.

Tortora, Robert D. 2017. “Respondent Burden, Reduction Of.” In Encyclopedia of Statistical Sciences, edited by Samuel Kotz, Campbell B. Read, N. Balakrishnan, Brani Vidakovic, and Norman L. Johnson. New York: John Wiley and Sons.

Tourangeau, Roger and Ting Yan. 2007. “Sensitive Questions in Surveys.” Psychological Bulletin, 133(5): 859-83.

Tourangeau, Roger and Cong Ye. 2009. “The Framing of the Survey Request and Panel Attrition.” Public Opinion Quarterly, 73(2): 338–48.

Tourangeau, Roger, Lance J. Rips, and Kenneth Rasinski. 2000. The Psychology of Survey Response. Cambridge: Cambridge University Press.

Tourangeau, Roger, Brad Edwards, Timothy P. Johnson, Kirk M. Wolter, and Nancy Bates (eds.). 2014. Hard-to-Survey Populations. Cambridge: Cambridge University Press.

Torongo, Robert. 2019. “You’ve Got Mail: The Impact of Hand-Written Letters on Survey Response.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, 2019, Toronto, Ontario.

Trussell, Norm and Paul J. Lavrakas. 2004. “The Influence of Incremental Increases in Token Cash Incentives on Mail Survey Response: Is There an Optimal Amount?” Public Opinion Quarterly, 68(3): 349–367.

Tulp, D. R., C. E. How, G. L. Kusch, and S. J. Cole. 1991. “Nonresponse Under Mandatory Versus Voluntary Reporting in the 1980 Survey of Pollution Abatement Costs and Expenditures (PACE).” Proceedings of the Survey Research Methods Section of the American Statistical Association. Alexandria, VA. 272–277.

Tuttle, Alfred D., Jonathan Schreiner, Elizabeth Nichols, Erica Olmsted-Hawala, and Rodney Terry. 2019. “Using Eye-Tracking to Evaluate New American Community Survey Mail Material Design Strategies.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, 2019, Toronto, Ontario.

U.S. Department of Education. 2002. “Adult Literacy in America: A First Look at the Findings of the National Adult Literacy Survey.” Retrieved from the National Center for Education Statistics: nces.ed.gov/pubs93/93275.pdf

U.S. Census Bureau. 2014. “American Community Survey Design and Methodology.” Retrieved from U.S. Census Bureau: www2.census.gov/programs-surveys/acs/methodology/design_and_methodology/acs_design_methodology_report_2014.pdf

U.S. Census Bureau. 2017. “Agility in Action 2.0: A Snapshot of Enhancements to the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/programs-surveys/acs/operations-and-administration/2015-16-survey-enhancements/Agility%20in%20Action%20v2.0.pdf

U.S. Census Bureau. 2018. “Agility in Action 2.1: A Snapshot of Enhancements to the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/programs-surveys/acs/operations-and-administration/2015-16-survey-enhancements/Agility_in_Action_V2.1.pdf

U.S. Census Bureau. 2019a. “Fiscal Year 2019 Budget Summary.” Retrieved from U.S. Census Bureau: www2.census.gov/about/budget/2019-Budget-Infographic-Bureau-Summary.pdf

U.S. Census Bureau. 2019b. “American Community Survey: Response Rates.” Retrieved from U.S. Census Bureau: www.census.gov/acs/www/methodology/sample-size-and-data-quality/response-rates/index.php

U.S. Office of Personnel Management. 2011. “Paperwork Reduction Act (PRA) Guide: Version 2.0.” Accessed from: www.opm.gov/about-us/open-government/digital-government-strategy/fitara/paperwork-reduction-act-guide.pdf

Velkoff, Victoria and Jennifer Ortman. 2018. “Making Administrative Records Key to Operational Agility at the U.S. Census Bureau.” Presented at the annual meeting of the American Association for Public Opinion Research, May 17, Denver, CO.

Velkoff, Victoria, Nikolas D. Pharris-Ciurej, Sandra L. Clark, R. Chase Sawyer, and Andrew Keller. 2018. “Making Administrative Records Key to Operational Agility at the U.S. Census Bureau.” Panel presented at the Southern Demographic Association Annual Meeting, October 11, Durham, NC.

Vögele, Siegfried. 2019. “Eye Flow Studies Provide Clues for Improving Your Direct Mail.” As cited in Chewning, Hugh, “Successful Direct Mail Starts and Ends with the Outer Envelope.” Retrieved from: cdmdirect.com/successful-direct-mail-starts8212and-ends8212withthe-outer-envelope/

Vögele, Siegfried. 2019. “Eye Flow Studies Provide Clues for Improving Your Direct Mail.” As cited in Cogan, Marisa, “Direct Mail: Getting that Envelope Opened.” Retrieved from: https://cpscards.com/direct-mail-envelope/

Walker, Shelly. 2015. “ACS Messaging Research: Cumulative Findings.” Retrieved from U.S. Census Bureau: census.gov/content/dam/Census/library/working-papers/2014/acs/2014_Walker_01.pdf

Walther, Joachim, Nicola W. Sochacka, and Nadia N. Kellam. 2013. “Quality in Interpretive Engineering Education Research: Reflections on an Example Study.” Journal of Engineering Education, 102(4):626-659.

Warriner, Keith, John Goyder, Heidi Gjertsen, Paula Hohner, and Kathleen McSpurren. 1996. “Charities, No; Lotteries, No; Cash, Yes: Main Effects and Interactions in a Canadian Incentives Experiment.” Public Opinion Quarterly, 60(4): 542–562.

Wenemark, Marika, Gunilla H. Frisman, Tommy Svensson, and Margareta Kristenson. 2010. “Respondent Satisfaction and Respondent Burden Among Differently Motivated Participants in a Health-Related Survey.” Field Methods, 22(4): 378–390.

Wenemark, Marika, Andreas Persson, Helle Brage, Tommy Svensson, and Margareta Kristenson. 2011. “Applying Motivation Theory to Achieve Increased Response Rates, Respondent Satisfaction and Data Quality.” Journal of Official Statistics, 27(2): 393–414.

Wheildon, Colin. 2007. Type & Layout: Are You Communicating or Just Making Pretty Shapes? Melbourne, Australia: The Worsley Press.

Whitcomb, Michael E. and Stephen R. Porter. 2004. “E-mail Contacts: A Test of Complex Graphical Designs in Survey Research.” Social Science Computer Review, 22(3): 370–376.

Williams, Douglas, J. Michael Brick, Jill M. Montaquila, and Daifeng Han. 2014. “Effects of Screening Questionnaires on Response in a Two-Phase Postal Survey.” International Journal of Social Research Methodology, 19(1): 51–67.

Wilmer, Henry H., Lauren E. Sherman, and Jason M. Chein. 2017. “Smartphones and Cognition: A Review of Research Exploring the Links between Mobile Technology and Cognitive Functioning.” Frontiers in Psychology, 8: 605.

Wolf, Maryanne. 2018, August 25. “Skim Reading is the New Normal. The Effect on Society is Profound.” The Guardian. Retrieved on July 22, 2019 from www.theguardian.com/commentisfree/2018/aug/25/skim-reading-new-normal-maryanne-wolf

Word, David L. 1997. “Who responds/who doesn’t? Analyzing variation in mail response rates during the 1990 census.” Retrieved from U.S. Census Bureau: www.census.gov/population/www/documentation/twps0019.html

World Bank. 2017. “Communication for Governance and Accountability Program (CommGAP).” Retrieved from World Bank: siteresources.worldbank.org/EXTGOVACC/Resources/CommGAPBrochureweb.pdf

Yammarino, Francis J., Steven J. Skinner, and Terry L. Childers. 1991. “Understanding Mail Survey Response Behavior: A Meta-analysis.” Public Opinion Quarterly, 55: 613–639.

Yang, Daniel K. 2015. “Compiling Respondent Burden Items: A Composite Index Approach.” Paper presented at the Consumer Expenditure Survey Methods Symposium. Retrieved from Bureau of Labor Statistics: www.bls.gov/cex/respondent-burden-index.pdf

Yu, Erica C., Scott Fricker, and Brandon Kopp. 2015. “Can Survey Instructions Relieve Respondent Burden?” Presented at the Annual Conference of the American Association for Public Opinion Research, Hollywood, FL. Retrieved from Bureau of Labor Statistics: www.bls.gov/osmr/research-papers/2015/pdf/st150260.pdf.

Zelenak, Mary Frances and Mary C. Davis. 2013. “Impact of Multiple Contacts by Computer-Assisted Telephone Interview and Computer-Assisted Personal Interview on Final Interview Outcome in the American Community Survey.” Retrieved from U.S. Census Bureau: www.census.gov/content/dam/Census/library/working-papers/2013/acs/2013_Zelenak_01.pdf.

Zillmann, Dolf and Hans-Bernd Brosius. 2000. Exemplification in Communication: The Influence of Case Reports on the Perception of Issues. Mahwah, NJ: Routledge.
