
TECHNICAL REPORT 10-13

Using Address Based Sampling to Survey the General Public by Mail vs. ‘Web plus Mail’

January 2010

Prepared for The National Science Foundation Division of Science Resources Statistics

Submitted by Benjamin L. Messer & Don A. Dillman

Social & Economic Sciences Research Center (SESRC), P.O. Box 644014, Washington State University, Pullman, Washington 99164-4014. Telephone: (509) 335-1511; Fax: (509) 335-0116

Using Address Based Sampling to Survey the General Public by Mail vs. ‘Web plus Mail’

March 3, 2010

Prepared for The National Science Foundation Division of Science Resources Statistics

by

Benjamin L. Messer & Don A. Dillman1 Social and Economic Sciences Research Center Washington State University Pullman, WA 99164

1 Research reported here was supported by the USDA National Agricultural Statistics Service and the NSF Division of Science Resources Statistics under a Cooperative Agreement with the Social and Economic Sciences Research Center, Washington State University, Pullman, WA.


TABLE OF CONTENTS

ABSTRACT

1. INTRODUCTION

2. BACKGROUND

3. METHODS

4. RESULTS

4.1 WCS Response Rates

4.1.1 Does mail or a combination of mail and the Internet used with the DSF obtain reasonable response rates in statewide household surveys?

4.1.2 Does the inclusion and/or timing of an Internet instruction card increase response rates?

4.1.3 Do cash incentives increase response rates for incentive vs. non-incentive mail and Internet respondents?

4.2 Non-response Error

4.2.1 Do cash incentives reduce non-response error between incentive vs. non-incentive mail and Internet respondents?

4.2.2 Does the inclusion and/or timing of an Internet instruction card reduce non-response error?

4.2.3 Are Internet respondents different types of people than mail respondents?

4.2.4 Are mail, Internet, or mail and Internet combined respondents representative of the general public?

5. DISCUSSION & CONCLUSIONS

REFERENCES

APPENDIX I

APPENDIX II


Using Address-based Sampling to Survey the General Public by Mail vs. ‘Web plus Mail’

Benjamin L. Messer & Don A. Dillman

ABSTRACT

The Internet is increasingly being used as a survey method but suffers from incomplete coverage of households and the lack of an adequate sample frame for conducting general public surveys. However, with the development of near-comprehensive address-based sampling (ABS) frames, such as the U.S. Postal Service's Delivery Sequence File (DSF), mail may come to be used much more frequently, particularly to ask respondents to complete an Internet survey. Yet it remains unclear which procedures are most effective in using the DSF with mail and Internet survey modes to obtain acceptable levels of non-response in statewide general public household surveys. The 2008 Washington Community Survey (WCS) provides an opportunity to examine these issues. The WCS was conducted by sampling from the DSF and asking people in nine different panels to respond by Internet and/or mail. Different implementation procedures were also tested to determine their impact on non-response. These include an Internet instruction card (vs. none), a $5 cash incentive (vs. none), and multiple ways of introducing the choice between Internet and mail. Statistical comparisons of the characteristics of WCS Internet and mail respondents, as well as between WCS Internet and mail respondents and American Community Survey (ACS) respondents, determine whether differences exist and how representative different WCS respondents are of general public households in Washington. Overall, we found that mail and Internet respondents are very different types of people, but that an Internet preference approach with a $5 incentive and a mail follow-up sent three weeks later can obtain reasonable response rates (46.3%) and levels of non-response error. A mail-only treatment, with a $5 incentive, obtained the highest response rate (56.7%) but produced levels of non-response error similar to those of the Internet preference approach. Furthermore, neither mail nor Internet preference respondents were consistently representative of the general population.



CHAPTER 1: INTRODUCTION

Conducting random sample Internet surveys of the general public remains an elusive goal. This is primarily due to several limitations of using the Internet as the only survey mode, including incomplete household Internet coverage (Horrigan, 2008), difficulties drawing random samples of email addresses (Couper, 2000), and professional norms against contacting people by email when no prior relationship exists (CASRO, 2009). However, with the advent of comprehensive address-based sampling (ABS) frames and the growth of Internet penetration, it may be feasible to circumvent these limitations with a mixed-mode methodology in which households are contacted by mail with a request to respond over the Internet and those unable or unwilling to respond via the Internet are offered a mail response option.

Interest in such a methodology has developed in recent years partly as a result of declines in coverage and participation in random digit dialing (RDD) telephone surveys, which in the past was the most efficient method for conducting general public surveys (Steeh, 2008). For example, Link & Mokdad (2006) explored the potential of mailing respondents a request to complete the Behavioral Risk Factor Surveillance System (BRFSS) survey over the Internet, combined with a CATI follow-up, to increase response rates over a CATI-only method. The Internet and CATI follow-up group obtained higher response rates than the CATI-only group, but the authors used RDD sampling, which has significant coverage problems (Blumberg, Luke, Cynamon, & Frankel, 2008; Blumberg & Luke, 2007; Keeter, Kennedy, Clark, Tompson, & Mokrzycki, 2007), to match telephone numbers with addresses, and obtained low Internet response rates (15.4%), indicating that more research is needed to determine how best to improve the effectiveness of using mail to drive people to the Internet.

In addition, the advantages associated with using the Internet as a survey mode, such as cost savings compared to other modes (Couper, 2000; Kaplowitz, Hadlock, & Levine, 2004; Couper & Miller, 2008; Dillman, Smyth, & Christian, 2009), and particularly compared to a mail-only strategy (Schonlau, Fricker, & Elliott, 2002; Fricker & Schonlau, 2002), have also contributed to the increased use of a mail and Internet mixed-mode methodology. However, researchers using this methodology have found it difficult to obtain a sufficiently high response rate or a large enough proportion of Internet respondents to realize these advantages compared to a mail-only design (Schonlau et al., 2002; Israel, 2009), particularly in general public surveys, which have been challenging to conduct without ABS frames.

The recent development and availability of comprehensive ABS frames, such as the U.S. Postal Service's Delivery Sequence File (DSF), has provided researchers with an essential tool for conducting random sample general public surveys using mail and the Internet. ABS frames such as the DSF are reported to have high coverage (Link, Battaglia, Frankel, Osborn, & Mokdad, 2008) and facilitate the ability to draw a random sample of postal addresses, which is important for generalizability (Kish, 1965; Dillman et al., 2009). A recent study by Smyth, Dillman, Christian, and O'Neill (In Press) demonstrated the possibility of using the DSF with mail contacts to drive people to the Internet. The authors obtained a 55% response rate with the Internet and a mail follow-up from households in a rural region of 45,000 people in the Western United States. Nearly three-fourths of the respondents used the Internet to complete the survey and the remainder sent back mail questionnaires. By comparison, a sub-sample surveyed by mail-only achieved a higher response rate of 71%.


Moreover, substantial differences were found between the demographic characteristics of Internet vs. mail-only respondents, but these differences were minimized by including a mail follow-up for respondents in the Internet group.

Although these results are promising, the Smyth et al. (In Press) study has several limitations as an evaluation of the potential use of ABS methods for conducting mail and especially Internet surveys. Specifically, current demographic statistics for the rural region studied were unavailable for assessing non-response error. It is also possible, if not likely, that the implementation procedures used in this study will be less effective when applied to more urbanized populations. In addition, two different contact procedures thought to be important for eliciting responses from households over the Internet – a token cash incentive delivered by mail and an illustrated Internet instruction card (web card) – were not evaluated experimentally to ascertain their effectiveness.

Our purpose in this paper is to present a more comprehensive evaluation of the data collection procedures developed by Smyth et al. (In Press) by applying them to a statewide survey of households in Washington. The survey design permitted us to compare overall response rates for Internet-plus-mail vs. mail-only methods and to evaluate non-response error by comparing demographic characteristics with those from the American Community Survey (ACS). We also experimentally tested the effects of the incentive and web card used by Smyth et al. (In Press) on response rates and non-response error. Our overriding concern is to evaluate the suitability of using ABS and mail as a means of driving respondents towards the Internet, with mail as a response alternative for those unable or unwilling to respond over the Internet.


CHAPTER 2: BACKGROUND

The Internet is becoming an increasingly important survey response mode because of lower data collection costs and faster responses compared to all other modes, among other advantages (Couper, 2000). Considerable advances have also been made in regard to coverage and people's ability to use the Internet (Horrigan, 2008), but major limitations have prevented the sole use of the Internet in conducting general public surveys. For example, although coverage rates have been increasing, they still pose a major obstacle for general public surveys. As of November 2007, approximately 62-65% of the population had Internet access from home, with an additional 10-12% having Internet access at other locations (e.g., work, school, etc.) (NTIA, 2007; Zhang, Callegaro, & Thomas, 2008; Horrigan, 2008), leaving 23-28% without access to the Internet anywhere. These numbers are 5 to 10 percentage points lower in Washington, where the current study was conducted, but coverage limitations remain considerable (NTIA, 2007).

In addition, the Internet has been found to produce consistently lower response rates than mail or telephone modes, by an average of 11% (Manfreda, Bosnjak, Berzelak, Haas, & Vehovar, 2006), particularly in older populations (Bech & Kristensen, 2009). Lower response rates have also been prevalent even when the Internet request was sent via email and when the sampled population had access to the Internet, such as college students and faculty (e.g., Sax, Gilmartin, & Bryant, 2003; Schonlau et al., 2003; Kaplowitz et al., 2004; Akl, Maroun, Klocke, Montori, & Schunemann, 2005).

The subgroups of the population without Internet access are also quite different from those with access. Reports show that those without Internet access are likely to be older, unemployed or retired, and unmarried, and to live in smaller households with lower levels of education and income compared to those with Internet access (Horrigan, 2008; Zhang et al., 2008; Rookey, Hanway, & Dillman, 2008). Moreover, professional norms prevent contacting potential survey respondents by email without an already established prior relationship with that person (CASRO, 2009), and email addresses, unlike phone numbers, are used and structured in ways that pose significant challenges to random probability sampling (Couper, 2000).

RDD, the primary method for sampling and contacting U.S. households in the late twentieth century, appears to be declining in effectiveness, with many of the same coverage and response problems associated with the Internet (Steeh, 2008; Curtin & Singer, 2005; de Leeuw & Heer, 2002). For example, as of 2009 over 20% of U.S. and Washington households did not have landlines (Blumberg & Luke, 2008; Blumberg, Luke, Davidson, Davern, Yu, & Soderberg, 2009), the traditional RDD sample frame. The majority of these are cell-only households, which present more obstacles and extra costs in obtaining survey responses (Lavrakas, Steeh, Blumberg, Boyle, Brick, Callegaro, et al., 2008).

A potential way to overcome the coverage and sampling problems of both the Internet and RDD is to use ABS. The DSF is an ABS frame comprised of a near complete and up-to-date listing of all postal addresses in the U.S. and is available for purchase at reasonable prices from private vendors (Link et al., 2008; Fahini, 2009). The DSF has a comparatively high coverage rate of up to 95-97% of U.S. households, especially in urban areas, although slightly lower rates have been found in many rural areas (Link et al., 2008; Fahini, 2009; O'Muircheartaigh, English, & Eckman, 2007; Steve, Dally, Lavrakas, Yancey, & Kulp, 2007; Link, Battaglia, Giambo, Frankel, & Mokdad, 2005a; Staab & Iannacchione, 2004; Iannacchione, Staab, & Redden, 2003).


Moreover, using additional databases such as InfoUSA, addresses from the DSF can be appended with names, with up to an 82% match rate, and with telephone numbers, with up to a 62% match rate (Fahini, 2009). Overall, ABS coverage is now considered superior to RDD and the Internet for sampling from the general public. However, unless addresses are appended with telephone numbers, using ABS requires mail contact procedures, which can be slower and costlier as a contact or survey mode compared to the Internet or telephone.

Mixing survey modes has, in general, also been found to increase response rates and improve data quality (de Leeuw, 2005; Dillman et al., 2009), although this can depend heavily on the modes used and how they are offered to respondents (Smyth et al., In Press; Israel, 2009; Gentry & Good, 2008; Grigorian & Hoffer, 2008; Schonlau et al., 2003; Griffin, Fisher, & Morgan, 2001). For example, using mail with the Internet has been found to increase the proportion of Internet respondents when the Internet is offered first, followed by a mail option (Schonlau et al., 2003; Smyth et al., In Press; Israel, 2009). The Smyth et al. (In Press) study demonstrated this by withholding the offer to respond by mail until a follow-up mailing; three-fourths of respondents used the Internet, but the overall response rate was lower than in the alternatives. The alternative of offering both Internet and mail options in the initial contact resulted in only about 20% of the 63% who responded doing so over the Internet. Offering only mail in the initial contacts resulted in less than one percent of the 71% who responded doing so over the Internet, but produced the highest response rate. Using a similar methodology, Israel (2009) reported comparable results from a non-probability study of Florida Extension clients. Thus, the evidence seems strong that offering the Internet in a follow-up or offering modes concurrently is not likely to result in a significant proportion of respondents using the Internet, but will likely result in a higher overall response rate compared to offering the Internet first.

Moreover, little is known about whether the response rates and proportion of Internet respondents obtained by including the Internet as a survey mode improve data quality. As demonstrated by ABS and Internet coverage rates, nearly everyone has a postal address but only about two-thirds of the population has Internet access, and these people are somewhat different from those without access. However, mixing mail and Internet survey modes has generally been found to reduce non-response error compared to offering one mode alone, by obtaining responses from different types of people who prefer or have access to a particular mode (Dillman et al., 2009; de Leeuw, 2005). For example, using mail and the Internet with CATI, Link & Mokdad (2006) found that respondents to an Internet version of the BRFSS were more likely to be younger and married and to have children and higher levels of education and income compared to mail respondents. Rookey et al. (2008) report similar differences between Internet and mail respondents in a panel survey by The Gallup Organization, suggesting that a combination of modes would better reduce non-response error.

Smyth et al. (In Press) found that respondents to mail and Internet modes were significantly different in regard to both demographic and technology characteristics. Demographically, mail respondents were older, retired or unemployed, less educated, had less income, were less likely to be married, and had fewer children in the household compared to Internet respondents (Smyth et al., In Press).


Mail respondents in the Smyth et al. (In Press) study were also less technologically oriented than Internet respondents: they tended to live in households with only a landline or no phone at all, reported less computer use and less access to the Internet from home, and also required more assistance with the computer and Internet compared to Internet respondents. Israel (2009) also reported similar demographic differences between mail and Internet respondents in Florida.

Differences between mail and Internet respondents should be expected given the absence of universal Internet coverage, but the broader question remains whether Internet, mail, or a combination of the two modes is actually representative of the target population. Because sample sizes for the American Community Survey are too small for the region covered by the Smyth et al. (In Press) survey, it was not possible to determine how respondents' demographic characteristics compared to the characteristics of all households in the region. The few studies that have been able to acquire data for non-response comparisons found significant levels of non-response error, using demographic differences as an indicator. For example, the study by Link & Mokdad (2006) found that respondents to the Internet/CATI and mail/CATI surveys were different than the target population, as reported by the ACS. Respondents to both mixed-mode surveys over-represented older, white females with higher levels of education and income compared to members of the target population. Rookey et al. (2008) also found comparable differences in contrasting mail and Internet panelists with data provided by the U.S. Census, in which respondents to either mode were not representative of the general population in terms of income, education, marital status, employment, age, and gender. Thus, it is likely that combining mail with the Internet can contribute to enhanced data quality, but this currently remains inconclusive.

Two implementation techniques used by Smyth et al. (In Press) may also potentially increase response rates and reduce nonresponse error, particularly for the Internet. One of these techniques is the use of token cash incentives of $5 sent with the request for a response. Token cash incentives sent with the response request have consistently been shown to increase response rates in mail and Internet surveys and are much more effective than other material incentives or payments sent after submission of a response (Ryu, Couper, & Marans, 2005; Birnholtz, Horn, Finholt, & Bae, 2004; Bosnjak & Tuten, 2003; Lesser, Dillman, Lorenz, Carlson, & Brown, 2002; Singer, 2002; Church, 1993; James & Bolstein, 1990). Token incentives in advance are not typically used in Internet surveys unless mail is the contact method, through which an incentive can be sent to the respondent.

The effects of this procedure were not examined experimentally in other studies (e.g., Smyth et al., In Press; Birnholtz et al., 2004; Bosnjak & Tuten, 2003), but we expect that cash incentives will be more effective for stimulating Internet response than mail response because of the extra effort required to respond via the Internet when the invitation is sent by mail. In using mail, potential respondents are contacted by a mode that is typically received and processed when respondents are not at their computers. The process of going from the mail delivery point to a computer with Internet access is more difficult than when a mail response is requested with an enclosed mail questionnaire or when a request is sent via email, in which case respondents are likely using their computer at the time.


Another technique that may have a positive influence on response, particularly in Internet surveys, is special instructions. Most people have Internet access, but some who do have access lack the skills for using it effectively (Stern & Dillman, 2006). We also expect that there is considerable mistrust of the authenticity of Internet surveys on the part of some people. The Smyth et al. (In Press) study used an illustrated Internet instruction card (web card) to establish trust and increase the simplicity of responding over the Internet, but the card was not tested experimentally. Lesser (2009) used a similar web card in a study of Oregonians and found that the card actually produced lower response rates compared to no card. An experiment was incorporated into the current study to test whether the kind of insert used in the Smyth et al. (In Press) and Lesser (2009) studies had an impact on response. We also tested whether different kinds of respondents, e.g., older respondents with less Internet experience, were more likely to respond when sent the web card.

Additionally, the content and timing of the mail contacts can potentially affect non-response. For example, providing respondents with too much material (e.g., letter, paper questionnaire, URL and passcode, Internet instructions, etc.) could introduce complexity into the decision-making process or increase burden, compelling respondents to disregard the survey altogether (Dillman et al., 2009a). Experimental research in non-survey situations has found that offering choices to individuals often results in people being less likely to choose any of the offered options (Schwartz, 2004). Several survey experiments have also found that providing a choice between mail and Internet produced response rates lower than a mail-only design (Gentry & Good, 2008; Grigorian & Hoffer, 2008; Griffin et al., 2001; Smyth et al., In Press). Thus, delaying the mailing of the web card and presenting respondents with one piece of pertinent information at a time may reduce complexity and burden and therefore result in more completed surveys, but no research to date has examined the effects of this method of delivery.


CHAPTER 3: METHODS

For this study, we collected data from a statewide general public household sample of addresses in Washington. Based on 2007 ACS estimates, Washington has 2,547,663 households and a population of 6,549,224 individuals (ACS, 2007). Addresses were randomly sampled from the DSF in two separate waves, one during the summer and the second during the fall of 2008. The survey was titled the "Washington Community Survey" (WCS) and included 41 questions about community satisfaction, quality of life, and where people obtain goods and services. The survey also included questions about people's Internet and cell phone usage and demographic characteristics. Internet respondents took nearly 19 minutes to complete the survey, and the mail version is expected to have required slightly more time.

Each wave of the WCS lasted about two months, with the implementation process occurring over the course of the first month and data collection extending an additional month. The contact dates and materials mailed to respondents for the six treatment groups that comprise Wave 1 are listed in Table 1. As illustrated there, all groups received a pre-notice letter in the first contact and a reminder/thank-you postcard in the third contact. The differences between the groups occur in the second and fourth contacts, varying by mode. Half of the treatment groups began with mail: the $5 Mail-only group, Treatment 1, used mail throughout the survey, and the Mail Preference groups, Treatments 2 & 3, offered a web option in the fourth contact. Groups 1 and 2 also received a $5 incentive in the second contact, which was excluded from group 3. The three Web Preference groups, Treatments 4, 5, & 6, each began with the web, followed by a mail option sent in the fourth contact. All the Web Preference groups received a letter with the survey URL and passcode, while groups 4 & 5 also received a $5 incentive and groups 5 & 6 received the web card in the second contact.

Wave 2 of the survey consisted of three additional treatment groups, as shown in Table 1, and was implemented in September and October. This wave was designed to determine whether the web card could be more effective for reducing non-response if it was mailed to respondents after the initial request to complete the survey, which was sent in the second contact, and for increasing the proportion of Internet respondents in the Mail Preference group (Treatment 9). Wave 2 utilized the same materials used in Wave 1, with the exception of the web card, which contained the respondent's passcode in Wave 2. All groups received a $5 incentive in the second contact. The third contact, sent one week after the second contact, was also somewhat different from that in Wave 1. In Wave 2 the third contact included a reminder letter for all groups and a web card for groups 8 & 9, rather than the reminder postcard used in Wave 1.

Each contact was addressed to the resident of the city or town in the postal address. Postal addresses in the DSF can be appended with names and telephone numbers, but we did not do so because research has found that including names on envelopes and letters results in negligible improvements in response rates (Link et al., 2008) and because the telephone mode was not utilized in this study.


TABLE 1: Implementation Procedures and Dates for WCS Wave 1 & 2 Treatment Groups

Wave 1 contact dates: 1st contact June 23, 2008; 2nd contact June 26, 2008; 3rd contact July 7, 2008; 4th contact July 23, 2008

(1) $5 Mail-only: 1st: Pre-notice letter; 2nd: Questionnaire1, letter, & $5; 3rd: Reminder postcard; 4th: Questionnaire1 & letter

(2) $5 Mail Preference: 1st: Pre-notice letter; 2nd: Questionnaire1, letter, & $5; 3rd: Reminder postcard; 4th: Questionnaire1 & URL letter

(3) Mail Preference, No $5: 1st: Pre-notice letter; 2nd: Questionnaire1 & letter; 3rd: Reminder postcard; 4th: Questionnaire1 & URL letter

(4) $5 Web Preference, No Web card: 1st: Pre-notice letter; 2nd: URL-only letter & $5; 3rd: Reminder postcard with URL; 4th: Questionnaire1 & URL letter

(5) $5 Web Preference, w/Web card: 1st: Pre-notice letter; 2nd: URL letter, Web card, & $5; 3rd: Reminder postcard with URL; 4th: Questionnaire1 & URL letter

(6) Web Preference, w/Web card & No $5: 1st: Pre-notice letter; 2nd: URL letter & Web card; 3rd: Reminder postcard with URL; 4th: Questionnaire1 & URL letter

Wave 2 contact dates: 1st contact September 22, 2008; 2nd contact September 29, 2008; 3rd contact October 6, 2008; 4th contact October 20, 2008

(7) $5 Web Preference, No Web card: 1st: Pre-notice letter; 2nd: URL-only letter & $5; 3rd: Reminder URL letter; 4th: Questionnaire1 & URL letter

(8) $5 Web Preference, w/Web card: 1st: Pre-notice letter; 2nd: URL-only letter & $5; 3rd: Reminder URL letter & Web card; 4th: Questionnaire1 & URL letter

(9) $5 Mail Preference, w/Web card: 1st: Pre-notice letter; 2nd: Questionnaire1, letter, & $5; 3rd: Reminder URL letter & Web card; 4th: Questionnaire1 & URL letter

Notes: 1 Return envelopes were included in each mailing with a paper questionnaire.

The sample for the survey consisted of 5,400 randomly selected residential addresses in Washington, 3,500 for Wave 1 and 1,900 for Wave 2. Business, seasonal, and vacant addresses were excluded from the sample frame to ensure that the sampled addresses were residential and were also more likely to belong to full-time citizens and residents of Washington. For random selection at the household level, a request that the adult with the most recent birthday fill out the survey was included in the contact letters and on the questionnaire. This method of within-household sampling has been found to be as effective as some alternatives, including any adult and all adults in the household (Battaglia, Link, Frankel, Osborn, & Mokdad, 2008).

The sample was also stratified to include a higher proportion of people residing in rural counties in Washington. Over two-thirds of Washingtonians reside in five of the thirty-nine counties in the state (ACS, 2008; Albrecht, 2008); we obtained 50% of the sampled addresses from these five most populous, urban counties and drew the other 50% from the remaining 34 non-urban counties. Post-stratification weights were applied to the data to offset the effects of the disproportionate number of rural respondents on the representativeness of the results.
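As a rough illustration of how such weights can offset the rural oversample, the sketch below computes one weight per stratum as the ratio of the stratum's population share to its sample share. The population shares, the 50/50 allocation, and the function name are illustrative assumptions rather than the actual WCS weighting specification.

    # Hypothetical sketch of stratum weights for the urban/rural oversample described above.
    # Shares and counts are illustrative assumptions, not the actual WCS figures.

    def stratum_weights(pop_share, sample_counts):
        """Weight each stratum so its weighted share matches its population share."""
        n_total = sum(sample_counts.values())
        return {
            stratum: share / (sample_counts[stratum] / n_total)
            for stratum, share in pop_share.items()
        }

    # Roughly two-thirds of Washington households are in the five urban counties,
    # but the design allocated about half of the 5,400 addresses to each stratum.
    pop_share = {"urban": 0.67, "rural": 0.33}
    sample_counts = {"urban": 2700, "rural": 2700}

    print(stratum_weights(pop_share, sample_counts))
    # Urban respondents receive weights above 1, rural respondents below 1.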

The paper questionnaires and web pages used in these experiments were designed using a uni-mode construction, as outlined by Dillman et al. (2009). For example, the same graphical features, questions, and question order were used in each mode. Each question on the mail questionnaire was also built within a separate field that appeared similar to the ones used in the Internet survey with regard to shape and color.


Cascading style sheet construction was used to maintain visual similarity of web pages across respondent browsers and screen configurations. The contact materials, including the letters, postcards, and envelopes, were also designed similarly. The design details of these components are provided in Messer (2009) and follow closely those used by Smyth et al. (In Press). See Appendices I and II for illustrations of these components.

Several statistical analyses were conducted with the WCS data to determine the effects of the different implementation methods used in the WCS. Comparisons of response rates and demographic data between respondents to different treatment groups provide measures of the levels of non-response. Demographic variables include gender, age, education, race/ethnicity, children in household, number in household, marital status, employment status, and household income. Furthermore, demographic data on these variables from the WCS were compared with the 2008 data for Washington from the ACS to discern whether respondents to the WCS are representative of the general state population. The ACS, which replaced the Census Long Form in 2001, uses an ABS frame to sample about 3 million households across the U.S., and mail, CATI, and CAPI methods are employed in sequential stages to conduct the survey (ACS, 2008). WCS cost and time analyses are also presented as measures of efficiency for using mail and Internet modes.
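The demographic comparisons reported later as chi-square statistics in Tables 3 and 4 are of the kind sketched below. The cell counts in this example are hypothetical and do not come from the WCS.

    # Illustrative chi-square comparison of a demographic distribution between two
    # treatment groups. The counts are hypothetical, not actual WCS data.
    from scipy.stats import chi2_contingency

    observed = [
        # High school or less, Some college, College degree, Prof/Grad degree
        [48, 65, 100, 42],   # respondents in the incentive group
        [36, 67,  99, 57],   # respondents in the non-incentive group
    ]

    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")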

In addition, corresponding treatment groups between waves of data collection are combined, where indicated, for certain response rate and non-response error analyses. Since Wave 1 and Wave 2 of the WCS were conducted only three months apart and used primarily the same implementation protocols, it seemed reasonable to combine corresponding groups. Initial tests were conducted, but are not reported here, to determine whether combining corresponding treatment groups between waves was practical.


CHAPTER 4: RESULTS

4.1 WCS Response Rates

4.1.1 Does mail or a combination of mail and the Internet used with the DSF obtain reasonable response rates in statewide household surveys?

Response rates for this statewide DSF sample are comparable to, if not higher than, those achieved in many recent telephone or Internet surveys (Curtin, Presser, & Singer, 2005; de Leeuw & Heer, 2002; Steeh, Kirgis, Cannon, & DeWitt, 2001; Manfreda et al., 2006), but were modestly lower than those obtained by Smyth et al. (In Press) using similar methods. The Mail Preference group obtained a response rate of 53.6% (compared to 71.1% in the Smyth et al. [In Press] study), and the Mail-only group obtained the highest response rate, at 56.7%. The Web Preference group, in which a mail option was withheld until the fourth contact, obtained 46.3% (compared to 55% in the earlier study). The Wave 2 rates were similar to those obtained in Wave 1: 55.0% (compared to 53.6%) for the Mail Preference treatment and 42.9% (compared to 46.3%) for the Web Preference group. In the Web Preference groups, nearly two-thirds of respondents in both Waves 1 and 2 responded over the Internet, proportions comparable to those obtained by Smyth et al. (In Press). In today's survey environment these response rates seem credible, notwithstanding the analysis of non-response error discussed later in this paper.

TABLE 2: WCS Coverage, Total Completes, and Response Rates by Wave, Treatment Group, and Survey Mode, with Significance Tests between Selected Treatment Groups and Modes

Treatment Groups                        Sample Size N   N – Undeliverables   1st Mode % (Proportion)    2nd Mode % (Proportion)    Total* % (n)

Wave 1
(1) $5 Mail-only                        500             471                  Mail: 56.7 (1.0)           --                         56.7 (267)
(2) $5 Mail Pref.                       500             474                  Mail: 52.5 (0.98)          Internet: 1.1 (0.02)       53.6 (254)
(3) Mail Pref., No $5                   700             648                  Mail: 39.2 (0.98)          Internet: 0.9 (0.02)       40.1 (260)
(4) $5 Web Pref., No Web card           600             554                  Internet: 28.5 (0.67)      Mail: 14.3 (0.33)          42.8 (237)
(5) $5 Web Pref., w/Web card            500             464                  Internet: 31.3 (0.67)      Mail: 15.0 (0.33)          46.3 (215)
(6) Web Pref., w/Web card & No $5       700             643                  Internet: 13.4 (0.52)      Mail: 12.3 (0.48)          25.7 (165)

Wave 2
(7) $5 Web Pref., No Web card           700             667                  Internet: 27.8 (0.62)      Mail: 16.9 (0.38)          44.7 (298)
(8) $5 Web Pref., w/Web card            700             665                  Internet: 26.2 (0.61)      Mail: 17.0 (0.39)          42.9 (285)
(9) $5 Mail Pref., w/Web card           500             476                  Mail: 51.9 (0.94)          Internet: 3.1 (0.06)       55.0 (262)


TABLE 2, continued: Significance Tests between Selected Treatment Groups, by Survey Mode, z (p)

                              Internet Card                                           $5 Incentive
Mode                          (4) No card vs. (5) Card   (7) No card vs. (8) Card     (2) $5 Mail vs. (3) Mail   (4&5) $5 Web vs. (6) Web
Mail                          0.37 (.355)                0.05 (.481)                  4.44 (.000)                1.35 (.089)
Web                           0.95 (.171)                0.71 (.240)                  0.22 (.414)                7.22 (.000)
Mail & Web Combined           1.14 (.128)                0.67 (.252)                  4.47 (.000)                7.15 (.000)

*Response rate = number of completes/(N – undeliverables).
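For reference, the response rates in Table 2 follow the formula in the table note, and the z statistics are consistent with a pooled two-proportion test. The short sketch below, using the completes and eligible sample sizes reported in Table 2 for Treatments 2 and 3, should approximately reproduce the combined-mode z value of 4.47; the function names are ours, not the report's.

    # Response rate and pooled two-proportion z test, using figures from Table 2.
    from math import sqrt

    def response_rate(completes, sampled, undeliverables):
        """Response rate = completes / (N - undeliverables)."""
        return completes / (sampled - undeliverables)

    def two_proportion_z(x1, n1, x2, n2):
        """Pooled two-proportion z statistic for comparing two response rates."""
        p1, p2 = x1 / n1, x2 / n2
        p_pool = (x1 + x2) / (n1 + n2)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    print(response_rate(254, 500, 26))           # Treatment 2: about 0.536
    print(two_proportion_z(254, 474, 260, 648))  # about 4.5, Treatment 2 vs. 3, combined modes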

4.1.2 Does the inclusion and/or timing of an Internet instruction card increase response rates?

The Internet instruction card was designed to attract more respondents to the Internet, particularly those unaccustomed to using it. It appears from Table 2, however, that the web card produced mixed results in terms of increasing response in the Web Preference groups. In Wave 1, the $5 web card group (Treatment 5) had a higher response rate than the $5 no web card group (Treatment 4), with a difference of 3.4 percentage points in the total response rate and 2.8 percentage points between Internet respondents to the two groups. However, as shown in Table 2, the differences in the total, mail, and Internet response rates are not statistically significant.

Moreover, delaying the mailing of the web card in Wave 2 in order to reduce complexity produced contradictory results. In this test the $5 no web card group (Treatment 7) had a slightly higher (1.8 percentage points), but not significantly different, response rate than the $5 web card group (Treatment 8). Thus, reducing complexity by delaying delivery of the web card appears to have had no discernible effect. In addition, in both Waves the proportion of Internet respondents was virtually the same between groups with and without the web card, further indicating that the card had little to no effect on Internet response.

4.1.3 Do cash incentives increase response rates for incentive vs. non-incentive mail and Internet respondents?

As shown in Table 2, the $5 incentive Mail Preference and Web Preference groups each obtained response rates significantly higher than the corresponding non-incentive groups. In the Mail Preference groups (Treatments 2 vs. 3), the difference was 13.5 percentage points, with 53.6% for the incentive group and 40.1% for the group without the incentive. The incentive appears to have an even larger impact on Internet respondents. The response rate for the $5 Web Preference group (Treatment 5) was 46.3% and the rate for the non-incentive group (Treatment 6) was only 25.7%, a 20.6 percentage point difference. Moreover, people were more likely to respond on the Internet (rather than by mail) when the incentive was included: 67% of the respondents in the $5 treatment group went to the Internet as expected, compared to only 52% of all respondents in the non-incentive group. Thus, the token cash incentive in advance was particularly important not only in improving response, but also in driving people to the Internet. There was no differential mode effect in the Mail Preference groups, in which 98% of all responses were by mail regardless of the incentive.


4.2 Non-response Error

We compare the demographic characteristics of respondents to the different experimental groups to test for non-response error and to gain further insight into whether the differences in response rates between the different groups are important. First, incentive and non-incentive Mail Preference and Web Preference groups are contrasted to evaluate the impact of the incentive on non-response error. Second, comparisons are made between Web Preference groups to determine whether the inclusion of the web card affected non-response error. Third, mail, Internet, and mail and Internet combined respondents are compared to determine in what ways, if any, they differ. Finally, mail and Internet respondents are contrasted with population characteristics of Washington, as provided by the ACS, to test the overall representativeness of mail and Internet WCS respondents.

4.2.1 Do cash incentives reduce non-response error between incentive vs. non-incentive mail and Internet respondents?

Given the large disparities in response rates between incentive and non-incentive groups, it is very likely that non-incentive group respondents are also different types of people compared to corresponding incentive group respondents. As shown in Table 3, however, few significant differences exist. In the Mail Preference group comparisons, more non-incentive respondents were employed, used the computer and Internet, and had Internet in the household compared to incentive mail respondents. Non-incentive mail respondents also had slightly higher levels of income and education and were more likely to have children in the household, but these differences are not statistically significant, possibly due to smaller sample sizes.

TABLE 3: Characteristics of WCS Respondents to the Mail Incentive vs. No Incentive Groups and Web Incentive vs. No Incentive Groups

                                    1&2: $5 Mail Pref. vs. 3: Mail Pref., No $5      4&5: $5 Web Pref. vs. 6: Web Pref., No $5
                                    $5      No $5    Diff.    x2 (p)                 $5      No $5    Diff.    x2 (p)
Gender (%)
  Male                              39.8    40.8     +1.0     0.05 (.827)            39.8    42.4     +2.6     0.22 (.643)
  Female                            60.2    59.2     -1.0                            60.2    57.6     -2.6
Education (%)
  High school or less               18.8    13.9     -4.9     1.20 (.309)            21.9    9.6      -12.3    7.94 (.000)
  Some college, no degree           25.5    26.0     +0.5                            31.0    17.1     -13.9
  Coll degree (2yr or 4yr)          39.3    38.1     -1.2                            32.4    54.1     +21.7
  Prof/Grad degree                  16.4    22.0     +5.6                            14.7    19.2     +4.5
Age (%)
  18-34                             12.1    15.7     +3.6     0.87 (.458)            16.8    14.8     -2.0     1.13 (.336)
  35-54                             39.5    35.2     -4.3                            29.8    37.1     +7.3
  55-64                             23.5    27.2     +3.7                            23.0    16.2     -6.8
  65+                               24.9    21.9     -3.0                            30.4    31.9     +1.5
Race (%)
  Non-Hispanic Whites               85.2    83.8     -1.4     0.15 (.700)            79.6    80.0     +0.4     0.01 (.934)
  All Others                        14.8    16.2     +1.4                            20.4    20.0     -0.4
Children in Household (% Yes)       39.5    45.1     +5.6     1.04 (.309)            35.1    33.8     -1.3     0.05 (.830)
Number in Household (%)
  1 person                          26.0    19.0     -7.0     0.77 (.625)            24.6    31.4     +6.8     1.25 (.285)
  2 persons                         33.6    34.1     +0.5                            39.8    32.7     -7.1
  3 persons                         14.0    18.7     +4.7                            17.5    10.9     -6.6
  4 persons                         16.8    17.9     +1.1                            12.8    19.9     +7.1
  5+ persons                        9.6     10.3     +0.7                            5.3     5.1      -0.2
Married (% Yes)                     59.3    62.0     +2.7     0.33 (.564)            53.4    52.9     -0.5     0.01 (.924)
Employed (% Yes)                    57.5    65.1     +7.6     2.66 (.104)            56.5    62.0     +5.5     0.93 (.335)
Urban1 (% Yes)                      62.8    63.8     +1.0     0.06 (.807)            66.5    62.7     -3.8     0.61 (.436)
Household Income (%)
  < $10K/year                       4.1     1.9      -2.2     1.09 (.363)            2.3     6.0      +3.7     0.99 (.424)
  $10K to < $25K/year               12.8    8.2      -4.6                            14.0    11.1     -2.9
  $25K to < $50K/year               23.0    21.9     -1.1                            30.6    23.6     -7.0
  $50K to < $75K/year               22.1    22.3     +0.2                            16.7    20.6     +3.9
  $75K to < $100K/year              16.5    18.8     +2.3                            17.8    17.1     -0.7
  $100K/year or >                   21.5    26.9     +5.4                            18.6    21.6     +3.0
Computer/Internet Status (% Yes)
  Use Computer                      84.1    89.8     +5.7     3.34 (.068)            81.8    87.3     +5.5     1.92 (.166)
  Use Internet                      83.0    89.5     +6.5     4.36 (.037)            84.8    88.5     +3.7     0.94 (.334)
  Assistance w/Internet2            25.4    21.9     -3.5     0.68 (.409)            21.5    15.3     -6.2     1.82 (.178)
  Internet in Household             82.3    88.0     +5.7     3.00 (.084)            84.9    89.1     +4.2     1.19 (.276)
  High-speed Internet3              86.2    82.8     -3.4     0.83 (.362)            84.2    85.7     +1.5     0.11 (.742)
Telephone Status (%)
  No Phone                          0.9     2.1      +1.2     1.07 (.371)            1.9     0.4      -1.5     1.39 (.236)
  Cell-only                         15.9    16.5     +0.6                            20.8    17.6     -3.2
  Landline-only                     12.7    9.4      -3.3                            13.0    13.3     +0.3
  Landline & Cell                   66.7    69.6     +2.9                            62.3    64.2     +1.9
  Unknown                           3.8     2.4      -1.4                            2.0     4.5      +2.5
Communicate w/Others4 (%)
  Face to face                      8.3     6.1      -2.2     0.40 (.755)            10.5    8.4      -2.1     0.30 (.828)
  Phone                             65.6    68.2     +2.7                            63.1    62.5     -0.6
  Internet                          22.8    23.1     +0.3                            23.3    26.4     +3.1
  Other                             3.2     2.6      -0.6                            3.1     2.7      -0.4
Community Satisfaction5 (%)
  Complete/Mostly Satisfied         82.7    83.3     +0.6     0.76 (.469)            85.6    84.1     -1.5     0.94 (.384)
  Somewhat/Not at all Satisfied     17.3    16.7     -0.6                            14.4    15.9     +1.5

Notes: Bold = p≤.10; 1Urban respondents had a postal address in one of the five most populous counties in Washington; 2Respondents reporting needing assistance with the Internet chose "All the time," "Frequently," or "Occasionally" on a scale of "All the time, Frequently, Occasionally, Rarely, Never," and also reported using the Internet at some place; 3Respondents reported having Internet access in the household; 4Respondents reported the method of communication most often used to communicate with others inside and outside of the respondent's community. The most common method for both is reported; 5Respondents reported overall satisfaction with their community on a scale of "Completely, Mostly, Somewhat, and Not at all" satisfied.


For the Web Preference groups, a greater proportion of non-incentive respondents had higher levels of education compared to incentive respondents. Non-incentive respondents also appear to be slightly younger, more likely to be employed, more frequent computer users, and less likely to require assistance with the Internet, although these comparisons are not significant either. Interestingly, levels of income were very similar between incentive and non-incentive respondents in the Web Preference groups. Overall, non-incentive mail and web respondents seem more similar to incentive respondents than they are different. Thus, while the incentive considerably increases response rates, it may not as noticeably reduce non-response error.

4.2.2 Does the inclusion and/or timing of an Internet instruction card reduce non-response error?

Although the web card did not significantly increase response rates, it could still be effective at reducing non-response error. However, we found very few significant differences (p ≤ .10) between respondents with the web card vs. those without the web card in either Wave of the survey. The comparisons are not illustrated in a table due to the lack of significant differences. The only significant difference in Wave 1 (Treatments 4 vs. 5) was that a greater proportion of web card respondents lived in cell-only households. In Wave 2 (Treatments 7 vs. 8), in which the web card was delayed, the only difference was that web card respondents reported needing less assistance with the Internet, which is somewhat counterintuitive. We also found this to be the case when comparing across Waves (Treatments 4 & 7 vs. 5 & 8).

On the other hand, larger sample sizes could possibly result in more significant differences. Several differences were greater than 5.0 percentage points but did not achieve statistical significance. Moreover, inconsistencies were also evident between Waves. For example, in Wave 1 the web card group comprised a higher proportion of older females with less education and lower employment compared to the group without the web card. In the Wave 2 comparison, differences in gender were much smaller, and respondents in the web card group reported more education and employment and were also more likely to be married compared to respondents without the web card. When the corresponding groups are combined across Waves, these differences are much less pronounced and do not approach statistical significance. Thus, while it appears that some differences do exist, the web card did not have the overall intended effect of consistently convincing different types of people to respond via the Internet.

The same three comparisons were also conducted for mail respondents to each $5 Web Preference group. Although the card does not appear to affect Internet respondents, it could possibly have had an indirect effect by steering different types of people away from or towards mail. However, no significant differences (p ≤ .10) were found between mail respondents in the web card group vs. those without the web card in Wave 1 (Treatments 4 vs. 5). In the Wave 2 comparisons (Treatments 7 vs. 8), the only differences were that web card respondents were less likely to have children in the household or be employed and were more likely to have Internet service in the household, although, for the latter, both proportions were greater than 85%. When mail respondents to the $5 Web Preference groups are combined and compared across waves (Treatments 4&5 vs. 7&8), there were no significant differences between Wave 1 and Wave 2 respondents. These results are also not reported in a table due to the absence of significant differences, but, because of the small sample sizes, the results are not conclusive.


4.2.3 Are Internet respondents different types of people than mail respondents?

Before examining non-response error between mail and web modes, demographics of the various treatment groups were evaluated for potential differences. As demonstrated above, very few differences exist between respondents to the $5 Web Preference groups with and without the card (Treatments 4, 5, 7, & 8). As a result, these are combined for further analyses and will be referred to as the $5 Web Preference groups, unless otherwise indicated. We also found that respondents to the $5 Mail groups (Treatments 1, 2, and 9) were quite similar, with no significant differences existing in the two-way comparisons between groups and Waves (1 vs. 2, 2 vs. 9, and 1 vs. 9) for the demographic variables available to us. Given the lack of any differences, we combined all of these Mail groups together for further analyses and the combined groups will be referred to as the $5 Mail groups.

The first test of non-response error was conducted by comparing web respondents with mail follow-up respondents in the $5 Web Preference groups. This test established whether respondents to the mail follow-up, which was sent three weeks after the Internet request, were different from respondents to the Internet. As reported in Table 4, this appears to be the case. Mail follow-up respondents are significantly more likely (p ≤ .10) to be older, unmarried, and unemployed, to have lower levels of education and income, to have no children in the household, to share the household with fewer people, and to live in rural counties. Technologically, mail follow-up respondents use the computer and Internet less often, require assistance with the Internet more often, lack Internet access and high-speed access in the household, have landline phone service and no cell phone service, and communicate with others most often over the phone, compared to web respondents. Of particular importance, only about 65% of mail follow-up respondents reported having Internet access in the household, which is close to the national average, while nearly all web respondents (96.3%) reported Internet in the household. Finally, mail follow-up respondents are less satisfied overall with their community compared to web respondents, which is one of the primary topics of the survey. Differences greater than 5 percentage points are in bold in Table 4. These multiple differences indicate that a mail follow-up may reduce non-response error by obtaining very different types of people. It may also be necessary to use mail as a follow-up to ensure that potential respondents without Internet access can actually respond to the survey.

In a second test of non-response error, web-only respondents to the $5 Web Preference groups are compared with mail-only respondents in the $5 Mail groups to establish whether respondents to either mode are different, regardless of whether a follow-up mode is provided. In this comparison, both types of respondents received the corresponding mail or Internet survey first, not as the follow-up mode. Table 4 shows differences similar to those found between web respondents and mail follow-up respondents, but they are much less pronounced. For example, "gender," "children in household," and "urban" completely lose statistical significance (p ≤ .10), while other differences remain statistically significant at the p ≤ .10 level but are much smaller in absolute terms. The only exception is community satisfaction, for which the difference between web-only and mail-only respondents is somewhat larger than that between web and mail follow-up respondents.


TABLE 4: Characteristics of WCS Respondents to Web vs. Mail in the $5 Web Preference Groups, to Web vs. Mail between the $5 Web Preference and $5 Mail Preference Groups, and to Web & Mail Combined $5 Web Preference Groups vs. $5 Mail Preference Groups

                                    $5 Web Pref.:                          $5 Web Pref. (Web-only) vs.            $5 Web Pref. (Web + Mail) vs.
                                    Web-only vs. Mail Follow-up            $5 Mail Pref. (Mail-only)              $5 Mail Pref. (Mail-only)
                                    Web     Mail    Diff.   x2 (p)         Web     Mail    Diff.   x2 (p)         Web+Mail  Mail    Diff.   x2 (p)
Gender (%)
  Male                              43.2    34.8    -8.4    6.09 (.014)    43.2    39.3    -3.9    2.04 (.153)    40.4      39.8    -0.6    0.06 (.800)
  Female                            56.8    65.2    +8.4                   56.8    60.7    +3.9                   59.6      60.2    +0.6
Education (%)
  High school or less               12.1    29.8    +17.7   15.49 (.000)   12.1    20.6    +8.5    5.98 (.001)    18.0      20.1    +2.1    0.45 (.715)
  Some college, no degree           28.0    27.6    +0.4                   27.9    28.2    +0.3                   27.8      27.9    +0.1
  Coll degree (2yr or 4yr)          41.6    31.0    -10.6                  41.6    35.4    -6.2                   38.0      36.0    -2.0
  Prof/Grad degree                  18.4    11.6    -6.8                   18.4    15.8    -2.6                   16.2      16.0    -0.2
Age (%)
  18-34                             24.0    8.7     -15.3   27.22 (.000)   24.0    13.5    -10.5   11.30 (.000)   18.9      14.1    -4.8    3.05 (.027)
  35-54                             39.2    28.1    -11.1                  39.3    36.1    -3.2                   35.7      36.0    +0.3
  55-64                             20.1    22.6    +2.5                   20.0    25.4    +5.4                   20.8      25.5    +4.7
  65+                               16.7    40.6    +23.9                  16.7    25.0    +8.3                   24.6      24.4    -0.2
Race (%)
  Non-Hispanic Whites               82.5    84.5    +2.0    0.51 (.465)    82.5    82.6    +0.1    0.00 (.949)    83.1      82.6    -0.5    0.04 (.835)
  All Others                        17.5    15.5    -2.0                   17.5    17.4    -0.1                   16.9      17.4    +0.5
Children in Household (% Yes)       41.7    29.2    -12.5   9.28 (.002)    41.7    37.4    -4.3    1.79 (.182)    37.9      37.8    -0.1    0.00 (.972)
Number in Household (%)
  1 person                          19.5    32.8    +13.3   2.94 (.002)    19.5    28.2    +8.7    2.56 (.009)    23.4      28.5    +5.1    1.46 (.160)
  2 persons                         35.7    37.1    +1.4                   35.7    32.0    -3.7                   35.9      31.3    -4.6
  3 persons                         14.5    11.7    -2.8                   14.5    17.1    +2.6                   13.8      17.3    +3.5
  4 persons                         21.4    10.7    -10.7                  21.4    13.5    -7.9                   18.2      13.8    -4.4
  5+ persons                        8.9     7.7     -1.2                   8.9     9.2     +0.3                   8.7       9.1     +0.4
Married (% Yes)                     63.1    47.2    -15.9   20.91 (.000)   63.1    55.9    -7.2    6.52 (.011)    57.8      55.9    -1.9    0.54 (.464)
Employed (% Yes)                    68.4    47.5    -20.9   36.89 (.000)   68.4    59.3    -9.1    11.31 (.001)   61.4      59.5    -1.9    0.59 (.443)
Urban1 (% Yes)                      67.6    60.0    -7.6    6.02 (.014)    67.6    65.0    -2.6    1.17 (.279)    65.0      65.8    +0.8    0.11 (.739)
Household Income (%)
  < $10K/year                       1.9     6.6     +4.1    10.94 (.000)   1.9     4.9     +3.0    3.20 (.007)    3.4       4.8     +1.4    0.60 (.700)
  $10K to < $25K/year               9.6     22.2    +12.6                  9.6     13.2    +3.6                   13.5      13.1    -0.4
  $25K to < $50K/year               22.6    30.3    +7.7                   22.6    24.9    +2.3                   25.2      25.3    +0.1
  $50K to < $75K/year               23.0    16.2    -6.8                   23.0    22.4    -0.6                   20.7      22.3    +1.6
  $75K to < $100K/year              19.6    13.6    -6.0                   19.6    16.0    -3.6                   17.6      15.7    -1.9
  $100K/year or >                   23.3    11.1    -12.2                  23.3    18.6    -4.7                   19.6      18.8    -0.8
Computer/Internet Status (% Yes)
  Use Computer                      98.6    63.1    -35.5   210.02 (.000)  98.6    84.6    -14.0   73.58 (.000)   86.9      85.1    -1.8    1.03 (.310)
  Use Internet                      99.2    65.2    -34.0   232.49 (.000)  99.2    85.1    -14.1   89.38 (.000)   88.2      85.6    -2.6    2.34 (.126)
  Assistance w/Internet2            17.1    33.1    +16.0   23.78 (.000)   17.1    24.8    +7.7    10.54 (.001)   21.1      24.4    +3.3    2.11 (.147)
  Internet in Household             96.3    65.3    -31.0   153.02 (.000)  96.3    83.5    -13.3   53.25 (.000)   86.1      83.9    -2.2    1.61 (.204)
  High-speed Internet3              89.3    71.1    -18.2   34.29 (.000)   89.3    85.3    -4.0    4.05 (.044)    85.0      85.9    +0.9    0.21 (.647)
Telephone Status (%)
  No Phone                          0.5     2.8     +2.3    21.55 (.000)   0.5     1.2     +0.7    6.96 (.000)    1.3       1.1     -0.2    0.67 (.610)
  Cell-only                         22.3    13.0    -9.3                   22.3    16.6    -5.7                   19.3      16.7    -2.6
  Landline-only                     6.9     23.2    +16.3                  6.9     13.4    +6.5                   12.3      13.2    +0.9
  Landline & Cell                   70.0    56.9    -13.1                  70.0    66.5    -3.5                   65.3      66.6    +1.3
  Unknown                           0.6     4.1     +3.5                   0.6     2.3     +1.7                   1.8       2.4     +0.6
Communicate with Others4 (%)
  Face to face                      12.6    9.2     -3.4    18.11 (.000)   12.6    8.0     -4.6    8.33 (.000)    11.5      7.9     -3.6    2.27 (.079)
  Phone                             53.0    77.6    +24.6                  53.0    67.2    +14.2                  60.9      66.4    +5.5
  Internet                          30.8    12.0    -18.8                  30.8    22.3    -8.5                   24.7      22.8    -1.9
  Other                             3.6     1.3     -2.3                   3.6     2.5     -1.1                   2.8       2.8     --
Community Satisfaction5 (%)
  Complete/Mostly Satisfied         88.6    81.6    -7.0    8.24 (.004)    88.6    81.4    -7.2    12.95 (.000)   86.1      81.8    -4.3    5.41 (.020)
  Somewhat/Not at all Satisfied     11.4    18.4    +7.0                   11.4    18.6    +7.2                   13.9      18.2    +4.3

Notes: Bold = p≤.10; 1Urban respondents had a postal address in one of the five most populous counties in Washington; 2Respondents reporting needing assistance with the Internet chose "All the time," "Frequently," or "Occasionally" on a scale of "All the time, Frequently, Occasionally, Rarely, Never," and also reported having access to the Internet at some place; 3Respondents reported having Internet access in the household; 4Respondents reported the method of communication most often used to communicate with others inside and outside of the respondent's community. The most common method for both is reported; 5Respondents reported overall satisfaction with their community on a scale of "Completely, Mostly, Somewhat, and Not at all" satisfied.


Finally, $5 Mail group respondents were compared with the combined web and mail follow-up respondents to the $5 Web Preference groups. This test assesses whether a mail follow-up to a web survey reduces non-response differences relative to a mail-only survey. The third main column in Table 4 reports only three significant differences at the p ≤ .10 level: age, communication with others, and community satisfaction. $5 Mail respondents were slightly older, less likely to communicate with others over the phone, and more satisfied with their communities compared to $5 Web Preference respondents. All other differences are relatively much smaller and lose statistical significance.

In sum, web respondents are quite different from mail respondents, whether mail is used alone or as a follow-up to the Internet. Combining mail and web respondents, however, seems to obtain a sample of respondents approximately similar to that obtained via mail-only. Thus, while mail adds value to Internet surveys by reducing non-response, the opposite design of offering the Internet to mail non-respondents does not. It appears that the types of people who might be expected to respond over the Internet have already responded to the offer of mail.

4.2.4 Are mail, Internet, or mail and Internet combined respondents representative of the general public?

To gain insight into whether respondents to the $5 Mail Preference and $5 Web Preference treatment groups are representative of the general population in Washington, three comparisons were made between WCS data and the 2008 annual estimates for Washington provided by the ACS. The ACS provides demographic estimates for the general population of Washington (and of the U.S. as a whole), and it uses an address-based sample frame maintained by the U.S. Census Bureau (ACS, 2008). WCS data are weighted on gender, age, and education using ACS estimates, as these methods have been found to at least partially account for non-response error (Biemer & Christ, 2008; Link et al., 2008). Without these weights, differences were even more pronounced on most demographic comparisons, including gender, age, and education, with all groups over-representing older females with higher levels of education (not shown in a table).
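As an illustration of the weighting approach described above, the following Python sketch computes simple post-stratification weights by dividing each gender-by-age-by-education cell’s ACS population share by its share of the responding sample. All column names, categories, and benchmark shares are hypothetical placeholders, not the actual WCS weighting specification.

    # Minimal post-stratification sketch; not the actual WCS weighting code.
    # weight = (ACS share of respondent's cell) / (sample share of that cell)
    from collections import Counter
    import pandas as pd

    respondents = pd.DataFrame({
        "gender":    ["F", "M", "F", "M", "F", "M"],
        "age_group": ["18-44", "45+", "45+", "18-44", "45+", "45+"],
        "education": ["some_college_or_less", "2yr_degree_or_more",
                      "2yr_degree_or_more", "some_college_or_less",
                      "2yr_degree_or_more", "some_college_or_less"],
    })

    # Hypothetical ACS benchmark shares for each cell (must sum to 1.0).
    acs_share = {
        ("F", "18-44", "some_college_or_less"): 0.16, ("F", "18-44", "2yr_degree_or_more"): 0.09,
        ("F", "45+",   "some_college_or_less"): 0.17, ("F", "45+",   "2yr_degree_or_more"): 0.09,
        ("M", "18-44", "some_college_or_less"): 0.17, ("M", "18-44", "2yr_degree_or_more"): 0.08,
        ("M", "45+",   "some_college_or_less"): 0.16, ("M", "45+",   "2yr_degree_or_more"): 0.08,
    }

    cells = list(zip(respondents["gender"], respondents["age_group"], respondents["education"]))
    sample_share = {cell: n / len(cells) for cell, n in Counter(cells).items()}
    respondents["weight"] = [acs_share[c] / sample_share[c] for c in cells]
    print(respondents)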

The results presented in Table 5 indicate that different combinations of respondents are more representative on some variables than on others. All groups over-represent households with children, with the $5 Web Preference respondents slightly closer to the ACS benchmark than the $5 Mail-only and $5 Web-only respondents. Results for household size are more mixed, but the $5 Web Preference groups are closest overall. Finally, the $5 Web Preference and $5 Mail-only respondents are also slightly closer to the population in terms of household income. Overall, respondents to the $5 Web Preference groups appear most representative of the general population, although the WCS $5 Mail-only and Web-only respondents are comparably representative when weighting methods are used.


Table 5: Comparison of Weighted1 Respondent Characteristics for $5 Mail-only, $5 Web-only, and $5 Web Preference Groups vs. 2008 American Community Survey (ACS) Data for Washington

                               $5 Mail Pref. (Mail-only)   $5 Web Pref. (Web-only)    $5 Web Pref. (Web + Mail)
                               vs. 2008 ACS                vs. 2008 ACS               vs. 2008 ACS
                               Mail    ACS   Diff.2        Web     ACS   Diff.2       Web+Mail  ACS   Diff.2

Gender1 (%)
  Male                         49.4   49.4    --           49.4   49.4    --           49.4   49.4    --
  Female                       50.6   50.6    --           50.6   50.6    --           50.6   50.6    --

Age1 (%)
  18-44                        49.0   49.0    --           49.0   49.0    --           49.0   49.0    --
  45+                          51.0   51.0    --           51.0   51.0    --           51.0   51.0    --

Education1 (%)
  Some college or less         63.0   63.0    --           63.0   63.0    --           63.0   63.0    --
  2 Yr Degree or more          37.0   37.0    --           37.0   37.0    --           37.0   37.0    --

Race1 (%)
  Non-Hispanic Whites          80.7   78.8   +1.9          81.1   78.8   +2.3          80.3   78.8   +1.5
  All Others                   19.3   21.2   -1.9          18.9   21.2   -2.3          19.7   21.2   -1.5

Children in Household (% Yes)  44.0   33.1  +10.9          41.9   33.1   +8.8          40.8   33.1   +7.7

Number in Household (%)
  1 person                     25.3   27.8   -2.5          19.3   27.8   -8.5          22.5   27.8   -5.3
  2 persons                    27.7   34.8   -7.1          33.8   34.8   -1.0          32.9   34.8   -1.9
  3 persons                    18.7   15.3   +3.4          16.4   15.3   +1.1          15.6   15.3   +0.3
  4 persons                    14.4   13.3   +1.1          19.5   13.3   +6.2          18.4   13.3   +5.1
  5+ persons                   13.9    8.8   +5.1          11.0    8.8   +2.2          10.6    8.8   +1.8

Married (% Yes)                53.1   54.7   -1.6          58.3   54.7   +3.6          53.4   54.7   -1.3

Employed3 (% Yes)              63.1   65.9   -2.8          68.1   65.9   +2.2          64.0   65.9   -1.9

Household Income (%)
  < $10K/year                   5.6    6.3   -0.7           2.4    6.3   -3.9           4.1    6.3   -2.2
  $10K to < $25K/year          15.1   14.0   +1.1          11.1   14.0   -2.9          14.1   14.0   +0.1
  $25K to < $50K/year          24.7   24.3   +1.4          28.4   24.3   +4.1          30.1   24.3   +5.8
  $50K to < $75K/year          24.0   20.1   +3.9          22.2   20.1   +2.1          20.1   20.1    --
  $75K to < $100K/year         15.3   13.4   +1.9          16.1   13.4   +2.7          14.8   13.4   +1.4
  $100K/year or >              15.3   21.9   -6.6          19.9   21.9   -2.0          16.8   21.9   -5.1

Notes: 1 Gender, age, & education weighted based on estimates from ACS (2008); 2 Figures in bold indicate a difference greater than 4.0; 3 Employment estimates from the 2007 Current Population Survey for Washington (CPS, 2007).


Other important indicators of non-response error include telephone and Internet status. These data are not available from the ACS but can be obtained from other sources. Internet and mail respondents differ with respect to telephone and Internet status, but it is less clear whether they are representative of the general population. Link et al. (2008) and Blumberg & Luke (2008) estimated that about 15-20% of U.S. households are cell-only or no-phone households, and Blumberg et al. (2009) estimate that 16.3% of households in Washington are cell-only. As shown in Table 6, 22.3% of $5 Web-only respondents live in cell-only households, compared with approximately 16.7% of $5 Mail-only respondents; $5 Web Preference respondents fall in between. Thus, WCS $5 Mail-only respondents were closest overall to the benchmark for cell-only households, while $5 Web Preference respondents over-represented cell-only households in Washington by three percentage points.

TABLE 6: WCS Respondent Cell Phone1 and Internet2 Characteristics for the Mail Preference (Mail-only) and Web Preference (Web-only and ‘Web plus Mail’) Groups

                  Mail Pref. (Mail-only)       Web Pref. (Web-only)        Web Pref. (Web + Mail)
                  Mail   2009 CDC   Diff.      Web    2009 CDC   Diff.     Total  2009 CDC   Diff.
  Cell-only       16.7     16.3     +0.4       22.3     16.3     +6.0      19.3     16.3     +3.0

                  Mail   2007 NTIA  Diff.      Web    2007 NTIA  Diff.     Total  2007 NTIA  Diff.
  Internet        83.9     72.0    +11.9       96.3     72.0    +24.3      86.2     72.0    +14.0
  % High-speed    85.9     58.0    +27.9       89.3     58.0    +31.3      85.0     58.0    +27.0

Notes: Figures in bold indicate differences greater than 4.0; 1 Data from CDC (Blumberg et al., 2009) for Washington state; 2 Data from NTIA (2007) for Washington state.

All WCS respondents also over-represent the general Washington population with Internet access in the household, and especially with high-speed access, as measured by the National Telecommunications and Information Administration (NTIA, 2007). According to the NTIA (2007), 72% of Washingtonians have Internet access at home, with 58% having broadband access. As shown in Table 6, $5 Web-only respondents, as could be expected, reported the highest level of Internet access in the home, at 96.3%, with 89.3% of these having high-speed Internet. $5 Mail-only respondents reported somewhat lower levels of household Internet access, at 83.9%, and broadband access, at 85.9%, but still over-represent households with Internet access by at least 11.9 percentage points (Mail-only) and households with high-speed access by at least 27.0 percentage points ($5 Web Preference). So even approaching potential respondents via mail resulted in a completed sample that over-represents households with Internet access.
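The percentage-point comparisons behind Table 6 amount to simple differences between the WCS group estimates and the external benchmarks. The short Python sketch below reproduces that arithmetic using the figures reported above; group labels are shortened for display, and small discrepancies with the printed table can arise from rounding.

    # Reproduces the benchmark comparison arithmetic in Table 6 using the
    # figures reported in the text (CDC cell-only and NTIA Internet benchmarks).
    wcs_estimates = {
        "Cell-only":  {"Mail-only": 16.7, "Web-only": 22.3, "Web + Mail": 19.3},
        "Internet":   {"Mail-only": 83.9, "Web-only": 96.3, "Web + Mail": 86.2},
        "High-speed": {"Mail-only": 85.9, "Web-only": 89.3, "Web + Mail": 85.0},
    }
    benchmarks = {"Cell-only": 16.3, "Internet": 72.0, "High-speed": 58.0}

    for indicator, groups in wcs_estimates.items():
        for group, estimate in groups.items():
            diff = estimate - benchmarks[indicator]
            flag = " *" if abs(diff) > 4.0 else ""   # Table 6 bolds differences > 4.0
            print(f"{indicator:10s} {group:11s} {estimate:5.1f} vs. "
                  f"{benchmarks[indicator]:5.1f} ({diff:+5.1f}){flag}")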


CHAPTER 5: DISCUSSION & CONCLUSIONS

Overall, the results suggest that using mail and web modes with the DSF and a mail contact can be a practical strategy for conducting statewide general public household surveys. However, it is also apparent that different modes and implementation procedures produce mixed results in terms of response rates and non-response error. Relative to other studies using a similar methodology (e.g., Smyth et al., in press), the total response rates in the WCS Mail and Web groups were somewhat lower. This could be due to a number of factors, such as a much larger target population and conducting the WCS during a presidential election year, but their effects on response are difficult to determine. On the other hand, and more in line with these other studies, this research demonstrates that when mail is the contact method, offering a follow-up mode, whether mail or Internet, appears to produce lower overall response rates than offering mail alone. The highest response rate in this study was obtained by a traditional mail-only method. The rates of the $5 Mail Preference groups were not significantly lower than the Mail-only group, but the web follow-up in these groups added little in terms of response rate and did not bring in different types of respondents.

Although the response rates for the WCS $5 Web Preference groups were lower overall than for the $5 Mail groups, using a mail contact to encourage people to respond over the Internet does appear promising when the Internet is offered first, followed by a mail questionnaire to those who are unwilling or unable to respond online. For example, the Internet mode used in the WCS $5 Web Preference groups obtained responses from between one-quarter and one-third of the sampled population, and these respondents comprised a majority of total respondents in each Web Preference group. The mail follow-up sent three weeks later also substantially increased response rates in each group, by at least 15 percentage points, and seems a necessary step given that 28% of the Washington population (and about 35% of the U.S. population) does not have Internet access in the household.

The $5 Web Preference groups also appeared to use the web and mail follow-up mixed-mode design effectively for reducing non-response error, although the results varied compared to the mail-only method. As in other studies, WCS web-only respondents were very different from both mail follow-up and mail-only respondents on a number of characteristics, suggesting that using either mode alone will result in higher non-response error. However, combining web and mail follow-up respondents resulted in a sample very similar to the mail-only respondents, indicating that the Internet may not add value in terms of reducing overall non-response error. Whether this is actually the case is difficult to determine.

Web-only, mail-only, and combined web and mail follow-up respondents are not consistently representative of the general population in Washington, but they come close when the data are weighted to control for gender, age, and education. This may cast some doubt on whether the response rates were “reasonable” enough, and using any of these methods will likely involve tradeoffs in reducing non-response error. However, it is apparent that the mail follow-up increased the representativeness of the Web Preference respondents to a level slightly better than the mail-only design achieved.


In addition, the inclusion of a web instruction card and of a $5 incentive produced varied results. The web card did not have the intended effect of reducing non-response or perceived complexity among web respondents: it was ineffective at steering more, or different types of, people to the Internet or away from mail, even when presented on its own in a separate mailing. Excluding the web card would, however, reduce survey costs. The $5 incentive, on the other hand, did significantly increase response rates, especially among web respondents. Neither mail nor web non-incentive respondents were very different from the corresponding incentive respondents, but the increase in the number of responses seemed worthwhile in order to keep sampling error low in estimates made about the target population.

In conclusion, the methods implemented in this study seem to be a feasible way forward for using the Internet more effectively as a survey mode. Although a mail-only design appears superior in terms of response, the Internet could provide cost and time savings compared to mail and will likely improve as Internet penetration continues to increase in the U.S.


References

Akl, Elie A., Maroun, Nancy., Klocke, Robert A., Montori, Victor & Schunemann, Holger J. (2005). “Electronic mail was not better than postal mail for surveying residents and faculty.” Journal of Clinical Epidemiology 58: 425-29.

Albrecht, Don E. (2008). Population Brief: Trends in the Western US: The State of Washington. Western Rural Development Center, US Dept. of Agriculture. http://wrdc.usu.edu/files/uploads/Population/Washington_WEB.pdf

American Community Survey. (2008). 2008 American Community Survey 1 year estimates: Washington. American Factfinder, US Census Bureau, US Dept. of Commerce. http://factfinder.census.gov/servlet/DatasetMainPageServlet?_program=ACS&_submen uId=&_lang=en&_ts=

Battaglia, Michael P., Link, Michael W., Frankel, Martin R., Osborn, Larry, & Mokdad, Ali H. (2008). “An evaluation of respondent selection methods for household mail surveys.” Public Opinion Quarterly, 72(3): 459-469.

Bech, Mickael & Kristensen, Morten Bo (2009). “Differential response rates in postal and Web-based surveys among older respondents.” Survey Research Methods 3(1): 1-6.

Biemer, Paul P. & Christ, Sharon L. (2008). “Weighting survey data,” in International Handbook of Survey Methodology, de Leeuw, Edith D., Hox, Joop J., Dillman, Don A., eds., NY: Psychology Press.

Birnholtz, Jeremy P., Horn, Daniel B., Finholt, Thomas A., & Bae, Sung Joo (2004). “The effects of cash, electronic, and paper gift certificates as respondent incentives for a web-based survey of technologically sophisticated respondents.” Social Science Computer Review 22(3): 355-62.

Blumberg, Stephen J., Luke, Julian V., Davidson, Gestur, Davern, Michael E., Yu, Tzy-Chyi, & Soderberg, Karen (2009, March). “Wireless substitution: State-level estimates from the National Health Interview Survey, January-December 2007.” National Health Statistics Reports, No. 14, U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Health Statistics. Retrieved March 11, 2009 from http://www.cdc.gov/nchs/data/nhsr/nhsr014.pdf

Blumberg, Stephen J., & Luke, Julian V. (2008). Wireless substitution: Early release of estimates from the National Health Interview Survey, January-June 2008. National Center for Health Statistics. Available from: http://www.cdc.gov/nchs/nhis.htm

Blumberg, Stephen J. & Luke, Julian V. (2007). Coverage bias in traditional telephone surveys of low-income and young adults. Public Opinion Quarterly, 71, 5, 734-749.

Blumberg, Stephen J., Luke, Julian V., Cynamon, Marcie L., & Frankel, Martin R. (2008). “Recent trends in household telephone coverage in the United States.” In J.M. Lepkowski, C. Tucker, J.M. Brick, E. de Leeuw, L. Japec, P.J. Lavrakas, et al., eds., Advances in Telephone Survey Methodology, 56-86. New York: John Wiley & Sons.


Bosnjak, Michael & Tuten, Tracy L. (2003). “Prepaid and Promised Incentives in Web Surveys: An Experiment.” Social Science Computer Review 21(2): 208-17.

Carini, Robert M., Hayek, John C., Kuh, George D., Kennedy, John M., & Ouimet, Judith A. (2003). “College student responses to web and paper surveys: Does mode matter?” Research in Higher Education 44(1): 1-19.

Church, Allan H. (1993). “Estimating the effect of incentives on mail survey response rates: A meta-analysis.” Public Opinion Quarterly, 57: 62-79.

Council of American Survey Research Organizations (CASRO). (2008). CASRO code of standards and ethics for survey research. Retrieved January 19, 2009, from http://www.casro.org/pdfs/CodeVertical-FINAL.pdf.

Couper, Mick P. (2000). “Web surveys: A review of issues and approaches.” Public Opinion Quarterly, 64: 464-494.

Couper, Mick P. & Miller, Peter V. (2008). “Web Survey Methods: Introduction.” Public Opinion Quarterly 72(5): 831-35.

Current Population Survey (CPS). (2007). CPS Table Creator for the annual social and economic supplement, 2007, Washington. US Census Bureau, US Dept. of Commerce. http://www.census.gov/hhes/www/cpstc/cps_table_creator.html

Curtin, Richard, Presser, Stanley, & Singer, Eleanor (2005). “Changes in telephone survey nonresponse over the past quarter century.” Public Opinion Quarterly, 69: 87-98.

de Leeuw, Edith D. (2005). “To mix or not to mix data collection modes in surveys.” Journal of Official Statistics, 21(2), 233-255.

de Leeuw, Edith D. & Heer, Wim (2002). “Trends in household survey nonresponse: A longitudinal and international comparison,” in Groves, R.M. et al., eds., Survey Nonresponse, NY: John Wiley & Sons, pp. 41-54.

Denscombe, Martyn. (2006). “Web-based questionnaire and the mode effect: An evaluation based on completion rates and data contents of near-identical questionnaires delivered in different modes.” Social Science Computer Review 24(2): 246-54.

Dillman, Don A., Smyth, Jolene D., & Christian, Leah Melani (2009). Internet, mail, and mixed- mode surveys: The tailored design method. Hoboken, NJ: John Wiley & Sons.

Fahimi, Mansour (2009, May). “Address-based Sampling: Merits, Design, Implementation.” Short course presented at the Annual Meeting of the American Association for Public Opinion Research, Hollywood, FL.

Fricker, Ronald D. & Schonlau, Matthias (2002). “Advantages and disadvantages of Internet research surveys: Evidence from the literature.” Field Methods 14(4): 347-67.

Gentry, Robin J., & Good, Cindy D. (2008, May). Offering respondents a choice of survey mode: Use patterns of an Internet response option in a mail survey. Paper presented at the American Association for Public Opinion Research, New Orleans, LA.


Griffin, Deborah H., Fisher, Donald P., & Morgan, Michael T. (2001, May). Testing an Internet response option for the American Community Survey. Paper presented at the American Association for Public Opinion Research. Montreal, Quebec, Canada.

Grigorian, Karen, & Hoffer, Thomas B. (2008, March). 2006 Survey of Doctoral Recipients mode assignment analysis report. Prepared for the National Science Foundation by the National Opinion Research Center.

Horrigan, John B. (2008). Home broadband adoption 2008: Adoption stalls for low-income Americans even as many broadband users opt for premium services that give them more speed. Retrieved December 10, 2008 from http://www.pewInternet.org/pdfs/PIP_Broadband_2008.pdf

Iannacchione, Vincent G., Staab, Jennifer M., & Redden, David T. (2003). Evaluating the use of residential mailing addresses in a metropolitan household survey. Public Opinion Quarterly, 67, 202–210.

Israel, Glenn D. (2009, February). Obtaining Responses from Extension Clients: Exploring Web and Mail Survey Options. Paper presented at the annual meeting of the Southern Rural Sociological Association, Atlanta, GA.

James, Jeannine M. & Bolstein, Richard (1990). “The effect of monetary incentive and follow-up mailings on the response rate and response quality in mail surveys.” Public Opinion Quarterly, 54: 346-361.

Kaplowitz, Michael D., Hadlock, Timothy D., & Levine, Ralph (2004). “A Comparison of Web and Mail Survey Response Rates.” Public Opinion Quarterly 68(1): 94-101.

Keeter, Scott, Kennedy, Courtney, Clark, April, Tompson, Trevor, & Mokrzycki, Mike. (2007). “What’s missing from national landline RDD surveys? The impact of the growing cell-only population.” Public Opinion Quarterly 71(5): 772-792.

Kish, Leslie (1965). Survey Sampling. New York: John Wiley.

Lavrakas, Paul J., Steeh, Charlotte, Blumberg, Stephen, Boyle, John, Brick, Michael, Callegaro, Mario, et al. (AAPOR Cell-Phone Task Force) (2008). AAPOR cell phone task force guidelines and considerations for survey researchers when planning and conducting RDD and other telephone surveys in the U.S. with respondents reached via cell phone numbers. American Association for Public Opinion Research. Accessed March 30, 2008 from http://www.aapor.org.

Lesser, Virginia (2009, May). “Evaluating Response Quality in a Study Using Random Digit Dialing, Mail and Web Using the Postal Delivery Sequence File.” Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Hollywood, FL.

Lesser, Virginia, Dillman, Don A., Lorenz, Frederick O., Carlson, John, & Brown, T.L. (1999, August). “The influence of financial incentives on mail questionnaire response rates.” Paper presented at the meeting of the Rural Sociological Society, Portland, Oregon.


Link, Michael W., Battaglia, Michael P., Frankel, Martin R., Osborn, Larry, & Mokdad, Ali H. (2008). “Comparison of address-based sampling (ABS) versus random-digit dialing (RDD) for general population surveys.” Public Opinion Quarterly, 72(1), 6-27.

Link, Michael W., & Mokdad, Ali (2006). “Can web and mail survey modes improve participation in an RDD-based National Health Surveillance?” Journal of Official Statistics, 22(2), 293-312.

Link, Michael W., Battaglia, Michael, Giambo, P., Frankel, Martin R., & Mokdad, Ali H. (2005a). “Assessment of address frame replacements for RDD sample frames.” Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Miami Beach, FL.

Link, Michael W. & Mokdad, Ali H. (2005b). “Use of prenotification letters: An assessment of benefits, costs, and data quality.” Public Opinion Quarterly, 69: 572-587.

Manfreda, Katja Lozar, Bosnjak, Michael, Berzelak, Jernej, Haas, Iris, & Vehovar, Vasja (2006). “Web surveys versus other survey modes: A meta-analysis comparing response rates.” International Journal of Market Research, 50(1): 79-104.

Messer, Benjamin L. (2009). “Improving survey response in mail and Internet general public surveys using address-based sampling and mail contact procedures.” Unpublished Master’s thesis. Washington State University. http://www.dissertations.wsu.edu/Thesis/Spring2009/B_Messer_040309.pdf

National Telecommunications and Information Administration (NTIA). (2007). “Households using the Internet in and outside the home, by selected characteristics: Total, urban, rural, principal city, 2007.” Retrieved February 5, 2009. http://www.ntia.doc.gov/reports/2008/table_householdinternet2007.pdf

O’Muircheartaigh, Colm A., English, Edward M., & Eckman, Stephanie (2007, May). Predicting the relative quality of alternative sampling frames. Paper presented at the annual conference of the American Association for Public Opinion Research, Anaheim, CA.

Rookey, Bryan D., Hanway, Steve, & Dillman, Don A. (2008). “Does a probability-based household panel benefit from assignment to postal response as an alternative to Internet-only?” Public Opinion Quarterly, 72(5): 962-984.

Ryu, Erica, Couper, Mick P., & Marans, Robert W. (2006). “Survey Incentives: Cash vs. in-kind; Face-to-face vs. mail; Response rate vs. nonresponse error.” International Journal of Public Opinion Research, 18(1): 89-106.

Sax, Linda J., Gilmartin, Shannon K., & Bryant, Alyssa N. (2003). “Assessing response rates and nonresponse bias in web and paper surveys.” Research in Higher Education 44(4): 409-32.

Schonlau, Matthias, Asch, Beth J., & Du, Can (2003). “Web Surveys as Part of a Mixed Mode Strategy for Populations that cannot be Contacted by E-Mail.” Social Science Computer Review 21: 218-222.


Schonlau, Matthias, Fricker, Ronald, & Elliott, Mark (2002). Conducting research surveys via e- mail and the web. Santa Monica, CA: RAND.

Schwartz, Barry (2004). The Paradox of Choice: Why More is Less. NY: HarperCollins Publishers, Inc.

Singer, Eleanor (2002). “The use of incentives to reduce nonresponse in household surveys,” in Survey Nonresponse, Groves, R.M., Dillman, D.A., Eltinge, J.L., & Little, R.J.A. (eds.), NY: John Wiley Co., pp. 163-178 (Chapter 11).

Smyth, Jolene D., Dillman, Don A., Christian, Leah Melani, & O’Neill, Allison C. (In Press). “Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century.” American Behavioral Scientist.

Staab, Jennifer M. & Iannacchione, Vincent G. (2004). “Evaluating the use of residential addresses in a national household survey.” Proceedings of the American Statistical Association, Survey Methodology Section (CD-ROM), 4028-33. Alexandria, VA: American Statistical Association.

Steeh, Charlotte (2008). “Telephone surveys.” In, International Handbook of Survey Methodology, de Leeuw, Edith D., Hox, Joop J., Dillman, Don A., eds. New York: Taylor & Francis Group.

Steeh, Charlotte, Kirgis, Nicole, Cannon, Brian, & DeWitt, Jeff (2001). “Are they really as bad as they seem? Nonresponse rates at the end of the twentieth century.” Journal of Official Statistics, 17: 227-247.

Stern, Michael J. & Dillman, Don A. (2006). “Community Participation, Social Ties, and Use of the Internet.” City & Community 5(4): 409-24.

Steve, Ken, Dally, Gail, Lavrakas, Paul J., Yancey, Tracie, & Kulp, Dale (2007, May). R&D studies to replace the RDD-frame with an ABS-frame. Paper presented at the annual conference of the American Association for Public Opinion Research, Anaheim, CA.

Zhang, Chan, Callegaro, Mario, & Thomas, Melanie (2008, November). More than the digital divide? Investigating the differences between Internet and non-Internet users on attitudes and behaviors. Paper presented at the Midwest Association for Public Opinion Research conference, Chicago, IL.


Appendix I

Washington Community Survey Contact Letters


FIRST CONTACT LETTER Same for All Groups 1 - 9

Month Day, 2008

«CITY» Area Resident «ADDRESS» «UNIT» «CITY», «STATE» «ZIP»«dash»«ZIP4»

Dear «CITY» Area Resident,

In the next few days you will receive a request to participate in an important study being conducted by Washington State University to understand how well communities throughout the state of Washington are meeting the needs of their residents. Your participation will simply require answering a few questions about the quality of life and other important issues in the Washington community where you live.

We would like to do everything we can to make it easy and enjoyable for you to participate in the study. I am writing in advance because many people like to know ahead of time that they will be asked to fill out a questionnaire. This research can only be successful with the generous help of people like you.

To say thanks, you will receive a small token of appreciation with the request to participate. I hope you will take 10-15 minutes of your time to help us. Most of all, I hope that you enjoy the questionnaire and the opportunity to voice your thoughts and opinions about the Washington community where you live.

Best Wishes,

Don A. Dillman Regents Professor and Deputy Director


SECOND CONTACT LETTER Same for Mail Groups 1-3

Month Day, 2008

«CITY» Area Resident «ADDRESS» «CITY», «STATE» «ZIP»«dash»«ZIP4»

Dear «CITY» Area Resident,

I am writing to ask for your help in gaining a better understanding of how communities across the state of Washington are providing for the people who live in them. We are especially interested in what people think about important issues in their community, from the availability of jobs and needed services to the use of newer technologies such as cell phones. Most importantly, we want to understand how these issues affect the quality of life in different types of Washington communities.

The best way we have of learning about these issues is by asking all different kinds of people throughout the state of Washington, from towns and cities both large and small, to share their thoughts and opinions. Your address is one of only a small number from each of the 39 counties in Washington that have been randomly selected to help in this study.

To help us make sure we hear from all different types of people who live in Washington communities, please have the adult (age 18 or over) in your household who has had the most recent birthday be the one to complete the enclosed questionnaire.

The questions should only take about 15 minutes to complete. Your responses are voluntary and will be kept confidential. Your names are not on our mailing list, and no one’s answers will ever be associated with the mailing address. If you have any questions about this survey of Washington communities, please call Thom Allen, the study director, by telephone at 1-800-833-0867 or by email at [email protected].

By taking a few minutes to share your thoughts and opinions about life in the Washington community where you live, you will be helping us out a great deal. We expect that the results of this study will be useful in helping state and community leaders throughout Washington better understand how different aspects of communities contribute to providing a good quality of life for residents.

I hope you enjoy completing the questionnaire and look forward to receiving your responses.

Many Thanks,

Don A. Dillman Regents Professor and Deputy Director


SECOND CONTACT LETTER Same for Web Preference Groups 4 - 8

Month Day, 2008

Area Resident

,

Dear Area Resident,

I am writing to ask for your help in gaining a better understanding of how communities across the state of Washington are providing for the people who live in them. We are especially interested in what people think about important issues in their community, from the availability of jobs and needed services to the use of newer technologies such as cell phones. Most importantly, we want to understand how these issues affect the quality of life in different types of Washington communities.

The best way we have of learning about these issues is by asking all different kinds of people throughout the state of Washington, from cities and towns both large and small, to share their thoughts and opinions. Your address is one of only a small number from each of the 39 counties in Washington that have been randomly selected to help in this study.

To help us make sure we hear from all different types of people who live in Washington communities, please have the adult (age 18 or over) in your household who has had the most recent birthday be the one to complete the questionnaire.

We are hoping that you will be able to complete the questionnaire on the Internet. Doing that is easy: just enter this web page address in your Internet browser, and then type in your access code to begin the survey. To help, we have also enclosed step-by-step instructions that show examples of the questions included in the survey.

http://opinion.wsu.edu/Washington

Your access code:

We realize that some households may not have Internet access. If this is the case for you, in about three weeks we will send a paper version of the questionnaire for you to fill out and mail back to us. If you prefer to receive one sooner, please let us know and it will be sent to you shortly.

The questions should only take about 15 minutes to complete. Your responses are voluntary and will be kept confidential. Your names are not on our mailing list, and no one’s answers will ever be associated with the mailing address. If you have any questions about this survey of Washington communities, please contact Thom Allen by phone at 1-800-833-0867 or by email at [email protected].

By taking a few minutes to share your thoughts and opinions about life in the Washington community where you live, you will be helping us out a great deal, and a small token of appreciation is enclosed as a way of saying thank you. We expect that the results of this study will be useful in helping state and community leaders throughout Washington better understand how different aspects of communities contribute to providing a good quality of life for residents.

I hope you enjoy completing the questionnaire and look forward to receiving your responses.

Many Thanks,

Don A. Dillman Regents Professor and Deputy Director


SECOND CONTACT LETTER Group 9: Mail Preference + $5

MONTH DAY, 2008

Area Resident

,

Dear Area Resident,

I am writing to ask for your help in gaining a better understanding of how communities across the state of Washington are providing for the people who live in them. We are especially interested in what people think about important issues in their community, from the availability of jobs and needed services to the use of newer technologies such as cell phones. Most importantly, we want to understand how these issues affect the quality of life in different types of Washington communities.

The best way we have of learning about these issues is by asking all different kinds of people throughout the state of Washington, from cities and towns both large and small, to share their thoughts and opinions. Your address is one of only a small number from each of the 39 counties in Washington that have been randomly selected to help in this study.

To help us make sure we hear from all different types of people who live in the area, please have the adult (age 18 or over) in your household who has had the most recent birthday be the one to complete the questionnaire.

To make it easier for people we are providing two ways to respond, both of which include the same exact questions. One way is to simply fill out the questionnaire and return it in the enclosed envelope. The other is to answer the questions on the Internet. Doing that is easy: just enter this web page address in your Internet browser, and then type in your access code to begin the survey.

http://opinion.wsu.edu/Washington

Your access code:

We hope that giving a choice between paper and web makes it more convenient for you. It is entirely up to you which way you respond. If you choose to respond by web, we have also enclosed step-by-step instructions that show examples of the questions included in the survey.

The questions should only take about 15 minutes to complete. Your responses are voluntary and will be kept confidential. Your names are not on our mailing list, and no one’s answers will ever be associated with the mailing address. If you have any questions about this survey of Washington communities, please contact Thom Allen by phone at 1-800-833-0867 or by email at [email protected].

By taking a few minutes to share your thoughts and opinions about life in the Washington community where you live, you will be helping us out a great deal, and a small token of appreciation is enclosed as a way of saying thank you. We expect that the results of this study will be useful in helping state and community leaders throughout Washington better understand how different aspects of communities contribute to providing a good quality of life for residents.

I hope you enjoy completing the questionnaire and look forward to receiving your responses.

Many Thanks,

Don A. Dillman Regents Professor and Deputy Director


THIRD CONTACT LETTER Same for Mail Groups 1 – 3

MONTH DAY, 2008

Last week a questionnaire was mailed to you because your household was randomly selected to help in an important study about the quality of life in the Washington community where you live.

If someone at your address has already completed and returned the questionnaire, please accept our sincere thanks. If not, please have the adult in your household who has had the most recent birthday do so right away. We are especially grateful for your help with this important study.

If you did not receive a questionnaire, or if it was misplaced, please call us at 1-800-833-0867 and we will get another in the mail to you.

Sincerely,

Don A. Dillman, Regents Professor and Deputy Director


THIRD CONTACT LETTER Same for Web Preference Groups 4 – 7

MONTH DAY, 2008

Last week a letter was mailed to you because your household was randomly selected to help in an important study about the quality of life in the Washington community where you live.

If someone at your address has already completed the questionnaire, please accept our sincere thanks. If not, please have the adult in your household who has had the most recent birthday do so right away. We are especially grateful for your help with this important study.

To complete the survey enter this web page address in your internet browser and then type in your access code to begin answering questions.

http://opinion.wsu.edu/Washington Your access code:

If you do not have internet access, we still very much want to hear from you. A paper questionnaire will be sent to you in about two weeks.

Sincerely,

Don A. Dillman, Regents Professor and Deputy Director


THIRD CONTACT LETTER Same for Web Preference Group 8 & Mail Preference Group 9

MONTH DAY, 2008

Area Resident

,

Dear Area Resident,

Last week a letter was mailed to you because your household was randomly selected to help in an important study about the quality of life in the Washington community where you live.

If someone at your address has already completed the questionnaire, please accept our sincere thanks. If not, please have the adult in your household who has had the most recent birthday do so right away. We are especially grateful for your help with this important study.

To complete the survey please follow the simple step-by-step instructions on the attached card.

If you do not have internet access, we still very much want to hear from you. A paper questionnaire will be sent to you in about two weeks, or you can call 1-800-###-#### and we will send you one shortly. Thank you for your patience.

Many Thanks,

Don A. Dillman Regents Professor and Deputy Director


FOURTH CONTACT LETTER Mail-only Group 1

MONTH DAY, 2008

Area Resident

,

Dear Area Resident,

Recently, we sent a questionnaire to your household that asked for your views about the Washington community where you live. To the best of our knowledge, it has not yet been returned.

We are writing again because of the importance that each household’s questionnaire has for assuring the results of the survey are representative of people and communities throughout the state. This survey of a small number of households from all counties in Washington is being done in hopes of gaining a better understanding of how well communities of all sizes and locations are doing in meeting the needs of residents. Because of how people’s situations are different, it is only by hearing from all the households in our random sample that we can be sure the results are accurate.

We hope the adult in your household who has had the most recent birthday will fill out and return the questionnaire soon. But, if for any reason this person prefers not to answer it, please let us know by returning a note or blank questionnaire in the enclosed stamped envelope.

We also want to assure you that we make every effort to uphold the confidentiality of your answers to the survey. As a standard procedure, a questionnaire identification number is printed on the back cover of the questionnaire so that we can check your address off of the mailing list when it is returned. The list of addresses is then destroyed so that individual addresses can never be connected to the results in any way. Protecting the confidentiality of people’s answers is very important to us.

We hope that you enjoy the questionnaire.

Sincerely,

Don A. Dillman Regents Professor and Deputy Director

This study is sponsored by the Social and Economic Sciences Research Center and the Department of Community and Rural Sociology at Washington State University. If you have questions about the study please contact Thom Allen, the study director, at 1-800-833-0867 or by email at [email protected].


FOURTH CONTACT LETTER Same for Mail Preference Groups 2 & 3

MONTH DAY, 2008

Area Resident

,

Dear Area Resident,

Recently, we sent a questionnaire to your household that asked for your views about the Washington community where you live. To the best of our knowledge, it has not yet been returned.

We are writing again because of the importance that each household’s questionnaire has for assuring the results of the survey are representative of people and communities throughout the state. This survey of a small number of households from all counties in Washington is being done in hopes of gaining a better understanding of how well communities of all sizes and locations are doing in meeting the needs of residents. Because of how people’s situations are different, it is only by hearing from all the households in our random sample that we can be sure the results are accurate.

We hope the adult in your household who has had the most recent birthday will fill out and return the questionnaire soon. But, if for any reason this person prefers not to answer it, please let us know by returning a note or blank questionnaire in the enclosed stamped envelope.

We have also made the questionnaire available on the Internet in case you prefer to fill it out that way. Doing this is easy: just enter this web page address in your Internet browser, and then type in your access code to begin the survey. We have enclosed step-by-step instructions to help with this.

http://opinion.wsu.edu/Washington

Your access code:

We want to assure you that we make every effort to uphold the confidentiality of your answers to the survey. As a standard procedure, a questionnaire identification number is printed on the back cover of the questionnaire so that we can check your address off of the mailing list when it is returned. The list of addresses is then destroyed so that individual addresses can never be connected to the results in any way. Protecting the confidentiality of people’s answers is very important to us.

We hope that you enjoy the questionnaire.

Sincerely,

Don A. Dillman Regents Professor and Deputy Director

This study is sponsored by the Social and Economic Sciences Research Center and the Department of Community and Rural Sociology at Washington State University. If you have questions about the study please contact Thom Allen, the study director, at 1-800-833-0867 or by email at [email protected].


FOURTH CONTACT LETTER Same for Web Preference Groups 4 – 8 & Mail Preference Group 9

MONTH DAY, 2008

Area Resident

,

Dear Area Resident,

Recently, we sent a letter to your household inviting you to answer an Internet survey about the Washington community where you live. To the best of our knowledge, it has not yet been completed.

We are writing again because of the importance that each household’s questionnaire has for assuring the results of the survey are representative of people and communities throughout the state. This survey of a small number of households from all counties in Washington is being done in hopes of gaining a better understanding of how well communities of all sizes and locations are doing in meeting the needs of residents. Because of how people’s situations are different, it is only by hearing from all the households in our random sample that we can be sure the results are accurate.

We hope the adult in your household who has had the most recent birthday will fill out the questionnaire soon and that this person will be able to complete the questionnaire on the Internet. Doing that is easy: just enter this web page address in your Internet browser, and then type in your access code to begin the survey. We have enclosed step-by-step instructions to help with this.

http://opinion.wsu.edu/Washington

Your access code:

We have included a paper copy of the questionnaire for those who are unable to complete it on the Internet. If for any reason you prefer not to answer either the paper or Internet questionnaire, please let us know by returning a note or blank questionnaire in the enclosed stamped envelope.

We also want to assure you that we make every effort to uphold the confidentiality of your answers to the survey. As a standard procedure, a questionnaire identification number is printed on the back cover of the questionnaire so that we can check your address off of the mailing list when it is returned. The list of addresses is then destroyed so that individual addresses can never be connected to the results in any way. Protecting the confidentiality of people’s answers is very important to us.

We hope that you enjoy the questionnaire.

Sincerely,

Don A. Dillman Regents Professor and Deputy Director

This study is sponsored by the Social and Economic Sciences Research Center and the Department of Community and Rural Sociology at Washington State University. If you have questions about the study please contact Thom Allen, the study director, at 1-800-833-0867 or by email at [email protected].


Appendix II

Washington Community Survey Questionnaire and Web Instruction Card


WEB INSTRUCTION CARD Same for Web Preference Groups 5 – 8 and Mail Preference Group 9*

*Respondents in Web Preference Group 8 and Mail Preference Group 9 received a web card with their passcodes printed in the space below “Step 2”.


QUESTIONNAIRE Same for All Groups 1 – 9
