Questionnaire Design: Asking Questions with a Purpose
Paul Pope Extension Program Specialist—Evaluation
Chris Boleman Assistant Professor and Extension Specialist
Scott Cummings Assistant Department Head and Program Leader for Extension Education
Texas Cooperative Extension The Texas A&M University System
Updated and revised from the original Questionnaire Design: Asking Questions with a Purpose, developed by Mary G. Marshall

Contents
Introduction...... 4
Questionnaires: Part of the Survey Process (The Big Picture) ...... 5
Using a Questionnaire: Pros and Cons...... 6
Modes of Delivery ...... 7
Constructing a Questionnaire: A Step-By-Step Approach ...... 8
Starting Point Guidelines ...... 9
Developing the Program Impact Section ...... 11
Developing the Program Improvement Section ...... 14
Developing the Demographic Section ...... 15
Appendix A: Wording the Question ...... 20
Appendix B: Types of Questions ...... 24
Appendix C: Rating Scales ...... 29
Appendix D: Formatting the Questionnaire ...... 32
Appendix E: Questionnaire Examples ...... 34

Introduction
Questionnaires defined

For many activities in Cooperative Extension, asking questions of program participants is the primary source of information for evaluation purposes. Questionnaires provide a means of asking questions of participants in an efficient, systematic and consistent manner.

A questionnaire is a tool for getting similar information from people that can be tabulated and discussed numerically. Specifically, a questionnaire is a document that asks the same questions of all individuals in a sample or population.

Note that a questionnaire is not defined by a particular mode of delivery, although it is most often associated with a paper document such as a mail survey. Questionnaires can be delivered in a variety of ways—mail, phone, Web, in-person, etc.—and may be self-administered or interviewer-administered. The information gathered from questionnaires may be quantitative or qualitative.

Questionnaires and evaluation in Extension

A questionnaire is one of many data collection methods available to Extension faculty. Other methods include direct observation, individual measurement, testing, focus groups and case studies. However, questionnaires have been a major source of information for evaluation purposes and will continue to be so into the foreseeable future. Questionnaires have proved to be an efficient—and often the only—means of gathering evidence on program impact, gaining insight into program improvement and knowing more about the target audience.

In Extension, most questionnaires are self-administered (completed by program participants using written instructions and/or minimal verbal instructions). They are usually completed in person at the conclusion of a program or delivered to participants via mail after a program concludes.

Participants control the data collection process in two ways:
• They decide which questions they will answer or not answer.
• They record a written response (as opposed to an interviewer recording it) to each questionnaire item.

This publication

Developing a sound questionnaire is not quick or easy. It takes time and focused attention—typically involving forethought, peer review, several drafts and pretests before a final version emerges. Although the process of developing a questionnaire does not fit into a step-by-step approach as neatly as other processes do, this publication attempts to present it as such—with some general guidelines to help you along the way.

The concepts and guidelines in this publication apply to all questionnaires—regardless of how they are delivered.
Questionnaire Design: Part of the Survey Process
The Big Picture

It is helpful to put questionnaire design in context. Questionnaire design is a small part of the overall survey process that may be used in an evaluation. As part of this process, questionnaires are sometimes interchangeably referred to as survey instruments.

The figure below presents an overview of the survey process and includes elements of sampling, data collection, data processing and data analysis.
Using a Questionnaire: Pros and Cons
Like all data collection methods, questionnaires have both advantages and disadvantages.

Advantages

Questionnaires provide an efficient way to collect data. Generally, it takes less time and resources to develop and administer a questionnaire than it would to directly observe or interview participants individually. Questionnaires become even more efficient as the number of potential respondents increases.

For evaluations involving many participants (1,000 or more), a questionnaire used in conjunction with sampling is often the most practical and efficient way to collect data.

Questionnaires can be used with any of the three primary evaluation strategies: pre-post, retrospective post or post only.

Questionnaires can be administered to allow participants to provide responses confidentially or anonymously. Participants feel more secure in providing frank and complete information if they are not face to face with an interviewer or if the responses cannot easily be linked back to them.

Certain types of questionnaires (such as mail or Web) allow participants to respond at their own convenience—within a data collection period that typically lasts several weeks or months.

Disadvantages

Questionnaires are limited in the number and scope of questions they can ask. They cannot easily or spontaneously probe deeper into a participant's opinions or experiences on a specific topic as can an in-person interviewer. Space considerations and the risk of survey fatigue tend to limit branching opportunities (asking a more specific set of questions based on a previous response).

Questionnaires rely on the ability of participants to clearly comprehend what is being asked of them, their willingness to answer, their willingness to consider and reflect, and their honesty and accuracy of recall in providing responses. Failure on any of these counts may result in no response or inaccurate responses.
Modes of Delivery
As mentioned earlier, questionnaires can be administered in a variety of ways:
• Postal (traditional mail)
• Phone
• Web
• E-mail
• In-person
• Mixed mode (using more than one method to deliver a questionnaire)

There are several factors to consider in deciding on a mode of delivery, including anticipated response rates, ability to follow up or probe, speed of data collection and availability of a sampling frame.

If you need the data quickly, and most or all participants have access to the Internet, delivering a questionnaire via the Web can be an effective means of collecting data in a short period.

Be aware that the mode of delivery you select may affect questionnaire design to some extent. For example, if the questionnaire is administered by phone, it may need to have fewer and more concise response categories than those on a written questionnaire.

Mixed mode

Mixed mode delivery of questionnaires is becoming more popular as a means of increasing response rates. One form of mixed mode delivery involves sending a questionnaire to participants in multiple ways. For example, some participants may be mailed a questionnaire, whereas other participants may be administered the questionnaire by phone or sent an invitation via e-mail to complete a Web-based version of the questionnaire.

Another form of mixed mode delivery involves giving participants options in completing a questionnaire. For example, a survey packet may be mailed to program participants with instructions that allow them to complete and return the paper questionnaire or to access and complete the same questionnaire on the Web.

If you use mixed mode delivery, be sure that the wording is consistent (or at least comparable) between modes so that the results from all modes can be tabulated and analyzed together in a valid and meaningful way.
Although mixed mode delivery is becoming more common, most questionnaires in Extension are still delivered in a single mode—in person, by mail or via the Web.
Constructing a Questionnaire: A Step-By-Step Approach
Below is a step-by-step process for developing a questionnaire. The guidelines in this process are sound practices that apply to any questionnaire. The three suggested sections of a questionnaire—Steps 4, 5 and 6—should be adequate for most evaluations.

Step 1. Review and adhere to the Starting Point Guidelines (next section). This involves thinking about the questionnaire before you write questions. This is a valuable first step that will make the remainder of the process easier. Some aspects of this step are particularly critical:
• Deciding what you need or want to know
• Knowing the desired client change of your program
• Knowing the outcome indicators for your program
• Knowing the evaluation strategy you will be employing

Step 2. Review the guidelines in Appendix A: Wording the Question to learn about proper phrasing and how to avoid common mistakes.

Step 3. Review the guidelines in Appendix B: Types of Questions to learn about the different question structures available. The types of analysis that need to be done (and the level of data they require—nominal, ordinal, interval) dictate the structure of questions. Frequencies and percentages are available with any question structure.

Step 4. Write a set of draft questions for the program impact section of the questionnaire (page 11).

Step 5. Write a set of draft questions for the program improvement section of the questionnaire (page 14).

Step 6. Write a set of draft questions for the demographic section of the questionnaire (page 15).

Step 7. Review the three sections to again ensure that the questions adhere to the guidelines specified in Appendix A: Wording the Question.

Step 8. Review the three sections to again ensure that the questions adhere to the guidelines specified in Appendix B: Types of Questions.

Step 9. Review the questionnaire format and modify it to adhere to the guidelines specified in Appendix D: Formatting the Questionnaire.

Step 10. Have a colleague review the questionnaire. Make any modifications.

Step 11. Pretest the questionnaire with staff, friends or clients. Make final modifications.
Starting Point Guidelines
The starting point in constructing a questionnaire is to know what kind of evidence is needed to meet the purpose of the study and how the information will be used. Before you actually start developing questions, review and follow these guidelines:

• Make a list of what you want to know. What do you really want to find out? What do you want to be able to say or demonstrate? What are your outcome indicators? How can the questionnaire be used to measure these?

• From the beginning, think through what you will actually do with each piece of information. This affects how questions are structured. A good example is whether to ask for a participant's age using categories or to ask for a specific age. If frequencies and percentages are sufficient for your evaluation purposes, then using age categories is sufficient. On the other hand, if you need a mean (average) age for evaluation purposes or want to run regression to demonstrate a relationship between age and another variable, then it is better to ask for a specific age or year of birth than to use categories.
In general, if you need or want to run analysis beyond frequencies and percentages, you should be familiar with the concept of "levels of data"—that is, whether data are nominal, ordinal or interval. Each statistical analysis procedure requires a specific level of data.

• Keep in mind the type(s) of client change your program aims to achieve. Is it knowledge gain, skills acquisition, behavior change or something else? This will help you focus on the type of questions to ask. Types of client change are discussed in greater detail in the next section.

• Know the outcome indicators you specified as part of your outcome plan. As part of the program development (planning) process, you may have listed some specific indicators of program success or outcome attainment. If so, think about how items on a questionnaire might be used as your outcome indicators. For example, one outcome indicator of a successful food safety program might be for participants to always wash raw fruits and vegetables before eating them. Asking participants about that on a questionnaire can serve as the outcome indicator.

• Know the type of evaluation strategy you will be using (pre-post, retrospective post or post-only) and whether your post measure will be at the conclusion of the event (immediate post) or some period of time after the event concludes (longer-term post). To some extent, this will determine the types of questions you ask in the program impact section of the questionnaire.

• It is important to know the target audience. In writing questions, look through the participants' eyes:
Will they understand the question?
Will the question be seen as reasonable?
Will the question infringe on the participant's privacy?
Will participants be willing and able to tell you what you want to know?

• Keep the information you are collecting as short and directed as possible. Include a question only if it relates directly to the purpose of the study. Do not ask questions you do not plan to use. Eliminate "nice to know" items.

• Questionnaires used for an evaluation typically have three sections: program impact, program improvement and attribute information.
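The age example above can be sketched in code. The snippet below is an illustrative sketch with hypothetical responses (not data from this publication): categorical age answers support only frequencies and percentages, while exact ages also support a mean (and could feed a regression).

```python
from collections import Counter

# Hypothetical example: five participants (assumed data for illustration).

# If age is collected in categories (ordinal data), frequencies and
# percentages are the appropriate summaries.
category_responses = ["25-34", "35-44", "35-44", "45-54", "25-34"]
counts = Counter(category_responses)
for category, n in sorted(counts.items()):
    print(f"{category}: {n} ({100 * n / len(category_responses):.0f}%)")

# If exact age is collected (interval data), a mean is also possible.
exact_ages = [28, 33, 41, 44, 52]
mean_age = sum(exact_ages) / len(exact_ages)
print(f"Mean age: {mean_age:.1f}")
```

The same trade-off applies to any attribute: the coarser the response categories, the fewer statistical procedures the resulting data can support.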
The program impact section includes questions that measure client change or other types of impact. It may also include questions related to customer satisfaction.

The program improvement section typically asks participants what they liked about the program, what they disliked and what suggestions they have for improvement.

The attribute information or participant background section asks about the participants themselves (who they are, how much of something they have, etc.). A profile of participants can be built with this information.

Each of these sections will be covered in greater detail.

In summary, be selective, be realistic and keep it simple. Know what information is needed, why it is wanted and how you will use it. The information you obtain is only as good as the questions you ask. So take great care in asking the right questions and structuring them appropriately!
Developing the Program Impact Section
(1st Section of a Questionnaire)

The program impact section of the questionnaire is used to collect information for demonstrating the impact of your program. This relates to "client change" or "evidence of impact." Extension outcome programs should address one or more levels of client change. They are defined below, each with example questions. Think about the level(s) of client change your program will address.

Learning—knowledge: What participants know; learned information; accepted advice; how well they understand or comprehend something.

Questions that measure knowledge gain ask what participants know, are aware of or understand. There are two approaches to measuring knowledge.

The first approach is preferred because it demonstrates impact more strongly. It involves developing what essentially looks and functions like a traditional test of knowledge in a school setting. Response categories using this approach include true/false, multiple choice and fill in the correct response. There are right and wrong answers associated with each question. Some examples include:

The most effective weight loss program includes exercise. (True or False)

Broccoli is particularly good for:
a) your muscles
b) your eyes
c) your metabolism
d) your bones and teeth

The ideal refrigerator temperature is ______.

The second approach is weaker because it relies on self-perception; it makes the most sense for a retrospective post evaluation strategy. Participants are not tested on actual knowledge. Instead, participants express what they perceive to be their own level of understanding before and after the program (with any increase being an indicator of knowledge gain).

Specifically, at the end of the program, participants are asked about their level of understanding related to the topics covered, before vs. after the program. An example:
Please circle the number (using the scale below) for each statement that best describes your level of knowledge before and then after the Regional Cotton Conference.
1= Very Low 2 = Low 3 = Moderate 4 = High 5 = Very High
Statements

What is your level of knowledge pertaining to White Weed and Standard Weed Control in Conventional Cotton?
Before: 1 2 3 4 5    After: 1 2 3 4 5

What is your level of knowledge pertaining to The Cotton Market Outlook?
Before: 1 2 3 4 5    After: 1 2 3 4 5

What is your level of understanding pertaining to New Technology and Value of Bt cotton varieties in North-Central Texas?
Before: 1 2 3 4 5    After: 1 2 3 4 5
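Responses to a before/after scale like the one above are typically summarized as mean ratings per statement, with the before-to-after difference serving as the indicator of perceived knowledge gain. A minimal sketch (the ratings below are hypothetical, not from the publication):

```python
# Hypothetical ratings (1 = Very Low ... 5 = Very High) from five
# participants for a single statement.
before = [2, 1, 3, 2, 2]
after = [4, 3, 5, 4, 4]

mean_before = sum(before) / len(before)
mean_after = sum(after) / len(after)
gain = mean_after - mean_before  # an increase suggests perceived knowledge gain

print(f"Mean before: {mean_before:.1f}")
print(f"Mean after: {mean_after:.1f}")
print(f"Gain: {gain:.1f}")
```

Note that these are ordinal self-ratings, so the mean is a common convenience summary rather than a strict interval-level measure.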
Learning—skills: An individual's mental and physical abilities to use new and/or alternative processes.

Skills acquisition can be measured with a questionnaire, but it is challenging, indirect and relies on the participant's self-perception. For example:

I learned enough at this workshop to successfully scout for the Pecan Nut Casebearer in my own orchard.
Strongly agree   Agree   Uncertain   Disagree   Strongly disagree

I am confident that I can apply the knowledge and tools I gained today to develop my own budget.
Strongly agree   Agree   Uncertain   Disagree   Strongly disagree
Skills acquisition is more accurately measured through observation or an assessment of a participant actually demonstrating a skill. The measure might be dichotomous (such as success/fail) or a skill level on a scale.

Remember, a participant can learn a skill but never use it. If the goal of your program is solely to teach a skill, then measure for that, not for behavior change.

Learning—attitudes/beliefs: How participants feel about something; what participants think is true; an opinion, preference or perspective.

Beliefs are judgments of what people think is true or false, what they think exists or does not exist. The questions may seek perceptions of past, present or future reality. This type of question is often used in needs assessments.

In your opinion, does positive self-esteem among adolescents prevent drug abuse?

Do you think that lower beef prices would increase beef consumption?

Attitude questions ask people to indicate what they value or whether they have a positive or negative feeling about a subject. Words typically used in attitude questions include: prefer/not prefer, desirable/undesirable, favor/oppose, should/should not, satisfactory/unsatisfactory.

Do you favor or oppose controlled calving for your operation?

Do you agree or disagree that eating beef causes heart disease?

Application—behavior change: Sustained change in a participant's actions or usual mode of operation.

Behavior typically changes after knowledge is gained or a skill acquired and, in light of being equipped, a new mode of operation is deemed necessary or desirable. Questions dealing with behavior change ask people about what they have done in the past, what they are doing now or what they plan to do in the future. For example:

Before this Extension event, did you treat your cotton for bollworms?

As a result of this Extension event, do you plan to treat your cotton for bollworms?

As a result of this Extension event, have you developed a budget?

Two points are important to remember in asking about behavior change. First, phrase the question such that, to the greatest extent possible, Extension education is specified as the potential cause of behavior change. A program participant may change behavior, but it may or may not be because of the Extension program. One or more intervening factors, such as reading an article in a trade publication or getting advice from a neighbor, may have played a role in convincing the participant to actually change.

To help control for this, preface the question with something similar to "As a result of the Extension-sponsored event, did you change . . ." or make it clear in the questionnaire instructions that you are asking about what happened as a result of the Extension program. The probability of intervening factors causing behavior change (rather than Extension) will be higher for some events than others.

Second, behavior change is often not immediate. It takes time. Therefore, it may be preferable in terms of measuring impact to ask about behavior change months after the conclusion of a program (longer-term post) to give participants time to adopt the desired behavior. The exception may be a program delivered over the course of several months. By the end of the program, some behavior change may be possible and expected.

If you administer an evaluation as the program concludes and behavior change is not expected at that point, asking about intent or plans to change is an option. The results will be more meaningful and favorable compared to asking about actual behavior change at that time, but not as strong as providing evidence of actual behavior change later.

Application—best practice: When a participant decides that a new practice is preferred over the current practice and adopts the new.

The wording of questions for best practices is similar to those for behavior change. The same issues (controlling for intervening factors and allowing time for adoption to occur) apply as well. Examples:

As a result of the Extension event, have you added wildlife as a component of your ranch operation?

As a result of the Extension event, have you adopted IPM as part of your operation?

As a result of the Extension workshop, which of the following practices (if any) have you implemented to help control sedimentation from your operation? (select all that apply)

Erosion and sediment controls
  Temporary seeding
  Permanent seeding
  Mulching
Structural control measures
  Earth dikes
  Silt fences
  Sediment traps
  Sediment basins
Storm water management controls
  Retention ponds
  Detention ponds
  Infiltration
  Vegetated swales and natural depressions

Application—new technology: When a participant adopts new technology as a result of an Extension program.

The wording of questions for new technology is similar to those for behavior change. The same issues (controlling for intervening factors and allowing time for adoption to occur) apply as well. Examples:

As a result of the Extension event, did you plant new transgenic corn this year?

As a result of the Extension event, have you started using yield monitors in your operation?

Summary of client change

Remember: The evidence you collect to demonstrate impact should match the program's goals for client change. For example, if the program was designed strictly to increase knowledge, then ask questions that measure knowledge gain; do not ask about behavior change.

The timing of data collection is also important. For the application-based client changes (behavior change, adoption of best practices, adoption of new technology), do not collect data until you can reasonably expect the client change to have occurred. On the other hand, asking about intentions or planned changes is an option if you conduct an immediate post-event evaluation.
Developing the Program Improvement Section
(2nd Section of a Questionnaire)
It is beneficial to devote a section of your questionnaire to program improvement to elicit what participants liked about the program, what they disliked and what suggestions they have for improvement. The goal is to use the information from this section to modify the next program and, as a result, experience greater impact and higher participant satisfaction. Program improvement questions can address any aspect of an event, including impact, satisfaction, content, delivery, logistics and future direction.

Typically, program improvement questions are structured as free-form open-ends such as:

What did you like most about this workshop?

What did you like least about this workshop?

How can we improve this workshop in the future?

What can we do differently to make this workshop better in the future?

What topics would you like the newsletter to address more often?

Is there anything else you want us to know about your experience with this event?

Program improvement questions can be closed-ended as well, such as this customer satisfaction question used as part of an output program:

Overall, how satisfied are you with this event?
Not at all
Slightly
Somewhat
Mostly
Completely
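Closed-ended items like the satisfaction question above are usually reported as frequencies and percentages per response category. A minimal sketch with hypothetical responses (assumed data, not from the publication):

```python
from collections import Counter

# Hypothetical responses to "Overall, how satisfied are you with this event?"
responses = ["Mostly", "Completely", "Somewhat", "Mostly", "Mostly",
             "Completely", "Slightly", "Mostly"]
scale = ["Not at all", "Slightly", "Somewhat", "Mostly", "Completely"]

counts = Counter(responses)
# Report in scale order so zero-count categories still appear in the table.
for category in scale:
    n = counts.get(category, 0)
    print(f"{category}: {n} ({100 * n / len(responses):.0f}%)")
```

Open-ended responses, by contrast, are typically grouped into themes by hand before any counting is attempted.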
Developing the Demographic Section
(3rd Section of a Questionnaire)
In addition to questions about program impact and improvement, questionnaires typically ask about the participants themselves—who they are and what they have. Also called "demographics" or "participant background information," attributes are a person's personal characteristics such as age, education, occupation, ethnicity, income, etc.

Knowing the attributes of program participants is important for several reasons:

• It tells you who you're reaching. Who participated in your program? Attribute information will tell you. If the program has more than one primary focus, you might ask why they participated.
A question about how participants heard about the program can help future marketing efforts. Of course, the response categories should correspond to how the program was actually marketed, along with avenues common to any event (such as word of mouth).

• It tells you to what extent you're reaching the program's target audience. Not everyone who attends your program is part of the target audience. Asking one or more questions to determine if a participant is part of the target audience will help you gauge the effectiveness of your marketing efforts. The information can help you adjust the marketing of the program or measure the program's impact more precisely. For example, you may not want to include participants outside the target audience in the calculations about impact.

• It allows you to examine differences among groups. First, you can analyze how the program is performing across various groups—has program impact been uniform, or are there differences between groups? For example, you might discover that the program has been more successful with younger participants than with older participants. Program changes to address group differences can be made if desired or required. Second, differences between groups on interests, perceptions, intentions, etc., can be uncovered, enabling more effective planning of future programs.

To minimize both survey burden and intrusion on personal lives, you should collect only relevant, useful and actionable attribute information. If it's simply "nice to know" information or you don't have plans to use it, don't ask for it.

General Guidelines

• Put attribute questions in one section at the end of the questionnaire.

• Include a short introduction to the section explaining how the information will (or will not) be used. Examples:

If anonymous (no names or other individual identifiers on the form):
Please tell us about yourself. Remember, your responses are anonymous and cannot be traced back to you. All information collected in this section will be reported in summary form and used for program improvement purposes only.

If confidential (ID allows responses to be traced to individuals but not easily):
Please tell us about yourself. Any information you provide will be held in strict confidence, reported in summary form and used for program improvement purposes only.

• Sometimes it is desirable to use response categories from other sources for comparison purposes (such as census data or previous surveys).

Specific Guidelines

The following are specific guidelines for approaching common attribute questions.
Gender/sex

Some people wonder how to word a question about whether a respondent is female or male: Should it be gender or sex? Although gender may be viewed culturally as more sensitive and is commonly used today, sex is actually the more precise term and is the wording used by the Census Bureau.

To avoid the issue altogether, consider adopting this generic wording:

You are . . .
Female
Male

Another consideration is the order of response categories. Traditionally, for maximum readability, response categories for sex were ordered to match what one was accustomed to hearing in everyday life. In the past, that has meant hearing "male" first and "female" second. However, this societal norm has dissipated over time. Generally, for sensitive questions, consider arranging the categories in alphabetical order (as above) to eliminate any implied importance of categories associated with order.

Race/ethnicity

Knowing respondents' racial and ethnic identity can be important in looking at program impact; however, asking about it can be a sensitive and delicate matter. Remember, Hispanic origin is classified as an ethnicity (not a race) by the Census Bureau so, typically, you will ask for race/ethnicity, not just race.

To avoid any issues in this area, consider adopting the following generic wording used by the Census Bureau:

How would you describe yourself? (select one or more)
Black/African American (Non-Hispanic)
Hispanic
White (Non-Hispanic)
Other

Of course, Asian, Native American and any other races/ethnicities can be separated from Other and given their own response categories (see below).

How would you describe yourself? (select one or more)
Asian/Asian American/Pacific Islander
Black/African American (Non-Hispanic)
Hispanic
Native American
White (Non-Hispanic)
Other

Note that the categories are arranged in alphabetical order to eliminate any implied importance associated with ordering. The exception is Other, which is traditionally (and logically) placed at the end.

Also note that instructions such as select one only or circle one have been replaced with select one or more. Today, many respondents want to identify themselves as belonging to two or more races and/or ethnicities.

A different approach is to ask about Hispanic origin separately from race, as done by the Census Bureau and on the 2004-2005 Texas 4-H Member Enrollment Form:

Are you of Hispanic ethnicity?
Yes
No

What is your racial group(s)? (Select all that apply)
Asian/Asian American/Pacific Islander
Black/African American
Native American
White
Other

Age

Some participants are reluctant to reveal their exact age. Reluctance often translates into a refusal to answer the question or giving an incorrect age—both adversely affect data analysis.

If you do not need an exact age for statistical analysis or program planning purposes, use
age categories, as program participants are more likely to give their correct age if it falls within a category. For example:

What is your age?
Under 25
25-34
35-44
45-54
55-64
65-74
75-84
85 and over

These eight categories work well if you have many program participants with a wide distribution of ages. Having eight categories, compared to fewer categories, has at least two advantages. First, there are many ways to equitably collapse categories—using all eight or the six "in the middle." Second, mean and median calculations using categorical data become more precise as the number of categories increases.

Another suggestion that might encourage more accurate and complete reporting of age is to include a category higher than most program participants you expect to attend—the theory being that more participants on the higher end will provide a correct age category if they do not have to select the highest (oldest) age category.

That said, select the age categories that make the most sense given your particular program needs. The categories should be equitable and have a logical basis.

Sometimes a specific age is needed for statistical analysis, precise trending or programming purposes. If so, asking "What is your age?" is the preferable (softer) phrasing compared to "How old are you?" Example:

What is your age? _____ years

Alternatively, you can ask for year born, although your age calculation will be off by one year for some respondents, depending on what month of the year you tabulate your survey data.

Do not ask for the month and/or year of birth unless it is absolutely necessary. Doing so will mean a lot of computation work for you to do later and a significant number of refusals to answer.

Income

People are especially sensitive about revealing income. Therefore, keep two things in mind as you develop income questions:

First, never ask for a specific dollar amount. Instead, always use broad income ranges as your response categories. They should be precise enough to be useful to you yet broad enough that most respondents will feel comfortable enough to provide a correct answer.

Second, phrase the income questions precisely. Exactly what do you want to know? Annual income? Monthly income? Net? Gross? Taxable? Take-home? Of the respondent only? Of the entire household?

Who is included in household income? Respondent plus spouse (if married)? Unmarried partners? Roommates? Teenage children with jobs?

Asking for the gross income of the respondent or household is probably the least problematic in terms of recall and accuracy. Most people can recall their current salary (gross income, before taxes and other deductions), although those in sales or who own businesses will have more difficulty in providing a single "annual" figure.

Which of the following best describes your annual gross household income in 2004 (before taxes and deductions)? Please include just yourself and other head of household, if applicable.

Education

For consistency, ask for the highest level of education obtained. Otherwise, respondents may select an education level that they are working on and plan to complete.
Response categories may be relatively few or numerous, depending on your particular information needs:

Example 1:
   Some high school (or less)
   High school degree or GED
   Associate or technical school degree
   4-year college degree
   Graduate or professional degree

Example 2:
   Some grade school (1-8)
   Some high school (9-12)
   High school degree or GED
   Some college or technical school
   Associate or technical degree
   Bachelor’s degree
   Master’s degree
   Ph.D.
   Professional degree (MD, DVM, JD, EdD)

Where do you live?
Here are some generic response categories to consider:
   Farm or ranch
   Rural area, not a farm or ranch
   Town of under 5,000 people
   Town or city of between 5,000 and 25,000 people
   Town or city of between 25,000 and 100,000 people
   Town or city of between 100,000 and 250,000 people
   Town or city of more than 250,000 people
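The earlier point about age categories—that mean and median estimates from categorical data become more precise as the number of categories grows—can be made concrete with a short sketch. This is an illustration only, not part of the original publication; the function name, category bounds and counts are invented. It estimates a median age from grouped responses by interpolating within the category that contains the middle respondent:

```python
# Estimate the median age from grouped (categorical) survey responses by
# linear interpolation within the interval containing the middle respondent.
# Hypothetical categories and counts, for illustration only.

def grouped_median(intervals, counts):
    """intervals: list of (low, high) bounds; counts: responses per interval."""
    n = sum(counts)
    half = n / 2
    cumulative = 0
    for (low, high), count in zip(intervals, counts):
        if cumulative + count >= half:
            # Interpolate within the interval that holds the middle respondent.
            return low + (half - cumulative) / count * (high - low)
        cumulative += count
    raise ValueError("no responses")

# Eight categories: "25-34" is treated as spanning ages 25 to 35, and so on.
intervals = [(15, 25), (25, 35), (35, 45), (45, 55),
             (55, 65), (65, 75), (75, 85), (85, 95)]
counts = [4, 10, 18, 25, 21, 12, 6, 2]

print(round(grouped_median(intervals, counts), 1))  # → 51.8
```

Python’s standard-library `statistics.median_grouped` applies the same interpolation idea. The narrower each interval, the less the interpolated estimate can drift from the true median—which is the sense in which more categories buy precision.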
Appendices
Appendix A: Wording the Questions
Question wording should strike a balance between two competing goals: collecting information in sufficient detail to answer the study questions vs. simplicity for maximum understandability and reliability. In writing questions, consider three things: the target audience for whom the questionnaire is being designed, the purpose of the questionnaire and the placement of the questions in relation to each other.

Some general guidelines include:

• Use simple wording.
Adapt the wording to the vocabulary and reading skills of people who will be asked for information, but don’t talk down to them. Do any words have double meanings, or are any words confusing?

• Be specific.
A question about older youth should specify what age or grade is considered “older.” A question such as “How many times has your 4-H club met?” does not provide a time frame as a basis for responding (such as since its inception, in the past 2 years, past 6 months, etc.). Even the question “How many times did your 4-H club meet last year?” should be more specific. Does “last year” mean 2004, the past 12 months, or a fiscal year such as September through August?

• Avoid using abbreviations, jargon or foreign phrases.
Will the respondents understand what is meant by such terms as client change, learning experiences, outcome programs, pre-post or other technical terms that professionals commonly use as shortcuts?

• Use clear wording.
Words such as “regularly” and “occasionally” mean different things to different people. Avoid such words if possible, or at least provide a definition or example of what is meant. Other examples of vague terms include: majority [more than half of what?], often [daily, twice weekly, weekly?], governmental [city, county, state, federal?], older people [how old?].

• Include all necessary information.
In some cases, the respondents may not know enough to adequately answer the question. For example: Do you agree or disagree with the county’s new environmental policy? Respondents may not know what the policy is or whether it is the most recent one. Provide a statement summarizing distinguishing points of the policy.

• Avoid questions that may be too precise.
Few people can recall exactly how many times they ate out last year or visited a particular Web site in the past 6 months. To help the respondents formulate an answer, use logical response categories, such as 0-5, 6-10, 11-15, etc. Other information, however, such as the number of irrigated acres or head of cattle owned, is likely to be known by most respondents. Knowing the target audience and having experience with previous questionnaires will help you decide whether a particular piece of information requested is too precise.

• Phrase any personal or potentially incriminating questions in less objectionable ways.
Some participants may object to questions about income level, drug use or eating habits. One way to elicit such information is to ask participants to select from among broad categories [income less than $20,000, $20,000 to $29,999, $30,000 to $39,999, $40,000 and over, etc.] instead of specifying precise information. Also, a series of questions may be used to soften or overcome the objectionable nature of certain information.

• Avoid questions that are too demanding and time consuming.
An example: Rank the following 15 items in order of their importance to you. A better approach is to ask participants to select their top five items from the list of 15 and rank those only. This will reduce the survey burden on the respondent while producing more meaningful results for you.
• Use exhaustive, mutually exclusive categories.
For questions that ask for one answer (“select one only”), response categories need to be exhaustive and mutually exclusive.

Exhaustive
Exhaustive means that all possible response categories are listed. A question with age categories provides a simple example:

Which category includes your age?
   25-34
   35-44
   45-54
   55-64

The list above is not exhaustive, as there are no accurate response categories for participants under age 25 or over age 64. Adding categories to cover the low and high ends makes the list exhaustive:

Which category includes your age?
   Under 25
   25-34
   35-44
   45-54
   55-64
   65 or over

If the list of potential response categories is long (more than 12, for instance), think about a final category called “other” or a partially closed structure with an “other, please specify” option. Or, if you want participants’ preference among just a few categories of interest to you, consider this question wording:

Of the five vegetables listed below, which one tastes best to you?
   Broccoli
   Carrots
   Corn
   Potatoes
   Spinach

Mutually exclusive
Mutually exclusive means that the categories do not overlap. Unless a question specifically allows for multiple responses, make sure that only one answer is possible. Using the age example:

Which category includes your age?
   Under 25
   25-35
   35-45
   45-55
   55-65
   65 or over

Here, participants age 35, 45, 55 or 65 have two response categories to choose from.

Another example:

How did you first hear about this Extension workshop? (select one only)
   at my Extension office
   at another Extension event
   at work
   newspaper
   on the web
   someone told me about it

If a participant heard about the workshop from a friend at work, more than one answer is possible. Even the distinction between newspaper and on the Web may be unclear for a participant who first learned about the event in an online version of the newspaper. This question asked about sources but offered location categories as well, creating the potential for overlapping responses. Questions about sources and locations should be separated, each with an exhaustive set of response categories.

• Avoid making assumptions.
A question such as “Do you prepare beef when you have friends in to eat?” makes an assumption—that the respondent has friends in to eat. Write two questions instead. Use the first question to establish the situation, followed by the question of concern. For example: “Do you ever have friends in to eat?” Yes/No [If yes, “Have you ever prepared beef for them?”]

• Avoid double questions.
Having two questions written together does not allow the participant to respond in favor of one part or the other. For example: “Did the poultry production seminar help you to identify ways to improve sanitation and increase the nutrition of your cage bird operation?” The answer may be “yes” on the first point but “no” on the second point, or vice versa. Ask about sanitation and nutrition separately.

Other double questions may be unduly ambiguous. For example: “Do you favor legalization of marijuana for use in private homes but not in public places?” Respondents have no way to say whether they favor both places, oppose both places, oppose home but favor public use, or oppose legalization as a concept in general. Ask about legalization of marijuana in the home and in public places separately.

• Avoid bias in questions.
Such questions influence people to respond in a way that does not accurately reflect their position. A question can be biased in several ways: It can imply that the respondent should be engaged in a particular behavior; it can give unequal response categories or responses that are loaded in one direction; and it can use words with strong positive or negative emotional appeal, such as bureaucratic, boss, equality, etc.

Some examples of biased questions are shown here:

1. How would you rate the housing in which you live?
      Fair
      Good
      Excellent

   No negative options, such as “poor,” are provided.

2. More farmers in Greater County are using Superb than any other variety of wheat. Do you use Superb?
      Yes
      No

   This question suggests that the respondent should be using Superb.

3. Do you agree that funding for Extension in your county should be increased?
      No
      Yes

   This is an unbalanced and leading question. A better wording:

4. Do you agree or disagree that Extension funding should be increased? (Circle one)
      Strongly agree
      Agree
      Disagree
      Strongly disagree
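The exhaustive and mutually exclusive rules for numeric response categories lend themselves to a mechanical check before a questionnaire goes to print. The sketch below is an illustration only (the helper name and example ranges are my own, and it assumes inclusive integer bounds such as whole-year ages); it flags gaps and overlaps in a list of category bounds:

```python
# Check that numeric response categories are exhaustive (no gaps across the
# expected range) and mutually exclusive (no overlapping bounds).
# Assumes inclusive integer bounds (e.g., whole-year ages).
# Illustrative helper, not from the publication.

def check_categories(ranges, expected_low, expected_high):
    """ranges: sorted list of inclusive (low, high) bounds, e.g. [(0, 24), (25, 34), ...]."""
    problems = []
    if ranges[0][0] > expected_low:
        problems.append(f"not exhaustive: no category below {ranges[0][0]}")
    if ranges[-1][1] < expected_high:
        problems.append(f"not exhaustive: no category above {ranges[-1][1]}")
    for (lo1, hi1), (lo2, hi2) in zip(ranges, ranges[1:]):
        if lo2 <= hi1:
            problems.append(f"overlap: {lo1}-{hi1} and {lo2}-{hi2}")
        elif lo2 > hi1 + 1:
            problems.append(f"gap: nothing between {hi1} and {lo2}")
    return problems

# The overlapping example from the text: ages 35, 45, 55 and 65 fit two categories.
bad = [(0, 24), (25, 35), (35, 45), (45, 55), (55, 65), (65, 120)]
good = [(0, 24), (25, 34), (35, 44), (45, 54), (55, 64), (65, 120)]

print(check_categories(bad, 0, 120))   # reports four overlaps
print(check_categories(good, 0, 120))  # []
```

A check like this catches the “25-35 / 35-45” style of overlap automatically; the judgment calls—how wide the ranges should be and where the open-ended top category starts—remain with the questionnaire designer.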
[Figure: formatting examples labeled “Poor logic,” “Poor spacing” and “Better”]