MIAMI UNIVERSITY — THE GRADUATE SCHOOL

CERTIFICATE FOR APPROVING THE DISSERTATION

We hereby approve the Dissertation

of

Andrew M. Dudas

Candidate for the Degree

Doctor of Philosophy

Dr. Philip A. Russo, Jr., Director

Dr. Ryan J. Barilleaux, Reader

Dr. Enamul H. Choudhury, Reader

Dr. John H. Benamati, Graduate School Representative

ABSTRACT

THE USE OF COMMUNITY OPINION SURVEYS IN LOCAL GOVERNMENT STRATEGIC DECISION MAKING

by Andrew M. Dudas

Local government officials oftentimes fulfill a dual role in policy making. This dual role amounts to officials acting as both policy analyst and policy-maker. Much of the policy-making done by local government officials is strategic in nature – planning for the future growth, development, and needs of the citizenry. In order to strategically plan for the future and make decisions within this framework, local government officials must have data at their disposal to help inform their decision making. In essence, local government officials then must also act as policy analysts – collecting data, analyzing it, and finally translating that data into meaningful policy decisions. This research examines one means of providing local government officials with data to help inform their decision-making – citizen surveys. Local governments often conduct surveys to collect information on a host of issues confronting their community. This research focuses on broad-based community surveys, the data that is collected, and the translation of that data into policy outcomes. Through the presentation of case illustrations, a survey typology is presented that can be used to classify individual surveys as informational, strategic, decisionistic, or symbolic based on identifiable factors in each survey and how the results were ultimately utilized by the local government.

THE USE OF COMMUNITY OPINION SURVEYS IN LOCAL GOVERNMENT STRATEGIC DECISION MAKING

A DISSERTATION

Submitted to the Faculty of Miami University in partial fulfillment of the requirements for the degree of Doctor of Philosophy

Department of Political Science

by

Andrew M. Dudas

Miami University

Oxford, Ohio

2005

Dissertation Director: Dr. Philip A. Russo, Jr.

TABLE OF CONTENTS

Acknowledgements iv
1 Introduction 1
2 A Typology of Community Opinion Surveys 54
3 Case Illustrations 69
4 Translating Survey Results into Decisions 102
5 Conclusion 151

Appendices 157
Appendix A – Personal Interview Questionnaire
Appendix B – Village of Williamsburg Survey Instrument
Appendix C – Oxford Township Survey Instrument
Appendix D – Village of Coldwater Survey Instrument
Appendix E – Hanover Township Survey Instrument

Bibliography 191

List of Tables
Table 1: Informational Surveys
Table 2: Strategic Surveys
Table 3: Decisionistic Surveys
Table 4: Symbolic Surveys


For my wife Stacey: Without your love and support this would not have been possible.

ACKNOWLEDGEMENTS

When I began my graduate school career in the fall of 1993, it was hard to imagine completing a dissertation to punctuate all of the coursework, studying, and research required for the degree. Through it all, one person has remained constant in my time at Miami University – Dr. Philip A. Russo, Jr. Dr. Russo has played many roles in my life, including Professor, mentor, boss, and friend. Without Dr. Russo’s help and support, I certainly would not be where I am today. He not only helped to keep me at Miami after the completion of my master’s degree, he also gave me the opportunity to work as an intern at the Center for Public Management and Regional Affairs. Eventually, that internship would turn into full-time employment with the Center, for which I am eternally grateful. When the prospects for completing my degree seemed bleak, Dr. Russo agreed to serve as the Chair of this dissertation. I thank him for his friendship, guidance, insight, and unwavering support of both me and this endeavor.

Other individuals played integral roles in the completion of this research. First and foremost, I must thank the individuals who made up my committee and guided this research project to completion. Dr. Ryan Barilleaux and Dr. Enamul Choudhury freely gave of their time and offered insight when needed. Dr. John “Skip” Benamati graciously agreed to serve on the committee as the outside reader. Moreover, I must also thank the four individuals who agreed to be interviewed and provide much of the needed data for this project. Hanover Township Trustee Tim Derickson, Village of Williamsburg Mayor Mary Ann Lefker, Oxford Township Trustee George Simonds, and Village of Coldwater Administrator Eric Thomas all freely gave of their time and provided much-needed specific information about their communities.

From a more technical perspective, my wife Stacey and Vince Tenaglia provided assistance navigating the resources of King Library. Furthermore, special recognition must go to Stacey and Mark Morris for completing the tedious task of proofreading the document. Mark also assisted with general guidance for sundry administrative aspects related to the formatting and layout of the dissertation. The past year of writing the dissertation was made far easier thanks to the efforts of Commissioner Gary Bettman and the National Hockey League Board of Governors who could not craft an agreement with Executive Director Bob Goodenow and the National Hockey League Players’ Association to end the NHL lockout.

My work at the Center served as the genesis of the idea for this research. When I began my work at the Center, Eric Frayer took me under his wing and showed me the ropes. Lori Libby is a valued colleague and has been supportive of my efforts

towards completing this dissertation. Finally, Mark Morris, who is not only a co-worker, but also a trusted friend, must be recognized. Mark always encouraged me to finish the dissertation and listened ad nauseam to my trials and tribulations. While I am sure he grew weary of hearing about it, he never wavered in his support of my work and always reminded me to “work on it a little bit each day.” Thank you for those kind words.

My time at Miami afforded me the opportunity to meet many people whom I am proud to call my friends. The departmental secretary, Dottie Pierson, should be mentioned for her guidance and support during my days as a graduate student. I shared many good times and memories during graduate school with my former roommate Dr. Chris Woolard. Sharing a love of music and all things Pittsburgh, I have developed a lifelong friendship with Dr. Kevan Yenerall. Thank you for being such a close friend. Dr. Chris ‘CK’ Kelley is always a source of humor as well as being a good friend. Dr. Mark ‘Shark’ Sachleben is simply what I would call the quintessential friend. Shark is always supportive, caring, sharing, interested, and ready to offer assistance without hesitation. His presence in my life has helped me get through both good times and bad. Thanks to Hans Soder for scaling back our weekly phone calls for several months to allow me time to concentrate on writing. Other individuals such as Tim Gray, Marc ‘Pumpkin’ Sadowsky, and Dr. Darren ‘Tex’ Wheeler became good friends during their time at Miami as well. And thanks to all of my Ice Mongrels broomball and softball teammates for providing much-needed diversions. And finally, there is some symmetry involved in the fact that I will graduate alongside Mike Demczyk, with whom I started at Miami back in 1993.

Last but not least, I must thank my family for their support and love over the years. While not a blood relative, Gregg Taylor is like a brother to me and can always be counted on to be there through thick and thin. My parents, Donald and Maryann, always encouraged my educational pursuits despite never fully understanding why it took so long to finish my dissertation. My brother Steve (and his wife Renee) and my sister Monica have supported me no matter what the task or project. Thank you for your love and support. My in-laws, Bob and Betty Dietrich, have graciously welcomed me into their family and supported my endeavors. Clearly, my wife Stacey served a much larger role in this project than just library assistance and proofreading. She is my source of strength and inspiration, and her love and patience are a true blessing. Stacey encouraged me to welcome a dog into our family earlier this year. While I was concerned that the time and effort of a new puppy would distract from working on the dissertation, Fleury proved otherwise as she spent many a day at home with me while I wrote.

1 INTRODUCTION

In the realm of local government planning and policy-making, a recent trend has seen more and more local government officials embracing the concept of strategic planning. Strategic planning at its most basic level consists of employing techniques designed to enable local governments to proactively plan for the future, solve current problems and issues facing the community, and set and measure the achievement of agreed-upon goals. The allure of strategic planning is its promise of assisting decision-makers in dealing with the present (current issues and problems facing the community) as well as its capacity to assist with long-range planning for the future needs of the community.

For the purposes of this research, local government officials include both elected and appointed officials (including council members, mayors, and managers or administrators), particularly those in smaller (based on geographic size and population, as well as the number of services delivered to residents), non-metropolitan communities. When local government officials start out along a path of strategic planning, the goal is to collect data and information about their community that will assist them in making policy decisions that will guide their jurisdiction for both the short and long term.

This underlies a fundamental issue in public policy research: the differing roles of the policy analyst and the policy-maker. In public policy analysis research, the common perception is that policy research is conducted by professional policy analysts. Indeed, the field of policy analysis has developed into a profession. Classic models of policy analysis demonstrate a separation of roles in the policy process for analysts (staff) and decision makers (elected officials). A classic example occurs in the federal government as elected legislators (United States

Senators and members of the United States House of Representatives) are charged with making policy (legislation). Individual members of Congress, as well as Congressional committees, employ staff whose primary responsibility is that of conducting policy analysis. It is the Congressional staffers who are conducting research, collecting data and information, and analyzing the costs and benefits of a variety of policy options on any given issue. That data and analysis is then presented to members of Congress, who must use that information to ultimately make policy decisions and choose from among the policy options presented. A similar scenario plays out at the state level where there is also a separation of roles among those who analyze policy and those who make policy.

At the local level, however, particularly among the small, non-metropolitan governments that form the focus of this research, this separation of roles often does not exist or is at best blurred. These local governments, in addition to being small and non-metropolitan, tend to have another quality that makes them unique in that they often do not have professional staff that can be charged with analyzing various policy options. Or if they do have professional staff (such as a Manager or Administrator to handle day-to-day government operations), policy analysis and related functions tend to be only a minor facet of their daily work for the local government.

The conventional understanding of local government decision making and the role of local elected officials tends to focus on their handling of specific, almost mundane, issues such as nuisances and dispute resolution among citizens. These issues can broadly be lumped together as what the private sector would refer to as providing customer service. Dealing with these present and immediate issues, those that directly affect certain

residents, has been in the forefront. However, the new imperative of politics at the local level is forcing local elected officials to focus on developing a long-range strategic implementation plan and to become more future-oriented in their thinking. Part of this long-range planning might include re-engineering or reinventing the way in which the local government operates or provides services. It may also amount to planning for expansion of services or giving serious consideration to providing new services as the citizenry changes and demands more and differing levels of service. Therefore, much of the policy-making being done by local government officials is becoming more and more strategic in nature – planning for the future growth, development, and needs of their citizenry.

Indeed, the imperative for strategic planning is recognized as part of the current paradigm of public administration according to Nicholas Henry. In his text Public Administration and Public Affairs, Henry details the current paradigm of the field called ‘A New Public Management’ for which he establishes a timeline of 1992 to the present. This paradigm is marked by, among other things, an emphasis on developing performance measures for services, public program evaluation, and productivity enhancements. One of the techniques used to accomplish these goals is strategic planning, particularly efforts that include community or citizen satisfaction surveys and studies. Henry indicates that, in previous paradigms, these issues were only intermittently addressed, but have become more mainstream within the field of public administration in the past decade.1 This new imperative can have a profound impact on how local elected officials operate. The impact is that local elected officials are now often

1 Henry, Nicholas. 2004. Public Administration and Public Affairs. Upper Saddle River, NJ: Pearson / Prentice Hall Publishing. 179.

forced to fulfill a dual role in policy making. This dual role amounts to officials having to be both a policy analyst and a policy-maker.2 An elected official’s duties as a policy-maker are statutorily defined. However, there is no legislation that dictates how those policy decisions are made. In order to strategically plan for the future and make decisions within this framework, local government officials must have data at their disposal to help inform their decision making. In essence, local government officials then must also act as policy analysts – collecting data, analyzing it, and finally translating that data into meaningful policy decisions. In order to effectively embrace and implement a strategic planning process, local elected officials must be able to analyze and synthesize data and reach some conclusions about how to proceed on issues.

There are many techniques or tools that local government officials have at their disposal to help them collect the information they need to plan strategically and make policy decisions according to that plan. One of the more popular techniques that local governments use is the hiring of consultants. Local government consultants come in many forms; some are general government consultants that provide technical assistance on a variety of local government issues and problems, while other consultants specialize in planning and visioning exercises targeted at developing a long-term strategic plan for communities. Another tool is the application of the Delphi technique as a means of problem-solving and generating consensus. The Delphi technique, employing a panel of local government officials, can be utilized to systematically explore options and generate consensus among

2 The concept of elected officials fulfilling a dual role as policy analyst and policy-maker presented here is loosely based on the conceptualization of duality set forth by James H. Svara. For more information, see Svara, James H. 1985. “Dichotomy and Duality: Reconceptualizing the Relationship Between Policy and Administration in Council-Manager Cities.” Public Administration Review 45 (1): 221-232.

officials.3 Local governments may also devise a system implementing a series of focus groups consisting of government officials and concerned citizens as a means of generating information that can be used to develop a strategic plan. Another method of generating information for the strategic planning process that is utilized by local government officials is conducting community surveys to collect opinions and take the pulse of the residents of their community. It is this method of data collection for strategic planning efforts that this research examines; specifically, how do community opinion surveys provide local government officials with the data they need to inform their decision-making? Local governments often conduct surveys to collect information on a host of issues confronting their community. Some surveys may be targeted to a specific topic, while others may collect data and opinions on a broad array of issues and services within the community. This research focuses on broad-based community surveys, the data that is collected, and the translation of that data into policy outcomes.

From a pure public policy process standpoint, the data collected from a community opinion survey can have a significant impact on the agenda setting stage of the public policy process for the entity conducting the survey. Data collected can dictate what issues reach a community’s agenda. Some issues may have already passed through the policy process and the survey results are being used to evaluate those past decisions. The results may cause the issue to come back onto the agenda – particularly if the findings warrant policy termination or re-design. Other issues may currently be on a community’s agenda, being discussed or deliberated.

3 Gupta, Dipak K. 2001. Analyzing Public Policy: Concepts, Tools, and Techniques. Washington, D.C.: Congressional Quarterly Press. 208-212.

Data collected by a community opinion survey can help refine those discussions and deliberations and result in a policy decision. Survey findings can also serve as a driving force to place issues on a community’s agenda – issues that may not have been anticipated, but seem to be ripe for a policy decision. This function could be termed forcing issues onto the action agenda. Finally, data collected on a community opinion survey may serve to bring issues to what can be termed the discussion agenda of a community. These types of issues would be those that fit the traditional notion of strategic planning efforts – issues that may not require immediate action, but those that the community must be aware of in order to properly plan for the future.

Of specific interest is how survey research findings and results are translated into decisions (using survey data to inform public policy). Of concern is the local government officials’ role as policy analyst. Are there identifiable factors that determine how the survey data collected may or may not be used by local governments? What are these factors? What influences or causes these factors to be present? A series of case illustrations will be used to identify these factors. Based on the factors that are identified, a typology will be developed that will enable individual surveys to be categorized. This typology will attempt to discern how and why the knowledge collected in community opinion surveys gets used in particular ways. Several categories of community opinion survey knowledge utilization will be presented and the case illustrations will be categorized according to the presence and/or absence of certain factors at both the aggregate level and in terms of individual sections or questions on the various surveys. It is the goal of this research to provide an assessment tool that can be used to predict how the

data collected in future community opinion surveys is ultimately translated into public policy decisions.
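To make the logic of such an assessment tool concrete, the following is a minimal illustrative sketch, not drawn from the dissertation itself, of how an individual survey might be sorted into the four typology categories. The factor names and decision rules shown here are hypothetical placeholders assumed for illustration only; the actual factors and classification criteria are developed through the case illustrations in the chapters that follow.

```python
# Minimal illustrative sketch: sorting a community opinion survey into one of
# the four typology categories based on hypothetical, assumed factors.
# The factor names and decision rules are NOT taken from the dissertation;
# they stand in for the identifiable factors developed in later chapters.

from dataclasses import dataclass


@dataclass
class SurveyProfile:
    tied_to_pending_decision: bool   # results feed a specific, imminent choice
    linked_to_strategic_plan: bool   # results feed long-range goal setting
    results_acted_upon: bool         # some identifiable policy action followed


def classify(profile: SurveyProfile) -> str:
    """Return one of the four typology categories for a survey profile."""
    if not profile.results_acted_upon:
        return "symbolic"        # conducted, but results never translated into action
    if profile.tied_to_pending_decision:
        return "decisionistic"   # used to settle a specific pending decision
    if profile.linked_to_strategic_plan:
        return "strategic"       # used to inform long-range planning
    return "informational"       # general data gathering with no direct policy tie


if __name__ == "__main__":
    example = SurveyProfile(tied_to_pending_decision=False,
                            linked_to_strategic_plan=True,
                            results_acted_upon=True)
    print(classify(example))     # prints "strategic" under these assumptions
```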

Literature Review

There are three bodies of literature that will be reviewed for this research. The first body of literature is that which pertains to the field of public policy and policy analysis, specifically that of knowledge utilization – how the data collected gets used in the decision-making process. A second body of literature that will inform this research is that of survey research. Literature in this category discusses survey research from both an academic and an applied perspective, including survey design, methodology, and interpretation of results. The third and final body of literature that will be examined is that which pertains to strategic planning, specifically how survey research can contribute as a valuable source of information to the process.

Public Policy and Policy Analysis

A broad overview of public policy and policy analysis must first be explored. The literature attempts to provide answers to several questions about public policy. Some of these questions are: What is public policy? Who conducts policy analysis? Where is policy analysis conducted? And how (or with what methods) is policy analysis conducted? The most basic of these questions is: what is public policy? At its very essence, public policy consists of the decisions that are made that, according to David Easton’s A Systems Analysis of Political Life,

authoritatively allocate societal values and resources.4 Other authors, such as Thomas Dye, have more succinctly defined public policy as whatever governments choose to do (or not to do as the case may be), why they choose to do it, and what difference it makes.5 Furthermore, public policy can be conceptualized as the sum of government activities that influence and affect the lives of citizens.6

B. Guy Peters further expounds on the definition of public policy by examining three levels of policy – each having a differing impact on the lives of citizens. The first level is termed policy choices, which are those decisions that are directed toward using public power to affect the lives of citizens. A second level is termed policy outputs, or policies being put into action. Policy outputs would be more commonly known as government programs such as welfare or the promulgation of regulatory legislation. The third level is public policy impacts. This refers to the effect and ramifications that policies have on citizens. An example of this level of public policy would be tax legislation, which can have a variety of effects on citizens, such as redistribution.7 Each of these levels is seemingly interconnected as policy choices lead to a decision being made. That ultimate decision will have an output based on the implementation of that decision. That output will, in turn, have an impact on citizens.

While these definitions help us understand what is meant by the term public policy, a subfield within public policy seeks to provide a deeper understanding of how public policy decisions come about. In other words, how is it determined that societal values and resources are allocated to a

4 Easton, David. 1965. A Systems Analysis of Political Life. New York: John Wiley and Sons. 5 Dye, Thomas. 2005. Understanding Public Policy. Upper Saddle River, NJ: Pearson Prentice Hall. 1. 6 Peters, B. Guy. 1986. American Public Policy: Promise and Performance. Chatham, NJ: Chatham House Publishers. 4. 7 Ibid, 4.

particular societal problem? This subfield focuses on public policy analysis: the collection of data and information used to inform public policy decisions.

A seminal work in the field of policy analysis is David L. Weimer and Aidan R. Vining’s Policy Analysis: Concepts and Practice. While providing an overview of the discipline of policy analysis, their book also sets forth the foundation and the tools necessary for conducting effective policy analysis.8 The very essence of policy analysis is best expressed by examining the various definitions of the field offered by scholars. William Dunn, in Public Policy Analysis, states that “policy analysis is an intellectual and practical activity aimed at creating, critically assessing, and communicating knowledge of and in the policy making process.”9 Edith Stokey and Richard Zeckhauser, in their 1978 work A Primer for Policy Analysis, define policy analysis as a means of structuring complex problems in such a manner as to facilitate rational decision making.10 In Social Policy Research and Analysis, Walter Williams defines policy analysis as “…a means of synthesizing information including research results to produce a format for policy decisions (the laying out of alternative choices) and of determining future needs for policy relevant information.”11 While these definitions provide a solid foundation for our understanding of policy analysis, they fail to recognize the element of client orientation, which is essential to the establishment of policy analysis as a

8 Weimer, David L. and Aidan R. Vining. 2005. Policy Analysis: Concepts and Practice, 4th Ed. Upper Saddle River, NJ: Pearson Prentice Hall. 9 Dunn, William. 1994. Public Policy Analysis: An Introduction. Englewood Cliffs, NJ: Prentice Hall. 29. 10 Stokey, Edith and Richard Zeckhauser. 1978. A Primer for Policy Analysis. New York, NY: WW Norton. 5-7. 11 Williams, Walter. 1971. Social Policy Research and Analysis. New York, NY: American Elsevier. xi.

profession. Weimer and Vining suggest that policy analysis is an attempt to inform a public decision, either implicitly or explicitly, with the understanding that the information generated will shape what can be expected from the passage of a piece of legislation (a decision).12 More formally, Weimer and Vining offer a definition of policy analysis: “…client oriented advice relevant to public decisions informed by societal values…”13 It should be noted that the client in their definition is further defined as an individual who participates in public decision making (usually an elected official or a high-ranking public administrator).

Other authors have incorporated client orientation into their definitions of policy analysis. Carl Patton and David Sawicki, in Basic Methods of Policy Analysis and Planning, postulate that policy analysis is a process that begins with defining the problem and devising alternatives to address that problem for a specific client. Generally, the analysis is conducted within a short time frame and with an openly political approach. Thus, policy analysis is the result or outcome of these activities.14 Other authors have also developed similar definitions that incorporate the importance of client-oriented analysis. Arnold Meltsner, in Policy Analysts in the Bureaucracy, identifies four central factors that are influential in developing quality policy analysis: the analyst, the client, the organizational situation, and the policy area.15 Meltsner’s conceptualization of the client/analyst relationship is of particular note.

12 Weimer, David L. and Aidan R. Vining. Policy Analysis: Concepts and Practice. 23. 13 Ibid, 24. 14 Patton, Carl V. and David S. Sawicki. 1986. Basic Methods of Policy Analysis and Planning. Englewood Cliffs, NJ: Prentice Hall. 15. 15 Meltsner, Arnold J. 1976. Policy Analysts in the Bureaucracy. Berkeley, CA: University of California Press. 3.

First, the reasons for which clients require policy analysis can differ greatly. Secondly, clients can vary widely in terms of their capacity to conceptualize the problem at hand and the style with which they approach the problem. According to Meltsner, several factors, including background, experience, and education, can influence the style with which clients operate and interact with policy analysts. These factors, in turn, influence the style that clients employ in their expectations of policy analysts. Such styles include a managerial style and an intellectual style.16 The managerial style is typified by clients who have pre-conceived notions about the problem and expect specific responses from analysts. Meltsner describes these types of clients as those who view policymaking as a mostly political endeavor with a bit of policy analysis thrown into the equation.17 Conversely, clients with an intellectual style tend to be more reflective when provided with analytical material. These types of clients tend to be problem solvers and enjoy helping the analyst to frame the problem. Meltsner goes on to indicate that these types of clients are generally open to new ideas and alternatives that may be considered innovative. However, there can also be “rigid intellectuals” who wish to utilize policy analysis to strengthen and further justify their already-held beliefs.18

Furthermore, Norman Beckman outlines four definitional standards and characteristics of policy analysis in the introduction to a 1977 symposium on the subject that appeared in Public Administration Review. Beckman’s introduction, entitled “Policy Analysis in Government: Alternatives to Muddling Through,” identified policy analysis as being an

16 Ibid, 5. 17 Ibid, 219-220. 18 Ibid, 220.

integrative and inter-disciplinary process because of the wide range of skills required from a variety of professional disciplines and because it must take into account the social, environmental, foreign, and intergovernmental ramifications of any given policy option.19 Beckman also asserts that policy analysis must be anticipatory in nature. Quality analysis must be able to predict consequences and structure uncertainties surrounding the policy issue. Beckman carefully draws a distinction between policy analysis and program evaluation. The policy process is iterative, as issues often change their form in response to experience and criticism, while program evaluation involves the search for new policy directions.20 Additionally, Beckman builds upon the definitions of policy analysis already presented here by examining the decision-oriented nature of the field as well as the fact that the process is both value-conscious and client-oriented. Policy analysts must confront problems that are real and must provide policy options that are feasible and clearly enumerate both the costs and benefits of each option.21 The bottom line is that a decision will most often be made based upon the research of the policy analyst. The fourth characteristic outlined by Beckman is that policy analysis must take into account the stakeholders on the issue at hand and how they might be affected by the various policy options being considered.22 Therefore, effective policy analysis will examine the values of stakeholders and take into account the client for which the policy analysis is being conducted.

All of these definitions seem to incorporate some common elements. The most obvious is that the task of policy analysis is ultimately to inform

19 Beckman, Norman. 1977. “Policy Analysis in Government: Alternatives to Muddling Through.” Public Administration Review 37 (3): 221-222. 20 Ibid, 222. 21 Ibid, 222. 22 Ibid, 222.

some decision. At its essence, then, policy analysis is conducted as a means to inform the decision making process. In order to effectively inform decisions, the resulting policy analysis must be provided or communicated to a person in a decision making position who can directly benefit from the analysis. This would be the client for whom the analysis is conducted. The research presented here aims to add a twist to the notion of client-oriented policy analysis in that the client and the analyst are one and the same in smaller local governments.

After examining the elements that go into defining policy analysis, the next logical step is to understand how policy analysis is conducted. Weimer and Vining discuss the policy analysis paradigm, whose major objective is the “systematic comparison and evaluation of alternatives available to public actors for solving social problems.” Furthermore, this paradigm utilizes a style that requires analysts to synthesize existing research and theory to predict consequences of alternative policies.23 This paradigm is also characterized by having a specific person (such as an elected official) as the decision-making client for whom the analysis is conducted. The completion of the analysis provided to the client is generally tied to a specific deadline by which a specific decision will need to be made. Weimer and Vining point out that this paradigm suffers from the potential for myopia as the viewpoint of the analysis may become artificially narrowed due to deadline pressures and client orientation.24

While it is important to understand this definition of what occurs while engaging in the work of conducting policy analysis, how does one actually go about conducting policy analysis? The steps in this process are

23 Weimer and Vining. Policy Analysis: Concepts and Practice. 26. 24 Ibid, 26.

yet another aspect of policy analysis that public policy literature has addressed. As Dunn aptly points out, the kinds and types of recommendations that result from policy analysis can vary depending on the process or mode of analysis that is used. A generally accepted process for conducting policy analysis is described in Deborah Stone’s Policy Paradox: The Art of Political Decision Making. In her book, she provides a foundation for conducting policy analysis that consists of a five-step process. Those steps are: 1) identify the problem or goal; 2) identify alternatives; 3) consider the consequences of each alternative; 4) evaluate the outcomes; and 5) make a decision (or recommendation).25 This process is similar to that which is laid out in Stokey and Zeckhauser’s A Primer for Policy Analysis.26

For additional insight into how policy analysis contributes to the policy process, one should consult The Foundations of Policy Analysis by Garry D. Brewer and Peter deLeon. Brewer and deLeon examine the role of policy analysis in regard to the various stages in the policy process. They identify the stages of the policy process as consisting of:
• Initiation – problem recognition and generation of alternatives
• Estimation – coping with complexity and methods, tools, and procedures used in estimation
• Selection – factors in selection and compromise in the political sphere
• Implementation – factors affecting implementation
• Evaluation – system purposes and performance
• Termination – ending policies and programs.27

25 Stone, Deborah. 2002. Policy Paradox: The Art of Political Decision-Making. New York, NY: WW Norton. 26 Stokey and Zeckhauser. A Primer for Policy Analysis.


Of particular importance is Brewer and deLeon’s conceptualization of the role of policy analysis in both the initiation and the estimation stages. Brewer and deLeon assign responsibility for the stages of the policy process, with decision-makers (elected officials) responsible for selection and bureaucrats responsible for implementation. Policy analysts are charged with responsibility for generating policy choices and alternatives.28 Policy analysts must be able to inject creativity and innovation into the generation of alternatives to ensure that all possible options are considered for dealing with the problem. This stage is critically important to the policy process as any problems or errors made in this stage can be compounded in subsequent stages with a significant cost of both time and money.29 As for the estimation stage, Brewer and deLeon focus on the importance of sound policy analysis to provide a systematic understanding of the costs and benefits of each and every alternative developed in the initiation stage. It is the goal of the analyst in this stage to reduce uncertainties and predict the consequences (both positive and negative) of each alternative.30 It is only with quality estimation that policy analysts can make sound recommendations for decision-makers to ultimately select an alternative and implement a particular policy.

Much of the research into policy analysis and how it is carried out has used the United States Congress as the focal point. This is due in large part to the scope and size of the legislative body and the number of issue areas

27 Brewer, Garry D. and Peter DeLeon. 1983. The Foundations of Policy Analysis. Homewood, IL: The Dorsey Press. 26. 28 Ibid, 61. 29 Ibid, 67. 30 Ibid, 83.

that are researched in any given Congressional session. Some of the Congressional research most relevant to this dissertation has indicated that Congressional staffers can wield significant influence on the policy process. Michael Malbin examines the role of Congressional staffers as policy analysts in Unelected Representatives, arguing that they are primary decision-makers by virtue of controlling the options that are presented to members of Congress.31 Other research at the Congressional level has found that there has been a significant increase in the availability of policy research. David Whiteman, in Communication in Congress: Members, Staff, and the Search for Information, found that expanded staffs and a greater emphasis and reliance on policy analysts have led to an increase in the collective base of information at the disposal of members of Congress. Whiteman categorizes policy analysis in Congress as being substantive, elaborative, or strategic. Substantive analysis defines or creates an issue position. Elaborative analysis extends or refines an issue position, while strategic analysis advocates or reaffirms the merit of an existing issue position.32 While Whiteman’s focus is on the communication networks that exist within Congress that enable information sharing, it is the development of baseline information across policy areas that is of importance to the research at hand. Strategic planning processes require baseline data and information in order to design an effective strategic plan.

Another important contribution to the literature on policy analysis within the legislative branch comes from Carol Weiss. Her 1989 article, entitled “Congressional Committees as Users of Analysis,” examines how

31 Malbin, Michael. 1980. Unelected Representatives: Congressional Staff and the Future of Representative Government. New York, NY: Basic Books. 252-256. 32 Whiteman, David. 1995. Communication in Congress: Members, Staff, and the Search for Information. Lawrence, KS: University of Kansas Press. 181.

committee staff utilizes policy analysis. Her research indicated that, at the time, an increase in staff professionalism seemed to correlate with an increased use and reliance on policy analysis. However, that use and reliance tended to be limited to situations in which there was some political advantage to using specific policy analysis. Furthermore, Weiss found that Congressional committee staff paid particular attention to the originating source of the policy analysis and any political motivation that might be a part of the analysis.33 Staying within the rubric of the use of policy analysis within Congressional committees, one must consider the findings of Nancy Shulock. In “The Paradox of Policy Analysis: If It Is Not Used, Why Do We Produce So Much of It?” Shulock examines how members of Congress use research reports and evaluations to persuade others to support the findings of committee reports on specific bills. This was particularly true when there was competition among committees for jurisdiction on legislation and when there was significant public interest in the outcome of the legislation.34

Other portions of the literature on public policy analysis have focused on the individuals who have embraced policy analysis as a process and a field of study and have chosen it as a profession. Initial studies into policy analysis have found that this method of political inquiry was being conducted by practitioners in a variety of fields. Some of the more common practitioners were economists35, planners, program evaluators, budget analysts,

33 Weiss, Carol. 1989. “Congressional Committees as Users of Analysis.” Journal of Policy Analysis and Management 8 (3): 413. 34 Shulock, Nancy. 1999. “The Paradox of Policy Analysis: If It Is Not Used, Why Do We Produce So Much of It?” Journal of Policy Analysis and Management 18 (2): 237-238. 35 For an understanding of the role of economists in government service as policy analysts, see Rhoads, Steven E. 1978. “Economists and Public Policy Analysis.” Public Administration Review 38 (2): 112-120.

operations researchers, and statisticians.36 Over time, however, it has grown into a field of its own with educational degree programs focused on the study and practice of policy analysis as well as individuals who consider their profession to be policy analysis. Beryl Radin provides an excellent overview of the development of policy analysis as a profession in Beyond Machiavelli: Policy Analysis Comes of Age. Radin examines how policy analysis has developed and changed from its origins in the 1960s to its current state in the 1990s. The maturation process of policy analysis as an academic field of study has brought about changes in how it is practiced as a profession. Some of these changes include the profession’s attitudes towards politics, the skills and methodologies that the field requires, changes in decision-making processes, and how data is collected and processed.37 Radin makes a compelling case for policy analysis as a professional endeavor.

Moreover, an argument can be made for policy analysis as an art, a craft, and a science. Patton and Sawicki make this argument and characterize the field as also being reliant on compromise.38 To further this notion of policy analysis as an art, craft, and science, Weimer and Vining outline five major areas of preparation that policy analysts need in order to be successful in the field. The first trait is the ability to gather, organize, and communicate information efficiently and effectively. Policy analysts must also be able to put social problems into a meaningful context so as to be able to deal with them within the framework of the political climate. Analysts also need technical skills to be able to model scenarios and predict and

36 Weimer and Vining. Policy Analysis: Concepts and Practice. 31. 37 Radin, Beryl. 2000. Beyond Machiavelli: Policy Analysis Comes of Age. Washington, D.C.: Georgetown University Press. 38 Patton and Sawicki. Basic Methods of Policy Analysis and Planning. 22.

evaluate possible consequences of a variety of policy options. Another aspect of preparation that analysts must have is an understanding of the political realities and organizational behavior of the governments and agencies involved in any given policy issue. This understanding should be geared toward policy analysts being able to predict the feasibility of the adoption and potential success of a variety of policy options. Understanding the realities of the policy implementation process enables policy analysts to make recommendations and devise policy alternatives that have a greater likelihood of passing electoral muster and being successfully implemented by responsible agencies. Finally, as the field gains in strength and credibility as a profession, a natural outcome is the development of ethical standards and norms to guide the profession. Policy analysts should be able to adhere to an ethical framework for the profession and apply it accordingly. Ethical conduct is particularly important in policy analysis, especially when there are divergent interests between clients’ preferences and what might be in the public interest.39

In Order Without Design, Martha Feldman provides evidence of how the political realities of any given policy issue impact how policy analysts conduct their research. Feldman studied the content of a variety of policy papers and found that these policy papers included much more than just factual data collection. Oftentimes, policy papers would expound upon how different agencies might view the policy issue differently and therefore prefer certain policy options over others. Also, policy analysts took into account the viewpoints of affected or interested parties (stakeholders) and

39 Weimer and Vining. Policy Analysis: Concepts and Practice. 37-38.

how they might react to the various policy options.40 The bottom line for Feldman was that bureaucratic policy analysts had a tendency to keep their agency’s interests in mind. In so doing, the political realities of particular policy issues can dictate the type of analysis that is conducted and the policy options that this analysis yields.

While policy analysis has grown into a profession of its own, not everyone has access to a trained professional in the field. Small local governments may not have the resources available to access a policy analyst and must rely on other means of conducting research on policy issues and decisions. In these situations, small local governments may rely on line or managerial staff for analytical functions.41 In addition, local governments may also contract for policy analysis services from a paid consulting firm. Non-profit organizations and agencies also provide low- or no-cost assistance to local governments in the area of policy analysis. It is the work of one such organization, the Center for Public Management and Regional Affairs at Miami University in Oxford, Ohio, that will be used to provide the case illustrations for this research project.

Other research has examined the individuals who conduct policy analysis and the factors that influence their analysis. Research has been conducted to ascertain how analysts differ in terms of the style of their analysis. Some of the factors that have been identified that can affect analytical styles include the nature of the relationship with the client and the role of the client in the political process.42

40 Feldman, Martha S. 1989. Order Without Design: Information Production and Policy Design. Palo Alto, CA: Stanford University Press. 41 Weimer and Vining. Policy Analysis: Concepts and Practice. 34. 42 Ibid, 32.

Further investigation of individuals who conduct policy analysis has allowed for a categorization of the roles that analysts might play. Weimer and Vining set forth a four-part categorization for these roles. The first role is what is termed “desk officer.” Desk officers tend to be assigned to work on conducting research and coordinating policy efforts in specific program areas. Other analysts tend to fill the role of “policy developers.” These individuals are charged with generating multiple options and proposals on the policy issues at hand, with particular emphasis on options that are innovative or those that have not yet been tried. A third category is termed “policy oversight” for those analysts whose research is focused on policy and program evaluation. The final category is best described as “firefighting.” Policy analysts who fit into this category are those who are consumed by immediate pressing requests.43 Because these analysts are almost always operating in crisis mode, responding to whatever problem or issue has surfaced, the focus of their analysis can vary widely and almost always comes with a pressing and immediate deadline for providing their analysis.

A further examination of the roles of policy analysts is provided by Milan J. Dluhy in the book chapter entitled “Policy Advice-Givers: Advocates? Technicians? Or Pragmatists?” from New Strategic Perspectives on Social Policy. Dluhy defines role as a set of prescribed behaviors and relationships that are in accordance with the expectations that others have toward that role.44 Three policy roles that analysts take on are described in detail: advocate, technician, and pragmatist. The advocate role is marked

43 Ibid, 35-37. 44 Dluhy, Milan. 1981. “Policy Advice-Givers: Advocates? Technicians? Or Pragmatists?” New Strategic Perspectives on Social Policy. Ed. J. Tropman, M. Dluhy, and R. Lind. Elmsford, NY: Pergamon Press. 203.

by an appeal to values and the making of specific recommendations, and is aimed at clientele groups or pre-disposed decision-makers as its primary audience. The technician role tends to rely more on data analysis and logic, while providing neutral advice with a comprehensive set of alternatives. The primary audience for technicians is peers. Finally, pragmatists are marked by the reasonableness and completeness of arguments and the making of specific recommendations, but only for feasible alternatives. Generally, the primary audience for the advice of pragmatists is organizational and institutionalized leadership.45

Other research has focused on the role of the individual policy analyst and the viability of conducting value-neutral analysis. In Public Policy: Issues, Analysis, and Ideology, Ellen Paul and Philip Russo assert: “Recognizing that values intrude on policy analysis, and that different values are manifested in the analysis of public policy, is an important dimension to making informed judgments.”46 Clearly, individual policy analysts can and do inject their own values or the values of their governmental agency into the process. While the merits of value-laden analysis can be debated, it must be acknowledged as a common occurrence when studying the field of policy analysis.

Knowledge Utilization

The literature on how policy analysis is translated into public policy decisions is broadly classified as knowledge utilization. Within the framework of knowledge utilization, there are several common threads that

45 Ibid, 214. 46 Paul, Ellen Frankel and Philip A. Russo, Jr., Eds. 1982. Public Policy: Issues, Analysis, and Ideology. Chatham, NJ: Chatham House. 9.

link the literature. First, it is widely accepted that policy analysis rarely has an immediate or direct impact on public policy. Rather, analysis is collected over time and continually added to as more and more information and research becomes available. Another common thread is that knowledge utilization in the public sector must conform to the constraints of its locus. This implies that the ultimate adoption and implementation of public policy entails more than just simple policy analysis and knowledge utilization. The process is much more complex, as it is both an organizational and an inherently political process.

Looking at how Congressional committees utilize policy analysis, David Whiteman published his findings in an article entitled “The Fate of Policy Analysis in Congressional Decision-Making: Three Types of Use in Committees.” Whiteman examined the role of policy analysis in Congressional decision-making and found three types of use for policy analysis: substantive, elaborative, and strategic.47 Each of these types of use is predicated on the assumption that policy makers do not act neutrally, but rather make decisions based on their world views or their conceptualization of the problem at hand. These three types of policy analysis must first be understood. Substantive use of policy analysis, according to Whiteman, can be applied in situations in which there is not a strong commitment on the part of the decision-maker for a particular solution or outcome for the problem. In this category, analysis is used to help frame the issue and develop a position or platform for the decision-maker to embrace. Elaborative use of policy analysis is applied when there is already commitment to a particular course

47 Whiteman, David. 1985. “The Fate of Policy Analysis in Congressional Decision-Making: Three Types of Use in Committees.” Western Political Quarterly 38 (2): 298.

of action on an issue and helps to extend or refine the decision-maker’s position on the issue. Finally, policy analysis can be used strategically in situations in which there is a strong commitment to a particular course of action and the analysis can be used to advocate or reassert that position or course of action.48 These three types of use of policy analysis will serve as a foundation for the research being conducted in this dissertation. When examining how data and information generated from a community opinion survey as part of a strategic planning process is utilized and translated into policy decisions, these three types can be used to categorize the resultant individual policy decisions.

Carol H. Weiss dealt with the topic of the utilization of policy evaluation research in “Utilization of Evaluation: Toward Comparative Study.” In her research, Weiss decried “…the frequent failure of decision-makers to use the conclusions of evaluation research in setting future directions for action programs.”49 Weiss recognizes that most decisions are based on choosing courses of action among multiple alternatives. She goes on to identify the proper selection of issues that concern decision-makers, the involvement of people directly involved in the program, and analysis conducted in a timely fashion as factors that will increase the utilization of policy analysis and evaluation.50 The presence of these factors, Weiss argues, will enhance the probability that the policy evaluation will be utilized by decision-makers. While predominantly focused

48 Ibid, 298. 49 Weiss, Carol. 1972. “Utilization of Evaluation: Toward Comparative Study.” Evaluating Action Programs: Readings in Social Action and Education. Ed. Carol Weiss. Boston, MA: Allyn & Bacon, 318. 50 Ibid, 324-325.

on programmatic evaluation, these factors can be generally applied to any policy analysis being conducted.

Robert E. Floden and Stephen S. Weiner contribute to our understanding of knowledge utilization with their 1978 article entitled “Rationality to Ritual: The Multiple Roles of Evaluation in Governmental Processes.” The basis for Floden and Weiner’s research is that policy evaluation provides information needed by rational decision-makers to make decisions.51 They document disillusionment with policy evaluation in its current form due to its inability to provide quality information and analysis to decision-makers.52 However, there are other functions that policy evaluation contributes to, such as managing conflict and promoting social change. In fact, Floden and Weiner suggest that policy evaluation has become a societal ritual that serves to calm the citizenry by perpetuating an image of government rationality.

Floden and Weiner present a decisionistic model for policy evaluation that is based on three important premises. The decisionistic model is predicated on the assumption that policy evaluation research is conducted for the purpose of a policy maker ultimately making a decision based on the research. First, the model assumes that there are measurable goals in place that programs are designed to achieve. Secondly, the model requires collecting information that can be used to measure the effectiveness of the program in achieving its stated goals. Finally, the model is based on the ultimate utilization of the evaluation being conducted. Discrete decisions

51 Floden, Robert E. and Stephen S. Weiner. 1978. “Rationality to Ritual: The Multiple Roles of Evaluation in Governmental Processes.” Policy Sciences 9 (1): 9. 52 Floden and Weiner’s description of policy evaluation is one function within the broader field of policy analysis and knowledge utilization.

will result from the evaluation, aimed at improving or changing the program being evaluated.53 The decisionistic model is generally only used when the performance of a policy or programmatic activity is considered unsatisfactory or when there is some belief that goals are not being met or not being met in a timely fashion. This model is also invoked during times of public pressure and scrutiny such as occurred at the federal level with the Head Start program in the 1970s.54 There is, therefore, an implication that the utilization of the results of the evaluation will be implemented quickly or even immediately, particularly if there is a need to satisfy public pressure. However, Floden and Weiner’s research indicates that this is not the case, as most evaluative research using this model is implemented over time and is more long-term oriented.55

Floden and Weiner go on to provide some alternatives to the decisionistic model, such as evaluation for conflict resolution purposes. Oftentimes, policy evaluation is used to manage conflict as a means of promoting gradual social change. This model is appropriate when there is a policy or program that is relatively new, usually in its pilot or experimental stage. Policy evaluation can then be used as a means of negotiating compromise among competing interests in the policy area.56

Another alternative model utilizes policy evaluation as a means of complacency reduction. Generally, this model is based upon generating participation among the various stakeholders in the policy or programmatic activity being evaluated. This type of evaluation can have two impacts on

53 Ibid, 10. 54 Ibid, 11. 55 Ibid, 12-13. 56 Ibid, 14.

26 public policy. First, the participatory component of the evaluative technique should bring about a clarification of the policy’s goals and objectives among the stakeholders. Secondly, this type of evaluation can lead to a revision or an outright rejection of the existing goals and objectives, thereby bringing about an examination of program and policy alternatives. The key to this model, however, is its participatory nature by getting the program participants to rethink the policy and hopefully re-energize it as well by reducing complacency effects that may have developed over time.57 Lastly, Floden and Weiner present their case for policy evaluation as a ritualistic endeavor conducted by government as a means to calm the anxieties of citizens by creating an image of governmental rationality, efficacy, and accountability. This type of evaluation would tend to portray government as committed to these aforementioned ideals. In turn, this would enhance the image of public officials who are concerned with the performance of public policy, programs, and activities.58 Furthermore, it would also serve as a reassuring mechanism with the citizenry who will feel as if government is acting in such a way as to address societal problems and enhance policy and program efficiency and effectiveness.59 Ritualistic evaluation may not necessarily lead to any substantive change in the policy being evaluated, but the mere fact that policy evaluation is occurring can enhance the image of public officials and serve to calm the fears of citizens who might feel as if government is not committed to solving societal problems or finding the best ways to deliver policies, programs, and activities.

57 Ibid, 14. 58 Ibid, 16. 59 Ibid, 17.

27 Another seminal work in the field of policy analysis is Aaron Wildavsky’s Speaking Truth to Power: The Art and Craft of Policy Analysis. Wildavsky aimed to contribute to our understanding of public policy while helping to establish the field as a discipline of study. One of the major tenets of the book is an examination of the increased politicization of the field of public policy and policy analysis. Of particular interest to the research at hand, Wildavsky pays particular attention to citizens as integral components in the policy analysis process. Strategic planning processes and the use of community opinion surveys as a needs assessment tool are predicated on involving the citizenry in providing input, opinions, and information that elected officials will use to make decisions. Wildavsky focuses on citizenship in modern life and getting people involved in the policy process, provided that general feelings of citizen apathy can be overcome.60 In order for citizens to make informed policy choices or at least express their opinions about the policy choices at hand, Wildavsky argues that sensible choices on the parts of citizens will be made when they can compare efforts and results, learn from personal experience, and understand the relative importance of various policy issues.61 It becomes the role of the policy analyst to cultivate this sense of citizenship and participation in order to get citizens involved in the policy process. Unfortunately, this is not something that can happen overnight as Wildavsky stresses that this notion of citizenship needs to develop to the point that it becomes part of daily

60 Wildavsky, Aaron. 1979. Speaking Truth to Power: The Art and Craft of Policy Analysis. Boston, MA: Little, Brown. 253-254. 61 Ibid, 255.

28 life.62 It is the concept of strategic planning that tries to incorporate this notion of citizen participation into the policy making process. In his book Not Well Advised, Peter Szanton contributes to our understanding of knowledge utilization in the public policy process by examining the local government technical assistance efforts offered by universities, consulting firms, and professional associations. Szanton concluded, as evidenced by the title of his book, that these efforts largely fell short of making an effective difference, providing useful insight into the policy process, and offering sound recommendations to decision-makers. Some of the reasons for this failure include failing to get appropriate guidance from clients, failing to identify the fundamentally political nature of the effort, and a propensity to recommend a fundamental change in policy rather than incremental adjustments.63 Fundamental decisions are those decisions that adopt a course of action that is radically different from the status quo.64 Furthermore, Szanton identified several things that technical assistance providers can do to make their recommendations and advice better targeted to the needs of their clients. Some of these lessons that Szanton discusses include, but are not limited to:
1. Gain a better understanding of the client.
2. Produce “usable advice.”
3. Ask why you want to provide technical assistance.

62 Ibid, 260. 63 Szanton, Peter. 1981. Not Well Advised. New York: Russell Sage Foundation and The Ford Foundation. 62-65. 64 For a more detailed explanation of decision-making theory, including both fundamental and incremental decisions, see: Etzioni, Amitai. 1967. “Mixed Scanning: A 'Third' Approach to Decision-Making." Public Administration Review 27 (5): 385-392. Also, see: Lindblom, Charles. 1959. "The Science of Muddling Through." Public Administration Review 19 (2): 79-88.

29 4. If you’re going to do it, do it right.65 The lessons that can be taken from Szanton are echoed in the case illustrations that will be presented in this research. In order to collect usable data and information that decision-makers can rely on to make informed decisions, these lessons identified by Szanton must be taken into account. This will be examined in more depth in Chapter 3 as the case illustrations are presented. Another seminal work in the field of public policy and knowledge utilization is Usable Knowledge: Social Science and Social Problem Solving by Charles E. Lindblom and David K. Cohen. Lindblom and Cohen propose the need for more research into how public policy research is actually used by elected officials. Their work revolves around developing and conducting PSI (Professional Social Inquiry) as a means to give more meaning and relevance to public policy research.66 Their call for PSI stems from a sense of frustration and dissatisfaction with the usability of the public policy research being conducted. Those who are responsible for providing public policy research tend to feel ignored, while those who receive public policy research are generally not interested in the research that is provided to them. In short, they do not feel that the data and information provided to them has any real usability. In this regard, there is a gap between providers and recipients that needs to be bridged in order for quality policy decisions to be made. Lindblom and Cohen succinctly point out one of the major tenets that underpin the literature on public policy knowledge utilization – the

65 Szanton, Not Well Advised. 135-144.

66 Lindblom, Charles E. and David K. Cohen. 1979. Usable Knowledge: Social Science and Social Problem Solving. New Haven, CT: Yale University Press. 8.

30 importance of ensuring that the policy analysis and options being presented are ultimately useful. Clearly, there are many factors that are critical to determining usability, most of which have been identified in this literature review. Ultimately, usability is determined by the decision maker who will use the knowledge to craft public policy. Knowing the audience (the decision maker) and how that audience is likely to use the information transmitted to it is of paramount importance if policy analysts are to meet the demand for quality analysis. A more recent contribution to our understanding of public policy, with an emphasis on the twenty-first century, comes from Lisa Anderson’s Pursuing Truth, Exercising Power: Social Science and Public Policy in the Twenty-first Century. Anderson begins with an assessment of the current state of public policy at the turn of the century and asserts that the locus of policy making shifted dramatically toward the end of the twentieth century. Previously, the locus of public policy making had rested squarely with government. However, that locus has now shifted to include a wider variety of key players in policy making, such as private sector consulting firms, not-for-profit organizations, and transnational and community-based organizations. This shift coincides with the general tendencies toward privatization and globalization that took hold in the late twentieth century.67 Anderson uses this shift in public policy making as her point of departure for examining the future of public policy given these dramatic changes. In the end, Anderson calls for the training of policy makers to change to match the current state of policy making. Specifically, Anderson

67 Anderson, Lisa. 2003. Pursuing Truth, Exercising Power: Social Science and Public Policy in the Twenty-First Century. New York, NY: Columbia University Press. 4.

31 feels that the training of policy makers must transcend issues of provincialism and private interests and provide an understanding of the need for a commitment to society and public life when it comes to public policy. Understanding this change in how public policy is being conducted and ultimately formulated is important to this research because it expands the realm in which local elected officials will look for assistance in developing public policy for their local government.

Survey Research This dissertation also builds upon the literature pertaining to survey research from both an academic and an applied perspective, including survey design, methodology, and interpretation of results. Of particular importance to this research is the literature directly related to survey design as it dovetails nicely with the literature on knowledge utilization. It is imperative to construct survey questions in such a way as to generate quality, usable, and meaningful results in order to be able to evaluate current policies and programmatic activity and to plan for future policy decisions. Before delving into the literature pertaining to the development of surveys, it is important to recognize the linkage between public policy and survey research. In an article entitled “The Role of Consumer Surveys in Public Policy Decision Making,” Manoj Hastak, Michael B. Mazis, and Louis A. Morris examine the impact of consumer research on the policy cycle as conceptualized in their six stage process. It should be pointed out that their research is conducted from the perspective of marketing theories and the survey research they discuss is targeted towards consumers who come into contact with product labels, advertisements, package inserts, and

68 Ibid, 110.

32 leaflets as regulated by assorted federal agencies (e.g., the Food and Drug Administration (FDA), the Federal Trade Commission (FTC), and the Office of Management and Budget (OMB)).69 Their findings indicate that survey research could have more meaningful contributions during the executing and enforcing stage of the policy-making process, more commonly known as policy implementation. More directly related to this dissertation is the role of citizens in measuring the performance of government services in an attempt to improve the quality of life in communities. Lyle Wray and Jody Hauer explore this topic in their 1997 journal article that appeared in Public Management. In “Performance Measurement To Achieve Quality of Life: Adding Value Through Citizens,” Wray and Hauer examine the ways in which citizens can contribute to performance measurement exercises. Citizens can act as visionaries, customers, co-producers of services, evaluators, and owners.70 Examining the role of citizens as evaluators of services is of particular relevance to this research. Citizens can and should be asked to evaluate the services they receive from their government. Doing so benefits the community as a whole by engaging citizens and leading to a more interested and attentive citizenry while also demonstrating the government’s concern and interest in the opinions of its citizens.71 The literature on survey research is rife with information pertaining to the development of surveys or questionnaires as part of conducting effective research. One such work comes from Robert A. Peterson who wrote Constructing Effective Questionnaires in 2000. Peterson delves into the

69 Hastak, Manoj, Mazis, Michael B., and Louis A. Morris. 2001. “The Role of Consumer Surveys in Public Policy Decision-Making.” Journal of Public Policy and Marketing 20 (2): 170. 70 Wray, Lyle and Jody Hauer. 1997. “Performance Measurement To Achieve Quality of Life: Adding Value Through Citizens.” Public Management 79 (8): 5-6. 71 Ibid. 6.

33 technical aspects of survey design as he discusses written questions and self-administered questions. Furthermore, Peterson details other important technical aspects of survey design including the types of questions that can be utilized, the ordering of questions on the survey, how questions should be worded, and the types of scale or ratings that can be used to structure question responses. Peterson also examines the relationship between questions and answers, indicating that how answers are provided is directly impacted by how the questions are written and configured within the survey. There must also be recognition that questions (and the subsequent means of providing answers) need to be interpreted similarly by all survey respondents. There should be a clear emphasis, then, on consistency. The goal of survey research, according to Peterson, is to generate meaningful information, and the importance of consistent interpretation of survey questions and of the means of answering those questions cannot be overstated.72 To summarize, Peterson believes that the quality of data received is directly proportional to the quality of the survey, which in turn is directly proportional to the quality of the question construction and design. Keith F. Punch details the technical aspects of survey design and research in Survey Research: The Basics. The focus of Punch’s book is on identifying the critical elements of small-scale quantitative surveys. Of particular importance, according to Punch, is for the researcher to understand the relationships between variables on the survey. Also, this book identifies the essential elements of all surveys that must be considered before the

72 Peterson, Robert A. 2000. Constructing Effective Questionnaires. Thousand Oaks, CA: Sage Publications, Inc.

34 survey is implemented. These elements include: the goals and objectives of conducting the survey, the types of questions asked, sampling techniques, and survey administration which impacts data collection and analysis. Finally, Punch details the importance of having an implementation strategy for conducting the survey as well as having a mechanism in place for reporting the findings of the survey.73 This point is important to the process of conducting a survey as it adds credibility to the project if respondents know that the results will be presented, published, or otherwise made available to the appropriate decision-making bodies as well as the subject population. In Survey Research Methods, Floyd J. Fowler, Jr. details much of the same information concerning the technical aspects of conducting effective and meaningful survey research, including issues related to sampling, data collection, and data entry. Fowler contributes an important point about the importance of question design. Properly constructed questions are directly related to the reliability of the data collected. As the quality of the question design increases, so does the researcher’s ability to have confidence and reliability in the survey results.74 Arlene Fink, along with Jacqueline Kosecoff, wrote How to Conduct Surveys: A Step-By-Step Guide in 1998. Their discussion of survey research begins with a simple definition of the term survey. It is worth repeating here: “…method of collecting information from people about their ideas, feelings, health, plans, beliefs, and social, educational, and financial background.” Furthermore, surveys are used to help policy makers, program

73 Punch, Keith F. 2003. Survey Research: The Basics. Thousand Oaks, CA: Sage Publications, Inc. 74 Fowler, Floyd J., Jr. 2001. Survey Research Methods. Thousand Oaks, CA: Sage Publications, Inc.

35 planners, evaluators, and researchers in their work.75 Fink and Kosecoff make an important contribution to our understanding of what needs to be done to properly analyze the data that is collected by a survey. Much of their discussion of analyzing the data is related to the types of statistical techniques that should be used when making sense of a survey’s findings, including descriptive statistics (such as counts, measures of central tendency, and variation), correlations and relationships between and among variables, measurement of differences (such as t-tests, chi-squares, ANOVA), and measures of change over time. Finally, they point out that the data must be tested for statistical significance so that one can be confident that the differences being reported did not arise merely by chance.76 Fink also wrote The Survey Handbook in 2003 as the first volume in a ten-volume series entitled The Survey Kit. Fink carefully points out that a clear purpose must be in mind when undertaking a survey project, including some agreement on how the survey results will be used, along with clearly defined objectives. Fink also covers the basics of the technical aspects of survey design, including survey methodology and sampling techniques, ensuring the reliability and validity of the survey content, how to report the findings (defining variables and the use of statistical methods), the difference between qualitative and quantitative data, and ethical issues to be considered when conducting survey research. Finally, Fink provides a useful checklist of tasks that need to be completed in order to conduct a successful survey.77
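The analytic techniques Fink and Kosecoff enumerate are straightforward to apply once responses are in hand. The sketch below, written in Python with invented data, illustrates two of them: descriptive statistics for a single satisfaction rating and a hand-computed chi-square test of independence between two categorical items. The variable names and response values are illustrative assumptions, not material drawn from Fink and Kosecoff; the 0.05 critical value for one degree of freedom (about 3.84) noted in the comment is a standard table value.

```python
import statistics
from collections import Counter

# Hypothetical 1-5 satisfaction ratings from a citizen survey (invented data).
ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5, 1, 4, 3, 5, 4]

# Descriptive statistics: counts, central tendency, and variation.
print("n =", len(ratings))
print("mean =", round(statistics.mean(ratings), 2))
print("median =", statistics.median(ratings))
print("std dev =", round(statistics.stdev(ratings), 2))
print("counts =", Counter(ratings))

# Chi-square test of independence between two categorical items, e.g.
# length of residency vs. support for a proposed service change.
# Rows: "under 5 years", "5+ years"; columns: "support", "oppose".
observed = [[30, 20],
            [25, 45]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

chi_square = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi_square += (obs - expected) ** 2 / expected

df = (len(observed) - 1) * (len(observed[0]) - 1)
# For df = 1 the 0.05 critical value is about 3.84; a larger statistic
# suggests the two items are related rather than independent.
print("chi-square =", round(chi_square, 2), "df =", df)
```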

75 Fink, Arlene and Jacqueline Kosecoff. 1998. How to Conduct Surveys: A Step-By-Step Guide. Thousand Oaks, CA: Sage Publications, Inc. 1. 76 Ibid, 59. 77 Fink, Arlene. 2003. The Survey Handbook. Thousand Oaks, CA: Sage Publications, Inc.

36 The International City/County Management Association (ICMA) has published a guide to conducting citizen surveys. Written by Thomas I. Miller and Michelle Miller Kobayashi in 2000, Citizen Surveys: How to Do Them, How to Use Them, What They Mean presents a complete look at the process of conducting a citizen survey and utilizing the data collected. Early in their book, Miller and Kobayashi point out that the main reason for local governments to conduct citizen surveys is to get a “…clear understanding of public preferences.” Furthermore, they point out that surveys bring the voice of the public to the decision-making table with regard to public issues.78 In other words, surveys are a vehicle to give citizens access to the decision making process and provide necessary information on issues on the agenda of local governments. Of particular importance to the research presented in this dissertation, Miller and Kobayashi use the first chapter of their book to discuss the purposes or reasons to conduct citizen surveys. Their work is particularly salient because they detail both positive and negative reasons for conducting citizen surveys. Among the positive or appropriate reasons to conduct citizen surveys, Miller and Kobayashi identify the following: needs assessments, long- and short-range strategic planning, a means to generate policy options, to evaluate the current state of the community and service delivery, and for matters of receptivity or improving the image of government.79 There are other reasons often invoked for conducting citizen surveys that Miller and Kobayashi warn are not appropriate. These reasons include: as a front to push a particular program or to raise funding,

78 Miller, Thomas I. and Michelle Miller Kobayashi. 2000. Citizen Surveys: How to Do Them, How to Use Them, What They Mean. 2nd Edition. International City/County Management Association: Washington, D.C. 3. 79 Ibid, 12-14.

37 as ‘ammunition’ to help prove a point, as a referendum or a proxy vote on a particular issue, and to collect data that is already available from another source.80 The latter part of the book deals with how to put the results and the data collected to work in terms of making decisions. Miller and Kobayashi identify four keys to making sure the results get used. First, they suggest forming a task force charged with reviewing the results and making policy recommendations based on the findings. Communication is an integral part of translating the results into meaningful decisions, and it is recommended that the survey results be transmitted to the appropriate staff and elected officials charged with policy-making. A third key to using the results relates to the desire to evaluate the current state of service delivery by developing performance measurements in an attempt to improve service delivery. Finally, Miller and Kobayashi also recommend considering the creation of focus groups to address the survey results. These focus groups would include citizens as yet another means of interjecting citizen participation into the decision-making process.81 Their work concludes with a series of case studies of local governments that conducted citizen surveys that demonstrate how the results were put to work in decision-making in those jurisdictions. Moreover, Miller and Kobayashi also discussed what ingredients go into a successful citizen survey in the article “The Voice of the Public: Why Citizen Surveys Work,” which appeared in the journal Public Management. In this article, Miller and Kobayashi present seven suggestions for how local governments can use survey results effectively. These suggestions include:

80 Ibid, 14-15. 81 Ibid, 148-152.

38 • Refer to results whenever citizens tell you they know what the community thinks.
• Bring results into discussions with elected officials about strategic planning.
• Monitor results to track the quality of service delivery. Allocate resources where they seem most needed.
• Compare results with those of similar communities to identify opportunities to benchmark service performance (a simple illustration of this kind of comparison follows this list).
• Consider holding department directors to agreed-upon targets for consumer satisfaction.
• Decide whether to press for a community policy that you’ve tested in a citizen survey.
• “Jawbone” the results in your citizen newsletter and at press conferences.82
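As a rough illustration of the peer-comparison suggestion above, the following sketch compares one community’s mean service ratings with the average of a set of hypothetical peer communities. All names and ratings are invented, and the comparison logic is a generic side-by-side check, not a procedure specified by Miller and Kobayashi.

```python
from statistics import mean

# Invented mean ratings (1-5 scale) for three services in the surveying
# community and in three hypothetical peer communities.
our_community = {"street maintenance": 3.1, "parks": 4.2, "police": 3.8}
peers = {
    "Peer A": {"street maintenance": 3.6, "parks": 4.0, "police": 3.9},
    "Peer B": {"street maintenance": 3.4, "parks": 4.3, "police": 3.5},
    "Peer C": {"street maintenance": 3.8, "parks": 3.9, "police": 4.1},
}

for service, our_score in our_community.items():
    peer_avg = mean(p[service] for p in peers.values())
    gap = our_score - peer_avg
    flag = "below peer average" if gap < 0 else "at or above peer average"
    print(f"{service}: us {our_score:.1f}, peers {peer_avg:.2f} ({flag})")
```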

In general, these tips are important to keep in mind to ensure that survey results are utilized and have the greatest opportunity to be translated into decisions. In Reaching Decisions in Public Policy and Administration: Methods and Applications, an edited volume by Richard D. Bingham and Marcus E. Ethridge, two of the chapters deal specifically with survey research and citizen participation in decision making. The first chapter, entitled “Survey Research in the Study of Administration and Policy Problems” by Ronald D. Hedlund, outlines data collection techniques related to surveys and the steps in the process of conducting meaningful, usable surveys. Hedlund makes a significant observation related to the design of survey questions. Questions must be designed to provide meaningful results and data that can be easily understood and translated into decisions, a common theme within the survey research literature. Hedlund points out that there is a flip-side to writing quality questions for a survey. Questions must be constructed in

82 Miller, Thomas I. and Michelle Miller Kobayashi. 2001. “The Voice of the Public: Why Citizen Surveys Work.” Public Management 83 (4): 9.

39 such a way as to be easily understood by the respondents. Hedlund uses the term ‘knowable’ in his description of questions.83 Respondents must have a clear understanding of what is being asked of them and how best to respond with a meaningful and usable answer. The second chapter of this work, by Michael R. Fitzgerald and Robert F. Durant, is entitled “Citizen Evaluations and Urban Management: Service Delivery in an Era of Protest.” Fitzgerald and Durant’s perspective is grounded in democratic theory, which holds that it is the right of citizens to evaluate government policies, programs, and activities.84 Citizen surveys are one such mechanism for providing citizens an opportunity to evaluate and express their opinions on the operation of government. In 1992, the Urban Institute, along with the International City/County Management Association (ICMA), published a survey resource manual entitled How Effective Are Your Community Services? Procedures for Measuring Their Quality. Harry P. Hatry, Louis H. Blair, Donald M. Fish, John M. Greiner, John R. Hall, and Philip S. Schaenman were the contributing authors to this manual. The manual identifies specific service delivery areas that are common among local governments. These areas include: solid waste collection and disposal, parks and recreation, library services, crime control and policing, fire services, transportation and mass transit (if applicable), and water supply and distribution. Sample questions in each of these areas are presented with particular attention paid to the measurements that need to be targeted to specific services. In other words, it

83 Hedlund, Ronald D. 1982. “Survey Research in the Study of Administrative and Policy Problems.” Reaching Decisions in Public Policy and Administration. Ed. Richard D. Bingham and Marcus E. Ethridge. New York, NY: Longman, Inc. 7. 84 Fitzgerald, Michael R. and Robert F. Durant. 1982. “Citizen Evaluations and Urban Management: Service Delivery in an Era of Protest.” Reaching Decisions in Public Policy and Administration. Ed. Richard D. Bingham and Marcus E. Ethridge. New York, NY: Longman, Inc. 30-31.

40 is essential to identify appropriate means of measuring a variety of services and not all measurements may be applicable to all services. The importance of measuring service delivery quality in the public sector was at an all-time high at the time of the publication of this manual (1992) due in large part to local governments embracing the theory and concepts behind the Total Quality Management (TQM)85 movement.86 Other research has attempted to determine whether administrators can accurately predict the perceptions expressed by citizens on surveys. This research was published in Public Administration Review and was written by Julia Melkers and John Clayton Thomas. Their article, entitled “What Do Administrators Think Citizens Think? Administrator Predictions as an Adjunct to Citizen Surveys,” explains how citizen survey data can be supplemented by the a priori predictions and assumptions of municipal administrators. According to Melkers and Thomas, these predictions can “…be an excellent technique for increasing administrative interest in and use of data from citizen surveys…”87 Finally, one of the most recent contributions to the literature on survey research appeared in the May/June 2005 edition of Public Administration Review. Michael J. Licari, William McLean, and Tom W. Rice studied the potential for bias, due to their opinions on other issues and their sociodemographic backgrounds, among respondents to citizen surveys. Their study compared the opinions of residents and non-residents in the evaluation of streets and parks in selected Iowa communities. The research

85 For more information on Total Quality Management (TQM), see: Deming, W. Edwards. 1986. Out of the Crisis. Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study. 86 Hatry, Harry P., Blair, Louis H, Fish, Donald M., Greiner, John M., Hall, John R. and Philip S. Schaenman. 1992. How Effective Are Your Community Services? Procedures for Measuring Their Quality. Washington, D.C.: The Urban Institute and International City/County Management Association. 87 Melkers, Julia and John Clayton Thomas. 1998. “What Do Administrators Think Citizens Think? Administrator Predictions as an Adjunct to Citizen Surveys.” Public Administration Review 58 (4): 333.

41 yielded some interesting observations: the opinions of residents and non-residents were very similar, allowing the authors to conclude that citizen surveys can indeed convey accurate opinions on the state of community services.88
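A comparison of the kind Licari, McLean, and Rice describe can be sketched by contrasting the two groups’ mean ratings. The example below uses invented ratings and computes Welch’s t statistic from its standard formula; it is only a schematic of this type of analysis, not a reproduction of the authors’ method or data.

```python
import math
import statistics

# Invented 1-5 ratings of park condition from residents and non-residents.
residents = [4, 3, 5, 4, 4, 3, 5, 4, 2, 4, 3, 5]
non_residents = [4, 4, 3, 5, 3, 4, 4, 3, 5, 4]

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    m1, m2 = statistics.mean(a), statistics.mean(b)
    v1, v2 = statistics.variance(a), statistics.variance(b)
    return (m1 - m2) / math.sqrt(v1 / len(a) + v2 / len(b))

print("resident mean =", round(statistics.mean(residents), 2))
print("non-resident mean =", round(statistics.mean(non_residents), 2))
# A |t| well below roughly 2 is consistent with the two groups rating the
# service similarly, which is the pattern the Iowa study reported.
print("Welch's t =", round(welch_t(residents, non_residents), 2))
```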

Strategic Planning Finally, this dissertation builds upon the body of literature pertaining to the use of community opinion surveys as part of strategic planning efforts. There seems to be a dearth of academic literature specifically dealing with this topic. Rather, much of the literature applicable to this dissertation within this topic tends to focus more on the broader issue of citizen participation, particularly within the field of land use planning as well as applied research that outlines how to conduct strategic planning efforts in local communities. Clearly strategic planning efforts must incorporate citizens into the process for those efforts to be effective. Literature generally pertaining to this issue will help to inform the importance of citizen participation in the form of community opinion surveys. Samuel D. Brody, David R. Godschalk, and Raymond J. Burby wrote “Mandating Citizen Participation in Plan Making: Six Strategic Planning Choices” in 2003. As local governments must be able to plan for and manage urban growth and development as one of the essential services provided, many states mandate citizen participation in the process of managing and designing comprehensive land use plans. Their research demonstrates that citizens can and do positively impact land use planning

88 Licari, Michael, McLean, William, and Tom W. Rice. 2005. “The Condition of Community Streets and Parks: A Comparison of Resident and Non-Resident Evaluations.” Public Administration Review 65 (3): 360.

42 processes and that the degree of impact can vary depending on the type of participation mechanism used by local governments. Overall, citizen participation efforts are an important component to planning efforts and the methods utilized can be effective at overcoming issues of citizen apathy and disinterest and enhance the probability that the resulting plan will be ultimately approved by local elected officials.89 Another article relating to citizen participation was written by Raymond J. Burby in 2003, entitled “Making Plans that Matter: Citizen Involvement and Government Action.” Burby asserts that comprehensive plans, in order to be successfully implemented, need to involve a wide array of stakeholders in a broad-based approach to developing the plan. According to Burby, the key to including stakeholders in the process is being able to understand that communication is a two-way street. Planners need to involve stakeholders and listen to their concerns as well as communicate clearly and directly the concerns and issues of planning.90 Moreover, the Healthy Cities movement has had an impact on matters of strategic planning and connecting citizens with their local government as well as other citizens.91 Frank Benest explores this issue in the article entitled “Reconnecting Citizens with Citizens: What is the Role of Local Government?” Benest advocates that community leaders must help citizens to solve their problems together and foster the sense of togetherness that underpins the Healthy Cities movement.92 Various strategies are presented,

89 Brody, Samuel D., Godschalk, David R., and Raymond J. Burby. 2003. “Mandating Citizen Participation in Plan Making: Six Strategic Planning Choices.” APA Journal 20 (2): 260-261. 90 Burby, Raymond J. 2003. “Making Plans That Matter: Citizen Involvement and Government Action.” APA Journal 69 (1): 33. 91 For more information on the Healthy Cities movement, see: Clark, Doug. 1998. “Healthy Cities: A Model for Community Improvement.” Public Management 80 (11): 4-8. 92 Benest, Frank J. 1999. “Reconnecting Citizens with Citizens: What is the Role of Local Government?” Public Management 81 (2): 11.

43 but the main thrust of Benest’s discussion is incorporating citizens into the process of bettering the community and problem-solving. While the theme of citizen participation is important to the literature on the subject of strategic planning, it is also imperative to understand the process of strategic planning and how community opinion surveys fit into that process. Strategic planning is a technique that can be used to “…chart a basic direction for an organization in light of its mission, mandates, internal and external environments, and key stakeholder interests.”93 John M. Bryson, along with William D. Roering, contributed a chapter on this topic to the second edition of the Handbook of Public Administration edited by James L. Perry in 1996. Bryson and Roering present eight different approaches to strategic planning and detail their various strengths, weaknesses, and applicability to public sector organizations as a starting point for getting acclimated to the process of strategic planning.94 Furthermore, it is generally accepted that a strategic planning process is appropriate in order to establish a vision for the future of the community, define a future direction, build consensus, and set priorities. Oftentimes a strategic planning process is employed to break the cycle of crisis management, respond to a change in organizational leadership, respond to societal forces, respond to external factors, and manage growth and development. The process needs to include a variety of stakeholders including, at a minimum, elected officials, staff, and citizens. Beyond that, civic organizations, local school officials, and local business leaders should also be invited to participate.

93 Bryson, John M. and William D. Roering. 1996. “Strategic Planning Options for the Public Sector.” Handbook of Public Administration. Ed. James L. Perry. San Francisco, CA: Jossey-Bass Publishers. 479. 94 Ibid. 480-495.

44 Once this groundwork has been laid, the traditional strategic planning process begins with the establishment of a mission statement if one is not already in place. The next step is to conduct an inventory and an environmental scan of the current state of affairs of the community. The inventory should include an attempt to catalog the tools and resources at the disposal of the community as well as to gauge current service delivery levels. It is at this point that community opinion surveys would be utilized to collect baseline data on the quality of service delivery as well as a measure of citizen attitudes and opinions on growth and development and to get a sense of what makes up the public’s perception of an appropriate vision for the future. The environmental scan requires the community to conduct a situational analysis that would include an attempt to identify the strengths, weaknesses, opportunities, and threats facing the local government. Once the necessary data has been collected and analyzed, the next step is to use that information to formulate both vision (what you hope to achieve in the future) and mission (a statement of purpose) statements. Once these statements have been designed and agreed upon, the next step is to set goals and prioritize them. It is important to differentiate between goals (long-term desired outcomes) and objectives (measurements of goal achievement) and establish both. Once goals and objectives have been identified, a strategic action plan can be developed. Creating a strategic action plan requires an evaluation of the goals and objectives in order to sort them out and prioritize them. At this stage, it is important to build consensus on what those priorities should be. Once consensus is reached, a plan for implementation must be created to begin to determine the details of how these goals and objectives will be met. Finally, a method of evaluation should be put in

45 place to guarantee accountability and to measure goal achievement into the future.95 Much of the aforementioned information for conducting a strategic plan comes from the work of John M. Bryson and his book entitled Creating and Implementing Your Strategic Plan: A Workbook for Public and Non-Profit Organizations. Bryson outlines a step-by-step process for strategic planning, including checklists of necessary tasks and worksheets to help gather information to be used in the process. It bears mentioning that Bryson discusses the importance of identifying and framing strategic issues facing the community. Once these issues have been identified, Bryson suggests sorting or prioritizing the issues based upon three factors (or categories): issues that require immediate action, issues that will require action in the near future, and issues that may or may not require action but must be monitored.96 This will help to focus on specific issues and design appropriate strategies to deal with those issues. Moreover, Bryson’s most recent contribution to the literature on strategic planning came in 2004 with the publication of the third edition of Strategic Planning for Public and Non-Profit Organizations: A Guide to Strengthening and Sustaining Organizational Achievement. Bryson presents his preferred method for strategic planning, called the Strategy Change Cycle, which is not only a process for strategic planning, but is also designed to provide the information needed for strategic management. Bryson’s method consists of a ten-step process that links planning with implementation.97 A

95 Center for Public Management and Regional Affairs – Miami University. 2002. Ohio Municipal League – Mayors’ Association of Ohio, Leadership Training Academy Module V – Goal Setting and Team Building. Oxford, OH: Center for Public Management and Regional Affairs. 96 Bryson, John M. 1996. Creating and Implementing Your Strategic Plan: A Workbook for Public and Non-Profit Organizations. San Francisco, CA: Jossey-Bass Publishers. 63. 97 Bryson, John M. 2004. Strategic Planning for Public and Nonprofit Organizations: A Guide to Strengthening and Sustaining Organizational Achievement. San Francisco, CA: Jossey-Bass Publishers. 31.

46 strategic planning process is only as good as the resulting blueprint for implementing the strategies that are identified. The Strategy Change Cycle emphasizes the role of management charged with carrying out the strategic plan and seeing it come to fruition. Yet another applied research resource on how to conduct strategic planning comes from the Practical Management Series published by the ICMA. Gerald L. Gordon wrote Strategic Planning for Local Government in 1993. Gordon presents a process similar to what Bryson offers for conducting strategic planning, but specifically targeted to local governments. Gordon’s second chapter presents an analysis of the strategic plan and the role of stakeholders in the process. Once stakeholders are identified, a plan must be put into place that outlines the level of involvement that the stakeholders will have in the process. According to Gordon, the level of involvement will vary depending on the local government and their preferences. However, Gordon seems to recommend a grass-roots approach that relies heavily on citizen participation and involvement. This type of approach can be very effective because it can create greater legitimacy among the citizens for the goals, objectives, and strategies that ultimately result from the process.98 Building Communities from the Inside Out: A Path Toward Finding and Mobilizing a Community’s Assets, by John P. Kretzmann and John L. McKnight, contains some valuable how-to information particularly when it comes to the process used for asset-based community development. Kretzmann and McKnight outline a five step process that can be used to

98 Gordon, Gerald L. 1993. Strategic Planning for Local Government. Washington, D.C.: International City/County Management Association. 18-19.

47 mobilize an entire community to participate in asset-based community development. These steps are:
• Map the capacities and assets of individuals, citizens’ associations, and local institutions;
• Build relationships among local assets for mutually beneficial problem-solving;
• Mobilize the community’s assets fully for economic development and information-sharing purposes;
• Convene a broadly representative group to build a community vision and plan; and
• Leverage activities, investments, and resources from outside the community to support asset-based, locally defined development.99

It is the fourth step in the process that bears the most relevance to this dissertation and deserves some attention here. According to Kretzmann and McKnight, it is important to develop and define a local vision for the community’s future and then to design strategies to implement that vision. Several questions need to be asked and answered in the process, including: Who are we in this community? What do we value most? Where would we like our community to go in the next five, ten, twenty years?100 These questions, or other questions designed to elicit related information, need to be asked of the community, and one way of collecting this information is through a community opinion survey. Finally, for some examples of strategic planning processes in action, the book Community Visioning/Strategic Planning Programs: State of the Art provides an examination of actual programs implemented in ten different states. The examination of these programs is important, according to the

99 Kretzmann, John P. and John L. McKnight. 1993. Building Communities From the Inside Out: A Path Toward Finding and Mobilizing a Community’s Assets. Evanston, IL: The Asset-Based Community Development Institute, Institute for Policy Research, Northwestern University. 345. 100 Ibid. 351.

48 authors, because of the realization that localities must be accountable for their future and that strategic planning is an essential component of that accountability. The programs are examined individually in order to provide an understanding of their format and design, which vary by locality. Noting those variations is important to understanding the relative success and failure of the various programs.
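The planning elements walked through in this section, an environmental scan with its strengths-weaknesses-opportunities-threats analysis, vision and mission statements, goals, measurable objectives, and a prioritized action plan, map naturally onto a simple data structure. The sketch below is a minimal, hypothetical Python representation of those elements; the field names and the example plan are illustrative assumptions that follow the general outline described above rather than any single author’s template. The priority codes mirror Bryson’s three categories of strategic issues (immediate action, near-term action, monitor).

```python
from dataclasses import dataclass, field

@dataclass
class Objective:
    description: str        # how goal achievement will be measured
    target_year: int

@dataclass
class Goal:
    description: str        # long-term desired outcome
    priority: int           # 1 = act now, 2 = near future, 3 = monitor
    objectives: list[Objective] = field(default_factory=list)

@dataclass
class StrategicPlan:
    vision: str                     # what the community hopes to achieve
    mission: str                    # statement of purpose
    swot: dict[str, list[str]]      # strengths, weaknesses, opportunities, threats
    goals: list[Goal] = field(default_factory=list)

    def action_plan(self):
        """Return goals sorted by priority, the basis for an implementation plan."""
        return sorted(self.goals, key=lambda g: g.priority)

# Hypothetical example drawn from a survey-informed planning exercise.
plan = StrategicPlan(
    vision="A safe, well-served community that manages growth deliberately",
    mission="Deliver core services efficiently while planning for future needs",
    swot={"strengths": ["engaged citizens"], "weaknesses": ["aging streets"],
          "opportunities": ["regional partnerships"], "threats": ["flat revenues"]},
    goals=[Goal("Improve street maintenance", 1,
                [Objective("Raise mean citizen rating from 3.1 to 3.8", 2008)]),
           Goal("Expand park programming", 2)],
)
for goal in plan.action_plan():
    print(goal.priority, goal.description)
```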

Research Question and Hypotheses The new imperative facing local governments that is forcing them to embrace a long-range strategic implementation plan and to become more future oriented in their thinking is pivotal to the research question this dissertation attempts to answer. As more and more local governments conduct community opinion surveys as a means to collect the necessary information to create a strategic plan and inform decision-making, it is natural to consider the relative success of local governments and their ability to convert the data collected into actual policy decisions. As with any endeavor, local governments will have varying amounts of success or failure. Thus, the research question at hand asks: What factor or factors can be identified that determine how the results of a community opinion survey may or may not be used by local governments? In order to answer this question, the factors identified will be used to create a typology that can be used to broadly categorize how community opinion surveys and their resulting findings are ultimately used by the local government conducting the survey. This research question implies that not all community opinion surveys ultimately lead to the creation of a long-range strategic implementation plan.

49 But what factors account for this variation? This research offers two hypotheses in an attempt to shed light on this subject. The first hypothesis is: “The primary reason for conducting a community opinion survey as part of a strategic planning process is the main determinant in how survey results are ultimately used.” This hypothesis will be tested by ascertaining the primary reason for conducting a community opinion survey through a personal interview question pertaining to the case illustrations. The interviews and case illustrations are explained in greater detail in the next section of this chapter. A second hypothesis relates to the types of questions that appear on a community opinion survey. Therefore, the second hypothesis is: “Community opinion surveys that ask questions designed to elicit opinions pertaining to long range growth and development will lead to the survey results being used to create a long-range strategic implementation plan.” This hypothesis will be tested by reviewing the types of questions that appear on the community opinion surveys that will be presented as case illustrations.
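To make the hypotheses concrete, the sketch below shows one hypothetical way the typology could be operationalized in code: a survey is assigned to a type based on the primary reason it was conducted, whether it asked long-range growth and development questions, and whether its results led to decisions. The four category labels come from this research, but the decision rules, field names, and example case are illustrative assumptions only; the actual categorization of the case illustrations in Chapter Four rests on the interview data, not on rules like these.

```python
from dataclasses import dataclass

@dataclass
class SurveyCase:
    community: str
    primary_reason: str            # e.g. "needs assessment", "strategic planning",
                                   # "policy feedback", "image of government"
    asks_long_range_questions: bool
    results_led_to_decisions: bool

def classify(case: SurveyCase) -> str:
    """Assign a survey to one of the four types (illustrative rules only)."""
    if case.primary_reason == "strategic planning" and case.asks_long_range_questions:
        return "strategic"
    if case.primary_reason == "policy feedback" and case.results_led_to_decisions:
        return "decisionistic"
    if case.primary_reason == "image of government" and not case.results_led_to_decisions:
        return "symbolic"
    return "informational"

# A purely hypothetical case, not one of the four case illustrations.
example = SurveyCase("Example Village", "strategic planning", True, True)
print(classify(example))   # -> strategic
```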

Methodology In order to develop a typology of how community opinion surveys are utilized by local governments, a series of case illustrations of local governments who conducted these types of surveys with the assistance of the Center for Public Management and Regional Affairs (CPMRA) at Miami University will be presented. The Center for Public Management and Regional Affairs engages in applied research, technical assistance services, training and education, and data base development in the areas of public management and capacity building, local government economic

50 development and planning, and public program evaluation and policy research. Furthermore, the Center’s activities are funded by external grants and contracts from a number of funding sources. The Center’s primary efforts are directed toward research and assistance to small/non-metropolitan cities, villages, townships, and counties. Projects undertaken by the Center are initiated upon request from local governments or by Center staff as an ongoing program for various local governments in the region and statewide.101 Specifically, four case illustrations will be presented. These four case illustrations are community opinion surveys that were conducted with the assistance of the Center for Public Management and Regional Affairs during a five year period (1998 to 2003). The case studies and the year in which the surveys were conducted are: Village of Williamsburg, Ohio (1998), Oxford Township (Butler County), Ohio (2000), Village of Coldwater, Ohio (2002), and Hanover Township (Butler County), Ohio (2003). The author of this dissertation was the staff member directly responsible for three of the four case illustrations presented. Data concerning these four case illustrations was collected through a series of personal interviews. In terms of who was selected to be interviewed for this research, elected/appointed officials from those jurisdictions who are still in office or serving in their appointed position were selected to be personally interviewed. Four individuals were interviewed: Timothy Derickson, Trustee, Hanover Township (Butler County), Ohio; Mayor Mary Ann Lefker, Village of Williamsburg, Ohio; George Simonds, Trustee, Oxford Township (Butler County), Ohio; and Eric Thomas, Administrator,

101 CPMRA Internet Site: About the Center. Center for Public Management and Regional Affairs – Miami University. 29 July 2004 .

51 Village of Coldwater, Ohio. Both Derickson and Simonds were elected Township Trustees at the time their community opinion surveys were administered and have served in their positions continuously since that time. Mayor Lefker, at the time the community opinion survey was administered in Williamsburg, Ohio, was serving as an elected member of Village Council and as Vice-Mayor. She has continued to serve in elected office and has since been elected Mayor, her current position. Finally, Eric Thomas was appointed Administrator of the Village of Coldwater in 2001 during the community opinion survey project. He became Administrator while the survey results were being tabulated and was the official who was presented with the results of the survey. Thomas has remained in the position of Administrator since that time. The interview protocol was designed as a follow-up to the community opinion surveys to collect data on decision-making, specifically how the survey results were used to inform decision-making in these jurisdictions in the time that has passed since the surveys were conducted. Additionally, data was collected on decisions made prior to the survey for which survey questions had been designed to provide evaluative information. Finally, data was collected to identify policy areas in which the local government was in the process of designing a policy strategy at the time of the survey and how the survey results helped to inform those strategies. The interviews were conducted on-site in the jurisdictions that the elected/appointed officials represent and were not tape-recorded. Because the interviewees were specifically selected, respondents are not anonymous and the data collected is not confidential as it pertains to matters that are of public record in the jurisdictions that the

52 elected/appointed officials represent. Appendix A contains the interview questions and protocol.

Chapter Summaries Chapter Two will discuss the benefits and utility of community opinion surveys and examine each of those various utilities, including their use as a benchmarking/service delivery assessment, as a citizen participation mechanism, as a communication tool (providing information and education), and as a strategic planning tool for the future. Furthermore, this chapter will outline the survey typology (and the characteristics of each type) developed from this research, which will be used to categorize the case illustrations presented in the next chapter. Chapter Three will examine the content and results of each community opinion survey presented as a case illustration. A summary of the survey design, methodology, and results will be offered for each community opinion survey serving as a case illustration. Chapter Four will present the data collected from the interviews conducted for each of the four case illustrations. The data collected from the interviews will then be analyzed and measured against the survey typology presented in Chapter Two. Each of the four case illustrations will then be categorized into one (or more) of the survey types. Chapter Five will assess the research question, and conclusions will be drawn based upon the findings presented here. A brief discussion of future research that would build upon this work will also be presented.

53 2 A TYPOLOGY OF COMMUNITY OPINION SURVEYS This chapter will analyze the benefits and utility of conducting community opinion surveys. The literature on survey research identifies those various utilities as: a benchmarking/service delivery assessment, a citizen participation mechanism, a communication tool (providing information and education), and a strategic planning tool for the future. Based on the reasons for wanting to conduct a community opinion survey, a survey typology (and the characteristics of each type) will be presented. This typology will be used to categorize the case illustrations presented in the next chapter. As previously mentioned, there seems to be general consensus within the literature on survey research as to what constitutes appropriate purposes or reasons to conduct citizen surveys. Miller and Kobayashi’s work on citizen surveys is particularly salient for this very reason. They detail both positive and negative reasons for conducting citizen surveys.102 Each of these reasons will be examined in detail, as they are a primary determinant for the first hypothesis of this research. The first such reason to conduct citizen surveys, according to Miller and Kobayashi, is for a community needs assessment. Generally, these assessments relate specifically to social services such as child care, mental health needs, job training programs, housing rehabilitation, health care, victim assistance, or transit needs.103 A survey conducted for this purpose would be designed to target the changing needs of a community. Are there social service programs that are under-utilized, over-utilized, or not offered? What new programs might need to be implemented? A survey conducted as

102 Miller and Kobayashi, Citizen Surveys: How to Do Them, How to Use Them, What They Mean. 12-14. 103 Ibid, 12.

54 a needs assessment survey should be able to answer these important questions. Ultimately, the most important data collected by a community needs assessment survey will be demographics. In order to properly understand the needs of the community for these social services, it is imperative to understand the characteristics of the community. These characteristics would include: income levels (poverty), age, employment/unemployment rates, and ethnic and racial diversity.104 This data would be critical to being able to design strategies and programs to meet the social services needs of a community. Another reason to conduct a community opinion survey is to collect data to enable long-range and short-term planning efforts. Miller and Kobayashi lump these planning efforts together, but they are sufficiently different to be considered individually. Miller and Kobayashi treat this reason as important to program planning by involving a wider and more representative group of citizens in the process.105 Specifically, including citizens in this process can help to shape the implementation of programs by eliminating policy options that are perceived negatively. Policy options that are positively received can then be pursued. This short-term planning function seems to be decisionistic in nature as the data collected by a community opinion survey would be utilized to make specific and immediate decisions about the course of programs and their implementation. Long-range planning efforts can also be enhanced by conducting a community opinion survey. These efforts would be aided by involving a wider and more representative group of citizens in the process just as they

104 Ibid, 12. 105 Ibid, 12.

55 were utilized in short-term planning.106 However, it seems that if a community were conducting a survey for long-range planning efforts, the resulting decisions and courses of action adopted may not be made immediately. Rather, the issues being considered and any resulting decisions would be those that have relevance over a longer period of time with a futuristic orientation. This long-term planning function seems to be strategic in nature as the data collected by a community opinion survey would be utilized to plan for the future needs of a community. Another reason for conducting a community opinion survey would be to receive feedback from citizens on specific policy options. According to Miller and Kobayashi, public officials sometimes shy away from collecting this type of data particularly if survey questions resemble a referendum on a specific policy option because of the pressure to implement the policy. To offset this concern, it is recommended that the survey contain a clear statement that the results will provide guidance for public officials in the decision-making process, but there is no guarantee that public officials will take a particular action.107 Questions that appear on the survey will be targeted to specific issues and specific policy options being considered to deal with those issues. In this sense, these survey questions are being used to give public officials some sense of the public’s feelings and opinions on issues that are currently on the community’s agenda. These issues are those that are being discussed or deliberated, but on which no formal action has been taken. Yet another reason to conduct a community opinion survey is to evaluate the current state of the community’s service delivery efforts.

106 Ibid, 12. 107 Ibid, 13.

56 Survey questions would be designed to collect citizens’ opinions on existing services so as to give public officials some insight as to what services need to be improved, what services need a greater or lesser commitment of resources, and what services citizens consider the most essential.108 This evaluative function would provide service delivery benchmarks for the community. Even if the data collected by the survey does not result in making changes to services, having an assessment of current service delivery levels can still be useful. In this situation, however, the community will want to have a series of benchmarking initiatives in order to measure how citizens’ opinions of a community’s services may have changed over time. Thus, if benchmarking is the reason for conducting a community opinion survey, a commitment may be needed to conduct a series of these surveys over time. Yet another reason for conducting a community opinion survey would be for matters of receptivity or improving the image of government. Miller and Kobayashi stress that surveys can have the impact or effect of demonstrating the community’s interest in the opinions of its citizens. In essence, the survey would be showing that public officials are receptive to learning the opinions of their citizens and not afraid to ask for those opinions. Furthermore, a survey conducted for this purpose can have the added impact of enhancing the image of government in the eyes of the citizens. For this to occur, however, the survey results will have to be taken seriously. Even if no formal decisions are taken based on the results, it is recommended that the results be studied and heralded by the community if only by releasing the results to the public.109
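Where benchmarking is the motive, as discussed above, the payoff comes from repeating the survey and tracking how ratings move between waves. The short sketch below does this for invented ratings from two hypothetical survey years, reporting the change in mean rating for each service; it is an illustration of the kind of comparison a repeated benchmarking effort supports, not a prescribed method.

```python
from statistics import mean

# Invented 1-5 ratings for the same services in two survey waves.
wave_2000 = {"snow removal": [3, 4, 3, 2, 4, 3], "parks": [4, 4, 5, 3, 4, 4]}
wave_2003 = {"snow removal": [4, 4, 3, 4, 5, 4], "parks": [4, 3, 4, 4, 4, 3]}

for service in wave_2000:
    before, after = mean(wave_2000[service]), mean(wave_2003[service])
    change = after - before
    print(f"{service}: {before:.2f} -> {after:.2f} ({change:+.2f})")
```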

108 Ibid, 13. 109 Ibid, 13-14.

57 A final reason for conducting a community opinion survey, but one that is not discussed by Miller and Kobayashi, would be a change in elected or administrative leadership in the community. When there is turnover among elected officials or a change in administrative leadership, a community opinion survey can be an important learning tool for those new leaders. At the most basic level, the results of a community opinion survey can provide a snapshot of the community (demographics, opinions, and attitudes). Those results can provide a baseline of data to help the new leaders better understand their community. Moreover, the results of a community opinion survey can also provide new leaders with a better sense of what issues are most important to the residents. This gives the new leaders some data to inform the issue areas that they must focus on in the early part of their tenure in their new position. It bears noting that there are other reasons often invoked for conducting community opinion surveys that are not appropriate according to Miller and Kobayashi. These reasons are worth identifying, although they will not be included in the typology that will be presented later in this chapter as none of the case illustrations involved a survey conducted for inappropriate reasons. The first of these reasons is that a community opinion survey is conducted in order to push a particular program or for fundraising purposes. Often times, a survey will contain a handful of questions about current issues with the last part of the survey asking that the survey be returned along with a certain dollar amount as a donation to a particular cause. Often times these types of surveys are promulgated by state and federal representatives looking to fund their campaigns.110

110 Ibid, 14.

58 Other surveys are conducted for the purpose of raising the citizenry’s consciousness on a particular issue. These surveys tend to err on the side of providing citizens with information on particular issues rather than collecting opinions on those same issues. Both fundraising and consciousness raising surveys can have negative effects on citizens as they are likely to become wary of future surveys that are being conducted with more appropriate reasons behind them.111 Yet another inappropriate reason to conduct a community opinion survey is that the results will be used as ‘political ammunition’ to help prove a point. Often times this reason can apply to a desire to prove that citizens are generally pleased with their elected officials or with how their government operates. More likely, however, a survey is used to stockpile ammunition to prove that a particular policy option was the right thing to do and that there is public approval and support for that decision. This type of survey often occurs during or after a particularly rancorous debate on a specific issue. The survey results are then used as evidence of the public’s support or dissension on the issue.112 Another inappropriate reason for conducting a community opinion survey is that the survey is being touted as a referendum or a proxy vote on a particular issue. There are a host of political and methodological problems with using survey results as the sole criteria for making a particular decision. First, this type of promise undermines the legitimate statutory authority vested in most government legislatures. Secondly, the methodology of the survey may not incorporate all citizens of a particular community.113

111 Ibid, 14. 112 Ibid, 14-15. 113 Ibid, 15.

59 Regardless, the survey should make no promise that the results will be the sole determinant of any decisions that will be made by the community. Finally, community opinion surveys should not be conducted to collect data that is already available from another source. While this warning seems fairly obvious, it should be noted that community opinion surveys often ask questions for which data has already been collected from another source or exists in a database elsewhere. The time, resources, and effort that must be invested into a community opinion survey will be wasted if the survey simply asks for data that is already available.114 This would be yet another inappropriate reason for conducting a community opinion survey.

Community Survey Typology The appropriate reasons for conducting a community opinion survey serve as the foundation for the typology that is being presented here. The typology that is being proposed consists of four categories in which all community opinion surveys that are conducted for appropriate reasons can be assigned. The four categories being proposed are: Informational, Strategic, Decisionistic, and Symbolic. Each of these four categories will now be explained in detail. The first category in this typology is titled “Informational Surveys.” This category encompasses community opinion surveys that are being conducted primarily for the purpose of either collecting or disseminating information. The survey results will be used to enhance public officials’ ability to self-govern. This category is predicated on the assumption that community opinion surveys can be used as a tool for communities to communicate with their residents. It also recognizes that communication is a

114 Ibid, 15.

60 two-way endeavor and the survey affords residents with the opportunity to communicate back to their public officials. Informational Surveys are designed to collect information concerning residents’ opinions on any number of issues within the community. For example, an Informational Survey might provide an assessment of the current state of service delivery, otherwise known as a benchmarking initiative. The survey results are used to measure residents’ satisfaction levels with a variety of services provided by the local government. Often, Informational Surveys serve as one measurement of residents’ opinions in a series of surveys designed to collect this type of information. From the standpoint of disseminating information, an Informational Survey can also serve to inform or educate residents. The survey can be used to educate residents about news, events, plans, or activities going on in the community. Usually this education is targeted at issues and subsequent decisions that have recently been made. The survey, then, provides a vehicle for communicating these decisions to residents. The survey results from an Informational Survey are primarily used to give public officials a sense of how well their government is operating, particularly officials who are new to their position. In a purely Informational Survey, it is expected that no decisions would come about due to the results of the survey. Because no decisions would be made as a result of the survey, Informational Surveys have no impact on the public policy agenda of the community. The benchmarking initiative would be a snapshot of residents’ opinions at the time of the survey with the intent of collecting that same data again in the relatively near future. The primary reason for conducting the community opinion survey would be fulfilled by collecting the benchmarking data and/or communicating specific issues to residents.

61 Table 1: Informational Surveys

Primary Reason(s) for Conducting Survey: To Collect or Disseminate Information

Type of Information Collected: Benchmarking Data

Decisions Made Based Upon Survey Results: None

Implication for Policy Agenda: None

The second category in this typology is titled “Strategic Surveys.” This category encompasses community opinion surveys that are focused on collecting data that will be used for long range planning efforts. These are decisions that the community may be facing well into the future. These issues are on the horizon and may need to be dealt with at some point, but do not require immediate action. These issues would be on the community’s discussion agenda. Strategic Surveys are also conducted as part of a larger formalized strategic planning process. These community opinion surveys are part of larger more involved process that usually involves the classic strategic planning model. Therefore, the community opinion survey is just one part of a larger endeavor which generally includes a visioning process, establishing a mission statement, conducting an assessment of strengths, weaknesses, opportunities and threats, and developing an implementation plan.

62 Strategic Surveys are designed to collect information concerning residents’ opinions on the long-term future of the community. Usually these surveys focus on residents’ preferences for the future growth and development of their community. These preferences can take into account residential, commercial, and industrial growth and any related issues that might be brought about by these various types of development. The decisions that typically result from Strategic Surveys are not realized immediately. Rather than seeing discrete decisions being made based on the survey results, it is more likely that the survey results will help to frame or design strategies and plans. These strategies and plans should be designed to achieve an alternate state of affairs based upon the preferences expressed in the survey results. These strategies will have longer timelines for action as it may take years to realize that alternate state of affairs. The primary reason for conducting the community opinion survey would be fulfilled by collecting data and information that is used for long-range planning efforts and/or as part of a formalized strategic planning process.

63 Table 2: Strategic Surveys

Primary Reason(s) for Conducting Survey: Long-Range Planning; Formal Strategic Planning Process

Type of Information Collected: Data Concerning Preferences for the Future of the Community

Decisions Made Based Upon Survey Results: Strategies or Plans to Meet Those Future Preferences

Implication for Policy Agenda: Issues Placed on the Discussion Agenda

The third category in this typology is titled “Decisionistic Surveys.” This category encompasses community opinion surveys that are focused on collecting data that will be used for making short and medium term decision making efforts. These are decisions that the community is currently facing and may be in the process of discussing and deliberating the various courses of action that might be taken. The survey would be designed to collect information that would help to guide that deliberation process. These are issues that would be considered to be on the community’s action agenda. There may be other issue areas in which the community has no plans for action or making a policy decision. However, the results of the survey may bring to light a problem, issue, or area in which the community decides that it needs to take action. The survey findings would serve as a driving

64 force to place issues on a community’s action agenda – issues that may not have been anticipated. Furthermore, community opinion surveys can be utilized to evaluate issues that have already passed through the policy process; the survey results are used to evaluate residents’ opinions about those past decisions. A distinction must be made here about survey questions that inform residents about past decisions as opposed to survey questions that ask residents to evaluate past decisions. Informational questions fit into the Informational Survey category of the typology if they only ask residents if they are aware or knowledgeable about a particular policy decision. If the survey question takes the issue a step further and asks for an evaluation or a rating of that decision, then those evaluative questions fit into the Decisionistic Survey category of the typology. This distinction is important because the results of evaluative questions may cause the issue to come back onto the action agenda – particularly if the results warrant policy termination or re-design. In those situations, the survey results may force a change in policy and a new decision to be made.115 Also, a community opinion survey can be used to set the action agenda in situations in which the community has undergone a recent change in elected or administrative leadership. A Decisionistic Survey, in this instance, would be conducted in order to help determine or guide the issue areas that those in new leadership positions should be focusing their efforts on. In order for a survey conducted by new leaders in a community to be Decisionistic, those new leaders will have to use the survey results to take action in areas identified by the survey results.

115 This evaluative aspect of Decisionistic Surveys is based upon the Decisionistic Model of policy evaluation presented in: Floden, Robert E. and Stephen S. Weiner. 1978. “Rationality to Ritual: The Multiple Roles of Evaluation in Governmental Processes.” Policy Sciences 9 (1): 9-18.

65 The decisions that typically result from Decisionistic Surveys are generally seen to occur relatively soon after the results of the survey are tabulated. These may be decisions in areas that were currently being deliberated, decisions in areas that are wholly new or unexpected, or decisions that arise from the evaluation of past policies and decisions. The primary reason for conducting the community opinion survey would be fulfilled by collecting data and information that is used to make more specific and immediate decisions in issue areas of relevance to the community. Table 3: Decisionistic Surveys

Primary Reason(s) for Conducting Survey: Short- and Medium-Term Decision-Making; Change in Elected/Administrative Leadership

Type of Information Collected: Data Targeted at Specific Issues; Policy Evaluation

Decisions Made Based Upon Survey Results: Decisions on Policies Being Deliberated; Decisions to Terminate or Re-Design Existing Policies; Decisions on Issues in New Policy Areas

Implication for Policy Agenda: Issues Placed on the Action Agenda

66

The final category in this typology is titled "Symbolic Surveys." This category encompasses community opinion surveys that are conducted for issues related to receptivity and enhancing the image of the community. Surveys in this category are conducted for the purpose of demonstrating the community's interest in the opinions of their citizens. The community gets credit for seeking the opinions of their residents regardless of whether any real outcomes or decisions result from the survey. This category of surveys aligns closely with the model of policy evaluation that is termed ritual. In ritualistic policy evaluation, the act of evaluating policy (or in this research using a community opinion survey to determine public opinion on certain issues and services) is done to calm the citizenry and project an image of governmental rationality and accountability.116 It can also serve to increase the citizenry's confidence in their government. A Symbolic Survey can also occur in situations in which public officials get caught up in the new imperative to embrace strategic planning, but for whatever reason, lose interest or do not see the process through to its logical conclusion. In these situations, public officials may commission a community opinion survey, but the results languish with no decisions or long-range planning resulting from it. These surveys would be classified as Symbolic because they are conducted by public officials who feel pressured by the mindset that a survey is what they ought to be doing, but who do not have the interest or commitment to see the results implemented. There are no decisions that typically result from Symbolic Surveys. In a purely Symbolic Survey, there would be no impact on the public policy

116 Floden and Weiner. “Rationality to Ritual: The Multiple Roles of Evaluation in Governmental Processes,” 16-17.

67 agenda of the community. Whether a survey is conducted as a ritualistic endeavor or because of the pressure felt by public officials to do what other communities are doing, the bottom line is that no actions or decisions are taken as a result of the community opinion survey. The primary reason for conducting the community opinion survey would be fulfilled by going through the process of collecting public opinion and ending the process there. Table 4: Symbolic Surveys

Primary Reason(s) for Conducting Survey: Receptivity; Ritual; Pressure to Fulfill Imperative

Type of Information Collected: Public Opinion and Attitudes

Decisions Made Based Upon Survey Results: None

Implication for Policy Agenda: None
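
Taken together, Tables 1 through 4 amount to a simple decision procedure: a survey's category follows from why it was commissioned and from what, if anything, was done with its results. Purely as an illustrative aid, and not as part of the typology itself, the short sketch below shows one way that logic might be encoded; the attribute names and the simplified decision rules are assumptions of this sketch rather than anything specified in the surrounding chapters.

from dataclasses import dataclass

@dataclass
class SurveyProfile:
    # Why the survey was commissioned: "information", "planning",
    # "decision_making", or "receptivity" (labels are illustrative only).
    primary_reason: str
    # Did any decision, strategy, or plan actually follow from the results?
    results_acted_upon: bool
    # Were the resulting actions oriented toward the long-term future?
    long_range: bool

def classify(survey: SurveyProfile) -> str:
    """Assign a community opinion survey to one of the four typology categories."""
    if not survey.results_acted_upon:
        # Informational and Symbolic Surveys both end without decisions being made;
        # they differ in the reason the survey was commissioned.
        return "Informational" if survey.primary_reason == "information" else "Symbolic"
    # Surveys whose results are acted upon divide by the time horizon of the action:
    # long-range plans and strategies are Strategic; nearer-term decisions are Decisionistic.
    return "Strategic" if survey.long_range else "Decisionistic"

# Example: a benchmarking survey commissioned to collect information, with no
# decisions made from the results, falls into the Informational category.
print(classify(SurveyProfile("information", results_acted_upon=False, long_range=False)))

The point of the sketch is only that the categories are mutually exclusive once the reason for the survey and the disposition of its results are known; the case illustrations in the next chapter supply that information narratively rather than formally.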

68 3 CASE ILLUSTRATIONS

This chapter will examine the content and results of each of the four community opinion surveys presented as case illustrations. In order to summarize each of the four case illustrations, the following information for each community opinion survey will be provided: survey design, survey methodology, and a summary (highlights) of results. The survey results presented here are not meant to include data from every question that appeared on the survey. Rather, a broad overview of the major findings from each survey will be discussed. The case illustrations are presented in chronological order based on the year in which the community opinion survey was conducted in each jurisdiction.

Village of Williamsburg, Ohio – 1998 The Village of Williamsburg, Ohio undertook a community opinion survey project, with the assistance of the Center for Public Management and Regional Affairs, in 1998. A project team, consisting of Dr. Philip A. Russo, Jr. (Director of the Center for Public Management and Regional Affairs), Dr. Susan Ann Kay (Professor of Political Science at Miami University), one Undergraduate Research Assistant, three Graduate Research Associates, and Project Manager Andrew Dudas serving as the lead staff person assigned to the project, was assembled to carry out this project. The Village established a committee to oversee the project which consisted of Mayor Dennis Spencer, Vice-Mayor Mary Ann Lefker, Village Administrator Tom Ryther, and Village resident Jane Croswell. The survey

69 was administered in July 1998 and the results were presented to Village officials in October 1998.117 The survey instrument was designed by the project team in consultation with the Village of Williamsburg survey committee. The actual survey questions were designed to collect data and responses in a variety of formats, including forced choice, ranking, rating intensity questions, and open ended questions that allowed the respondent to describe in their own words their answer(s) to certain questions.118 The survey contained six sections of questions which included and appeared in the following order: General Information, Village Streets, Roads, and Signs, Zoning, Land Use, and Code Enforcement, Parks and Recreation, General Village Government, and Additional Comments. No questions appeared in the Additional Comments section. The last page of the survey was left blank for respondents to provide, in their own words, any further comments.119 The first set of questions on the survey was generally categorized as General Information. This section asked respondents for information pertaining to life in the Village, as well as demographic information about the respondent and their household. Residents were asked how long they have lived in the Village, their overall level of satisfaction with living in the Village, and how life in Williamsburg has changed since the resident arrived in the Village. Furthermore, residents were asked to identify the three qualities that they liked the most and the three qualities that they disliked the most about the Village. Questions aimed at collecting demographic

117 Center for Public Management and Regional Affairs – Miami University. 1998. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998. Oxford, OH: Center for Public Management and Regional Affairs. 118 Ibid, 4. 119 Appendix B contains a copy of the survey instrument.

70 information sought to ascertain whether the respondent owned or rented their home, the total number of persons living in their household (by age categories), gender, and marital status. Residents were asked to provide their year of birth as well as the year of birth for their spouse. Finally, residents were asked to indicate their current employment status.120 The title of the second set of questions appearing on the survey was Village Streets, Roads, and Signs. Generally, this section sought opinions on issues related to street maintenance and appearance in the Village. Residents were first asked to indicate their level of satisfaction with the streets, roads, and sidewalks in the Village. Another question specifically targeted residents' opinions on the appearance of the tree lawn area (the strip of land between the sidewalk and the street) and whether it should be maintained as grass. The next question asked residents to rate a number of street, road, and sign conditions in Williamsburg. These conditions included: street name signs, route signs, speed limit postings, traffic signals, street lighting, pothole repair, street gutters and curbs, street sweeping/cleaning, and snow/ice removal. As a follow-up question, residents were then asked to rate how these street, road, and sign conditions (using the same listing of items) had changed over the past two years. Moreover, an open-ended question followed that asked residents to identify the three street, road, and sign conditions that needed the most improvement in the Village of Williamsburg.121 The third section of questions revolved around issues pertaining to Zoning, Land Use, and Code Enforcement in the Village. The first question in this section asked residents to imagine the Village five years into the

120 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 5-9. 121 Ibid, 9-12.

71 future. Keeping that timeframe in mind, residents were asked whether the Village should remain the same, pursue moderate growth, or pursue significant growth in terms of the following issues: Village boundaries, number of industries (e.g. factory, warehouse, distribution center), number of retail stores (e.g. convenience store, hardware store), number of service providers (e.g. banks, health services), single-family housing units, and multi-family housing units.122 This section continued with two blocks of questions (sub-sections) designed to assess the state of both Residential Property and Commercial Property within the Village. Residents were asked to provide their level of agreement with the statement: “I am satisfied with the appearance of residential property in the Village.” The same question was also asked in regards to commercial property. Next, residents were asked to write-in any concerns they might have had with the appearance of residential property in the Village. Similarly, residents had the opportunity to answer the same question about commercial property. A third question in this sub-section asked how the Village’s enforcement of zoning codes concerning residential property had changed over the past two years; residents were also asked how the Village’s enforcement of zoning codes regarding commercial property had changed over the past two years. Two final questions in these sub- sections regarded the availability of parking in both residential and commercial areas of the Village. Residents were asked to write-in any concerns they might have had with the availability of parking in these areas.123

122 Ibid, 12-13. 123 Ibid, 14-15.

72 The Zoning, Land Use, and Code Enforcement section of the survey contained two final questions that were designed to elicit opinions regarding public nuisances in the Village. Residents were asked to identify public nuisances in the Village from a list that included: animal control, noise, debris and junk in developed lots, weed and lawn height, debris and junk in vacant lots, abandoned buildings, junked cars, and storage of recreational vehicles/boats. A follow-up question asked residents to identify, from that same list, the top three nuisances in the Village of Williamsburg.124 The next section of questions dealt with issues related to Parks and Recreation. Specifically, the Village had a two-phase plan in place for developing the Williamsburg Community Park. The first question was designed to measure residents’ familiarity with this two-phase plan. The survey then sought to gauge how often residents might use the facilities that will be constructed at the Park in an average year. Residents were asked about the following facilities: shelter/picnic area, walking trail, playground area, basketball court, volleyball court, and ball diamond. The next question asked about other facilities that residents might like to see included or added to the Park, including tennis courts, horseshoe pits, fitness/exercise trail, all- purpose field, soccer field, bicycle trail, and swimming pool. Finally, residents were asked to identify, from that same list, the top three facilities they would like to see at the Williamsburg Community Park.125 General Village Government was the fifth section of questions appearing on the survey. This section of questions sought to ascertain several things related to residents and their interactions with Village government. Residents were first asked to rate how well Village

124 Ibid, 15-17. 125 Ibid, 17-20.

73 government operates. The next two questions focused on the frequency of attendance at Williamsburg Village Council meetings and Williamsburg Planning Commission meetings. Finally, the last question on the survey asked residents to rate how well Village government communicates its policies and operations with the citizens of the Village.126 The final two pages of the survey were set aside for Additional Comments which allowed respondents to write-in any thoughts or opinions that they had not expressed previously on the survey.127 In terms of the methodology, the survey was conducted via United States Postal Service mail with a survey instrument mailed to all households in the Village of Williamsburg. The Village provided a database of household addresses based on the Village’s utility billing list. That list accounted for 902 households in the Village of Williamsburg. Surveys were mailed to these households in July 1998. Each survey packet mailed to each household was identical and contained a survey instrument along with a return-addressed, postage-paid envelope. The instructions on the survey instrument asked that one member of the household who is eighteen years of age or older and a Village resident complete the survey. A second survey packet (identical to the first survey packet) was mailed approximately two weeks after the first mailing to provide another opportunity for households to participate.128 A total of 392 usable responses were returned for a response rate of 43.4%. Because the Village of Williamsburg has several apartment complexes, it was important to Village officials to know the response rate

126 Ibid, 20-21. 127 See Appendix B for a copy of the survey instrument. 128 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 5.

74 from citizens who own their home (50.3%) as compared to those who rent their home (22.8%). The standard margin of sampling error in this survey is plus or minus four percentage points (± 4%) in 95 out of 100 cases. From a statistical standpoint, this means that if the survey were conducted 100 times, then in 95 of those 100 cases the results would not vary by more than plus or minus four percentage points from the results that would have been obtained if all Village residents had been surveyed and responded. It should be noted that all surveys are subject to sources of error, such as bias in the wording of questions, timing, issue salience, etc. Every effort was made in the survey design, methodology, and timing to maximize response rate and to minimize bias. There was little reason to suspect any significant bias in the data collected, and therefore the survey results provided an accurate reflection of respondent opinion at the time.129 In terms of the results of the survey, the majority of surveys returned were completed in full. However, some surveys were not completed in full with certain questions or sections returned without responses. Incomplete surveys were included in the database, thus some questions may have more responses than others. Due to rounding, some of the reported percentages on some questions may not exactly equal 100%.130 It is important to understand the highlights of the significant findings of the survey. These highlights are taken principally from the Executive Summary portion of the published final report that was presented to Village officials – Village of Williamsburg, Ohio Community Survey Project – October 8, 1998.

129 Ibid, 5. 130 Ibid, 5.
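
On the margin of sampling error reported above: although the report does not show the calculation behind the ±4% figure, it is consistent with the conventional margin of error for a proportion at the 95 percent confidence level with a finite population correction applied. As a rough illustration only — assuming the most conservative proportion p = 0.5, the 392 usable responses as n, and the 902 surveyed households as the population N:

\[
\text{MoE} = z\sqrt{\frac{p(1-p)}{n}}\,\sqrt{\frac{N-n}{N-1}} = 1.96\sqrt{\frac{0.25}{392}}\,\sqrt{\frac{902-392}{901}} \approx 0.0495 \times 0.752 \approx 0.037,
\]

or roughly four percentage points. Applying the same formula to the response counts and household totals reported for the Oxford Township and Coldwater surveys yields figures close to the margins reported in those sections as well.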

75 The results of the survey showed that 45.8% of the respondents have resided in the Village for more than twenty years. Nearly 78% of respondents were either satisfied (56.3%) or very satisfied (21.3%) with living in the Village. Respondents provided several responses when asked to identify the three qualities they liked the most about the Village. A majority of respondents (77.6%) cited small town atmosphere as a positive quality of the Village. Other positive qualities identified were: schools (28.9%), police (22.4%), and the location of the Village (21.4%). Conversely, respondents cited several qualities that they disliked about the Village. Some of those negative qualities were: lack of business and industry (48.8%), lack of restaurants (19.8%), and water problems (16.4%).131 When asked about how the tree lawn area should be maintained, 57.6% felt that it should be maintained as grass. Nearly 90% of respondents imagined the Village pursuing moderate (37.7%) to significant (51.9%) growth over the next five years in terms of the number of retail stores (e.g. convenience store, hardware store). The primary public nuisance in the Village was identified as animal control by 41.2% of respondents.132 The vast majority of respondents were either not familiar with the Williamsburg Community Park or were generally familiar with the Park, but not familiar with the specific plans for its use and facilities. Finally, the majority of respondents (64.8%) rated Village government operations as average, above average or excellent, while 61.8% of respondents rated the

131 Ibid, 2. 132 Ibid, 2.

76 Village’s communication efforts with citizens as either average, above average, or excellent.133

Oxford Township (Butler County), Ohio – 2000 Oxford Township (Butler County), Ohio undertook a community opinion survey project, with the assistance of the Center for Public Management and Regional Affairs, in 2000. A project team, consisting of Dr. Philip A. Russo, Jr. (Director of the Center for Public Management and Regional Affairs), three Undergraduate Research Assistants, two Foreign Exchange Associates, and Project Managers Andrew Dudas and Mark Morris serving as the lead staff persons assigned to the project, was assembled to carry out this project. The Oxford Township Trustees (George Simonds, James McDonough, and Sam Woodruff) oversaw the project. The survey was administered in May 2000 and the results were presented to Township officials in July 2000.134 The survey instrument was designed by the project team in consultation with the Oxford Township Trustees. The actual survey questions were designed to collect data and responses in a variety of formats, including forced choice, ranking, rating intensity questions, and open ended questions that allowed the respondent to describe in their own words their answer(s) to certain questions.135 The survey contained nine sections of questions which included and appeared in the following order: Township Life, Township Services, Township Streets, Roads, and Signs, Public Safety, Parks and Recreation, Recycling/Refuse Collection, Zoning and Enforcement, Additional

133 Ibid, 2-3. 134 Center for Public Management and Regional Affairs – Miami University. 2000. Oxford Township Survey Project Final Report. Oxford, OH: Center for Public Management and Regional Affairs. 135 Ibid, 4.

77 Questions, and Additional Comments. No questions appeared in the Additional Comments section. The last half page of the survey was left blank for respondents to provide, in their own words, any further comments.136 The first section of the survey asked for respondents' overall views of life in Oxford Township. First, respondents were asked to indicate how many years they have lived in Oxford Township. Next, they were asked to rate their overall satisfaction with living in the Township. A follow-up question asked respondents to think about how the quality of life in the Township had changed in the past five years. Furthermore, residents were asked to identify the three qualities that they liked the most and the three qualities that they disliked the most about the Township. Residents were then asked to indicate whether they were happy living in the Township and how likely they were to stay or move in the next five years. Asked to imagine Oxford Township five years into the future, residents could provide their opinion on whether the Township should pursue significant growth, moderate growth, or remain the same. Finally, respondents were given the opportunity to identify the types of growth they would like to see in the Township in the form of an open-ended question.137

136 Appendix C contains a copy of the survey instrument. 137 Center for Public Management and Regional Affairs – Miami University. Oxford Township Survey Project Final Report, 5-8.

78 protection, EMS (emergency medical services), street and road conditions, and zoning enforcement.138 Township Streets, Roads, and Signs was the third section of questions on the survey as residents were asked to assess the condition and maintenance of these items. The first question in this section sought residents’ opinions on their satisfaction with a variety of items. The list of items included: street name signs, highway route signs, speed limit postings, railroad crossing signs, pothole repair, drains and ditches, and snow/ice removal. Using this same list of items, residents were then asked to evaluate how these street, road, and sign conditions had changed over the past two years. Finally, an open-ended question asked for any additional comments that residents had regarding street, road, and sign conditions in Oxford Township.139 The fourth section of the survey was Public Safety. The first three questions in this section asked residents to indicate their satisfaction with the current level of police protection, fire protection, and emergency medical services provided by the Township. Several questions were then targeted at police protection services as residents were asked whether they felt safe in their neighborhood. In an attempt to measure police visibility, residents were then asked how many times in the past month had they seen on-duty Oxford Township Police officers patrolling near their home. The next question asked residents to indicate how satisfied they were with several specific aspects of police protection, including: on-duty patrol, response time to requests for assistance, general community outreach, and vacation check. The final question in this section was designed more to inform residents,

138 Ibid, 8-9. 139 Ibid, 9-11.

79 rather than to collect data for the Township. It asked whether residents were aware of the Township’s plans to install a tornado warning system consisting of five sirens.140 A battery of questions pertaining to Parks and Recreation appeared next on the survey. The first question revolved around Oxford Township’s plans for a park (with areas for parking and picnicking) located on Corso Road adjacent to the historic Black Covered Bridge. The first question was informational in nature, seeking to find out whether residents were aware of these plans for the park. The next question sought residents’ opinions as to whether they would like to see additional parks developed within the Township. A follow-up question sought to ascertain which facilities residents would like to see developed at a park. Residents could choose from among the following selections: playground area, shelter/picnic area, walking path, bicycle path, basketball court, tennis court, baseball diamond, soccer field, swimming pool, and sand volleyball court.141 The next section on the survey asked residents to consider several issues pertaining to recycling and refuse collection in the Township. The first question in this section was informational as it asked residents if they were aware of the availability of drop-off recycling offered by Butler County on the first Saturday of the month at the Oxford Wal-Mart location and on the third Saturday of the month at Cook Field on the campus of Miami University. The next question was designed to collect data on residents’ monthly usage of these drop-off sites. Another question in this section was aimed at collecting data on how residents dispose of their non-recyclable refuse with several choices being provided, including contract with a private

140 Ibid, 11-13. 141 Ibid, 13-14.

80 hauler (e.g. Rumpke, BFI), incinerate refuse, or self-haul refuse. Finally, an open-ended question was asked of residents so that they could provide any additional comments regarding recycling and refuse collection in Oxford Township.142 Zoning and Enforcement was the seventh section of questions on the survey which related to issues involving the Butler County zoning regulations and the enforcement of these regulations which apply to Oxford Township. The first question in this section asked if residents thought that the Township should adopt its own zoning code. The next question dealt with nuisances in the Township. Residents were asked to rate on a scale of one to ten (where one is equal to needs immediate attention and ten is does not need attention), how much attention should be given to each of the nuisances provided in a list. That list consisted of: unattended pets, maintenance of buildings, vegetation height (weeds and brush), junked cars, unregistered vehicles, storage of recreational vehicles, noise, fences, miscellaneous junk, litter, and commercial signs.143 The next section, entitled Additional Questions, was a catch-all category for questions pertaining to Oxford Township government as well as questions designed to collect demographic information about residents. The first four questions in this section were related to Oxford Township government beginning with a question asking residents how many Township Trustee meetings they have attended in the past two years. The next question asked residents where they get their information about Township meetings, activities, and issues. Residents could select as many items as applied, including: The Oxford Press, Hamilton Journal-News, The Cincinnati

142 Ibid, 14-16. 143 Ibid, 16-17.

81 Enquirer, television, radio, word-of-mouth, Internet, and public notices/bulletin boards (e.g. the Post Office). A follow-up question asked residents to consider from what sources they would prefer to receive this information. Residents could choose from among the following selections: a dedicated column in local newspapers, cable television public access channel, Township newsletter, and Township Internet home page.144 The remaining seven questions in this section attempted to collect demographic information about Township residents. They were asked whether they owned or rented their home, their gender, and marital status. Also, residents were asked to indicate the number of persons living in their household by age categories as well as their year of birth and the year of birth of their spouse. The last question on the survey asked residents to provide their current employment status. The final half page of the survey was a section for Additional Comments which allowed respondents to write- in any thoughts or opinions that they had not expressed previously on the survey.145 In terms of methodology, the survey was conducted via United States Postal Service mail with a survey instrument mailed to all households in the unincorporated area of Oxford Township. The household mailing list was generated from an electronic telephone directory combined with an electronic criss-cross directory. The mailing list generated from these two sources was then cross-referenced with the City of Oxford, Ohio Police Department’s 911 Dispatch System. This was done to insure that only residents in the unincorporated area of the Township would be included in the list. This process yielded a list of 784 households to be surveyed.

144 Ibid, 17-18. 145 Ibid, 18-20.

82 Surveys were mailed to these 784 households in April 2000. Each survey packet mailed to each household was identical and contained a survey instrument along with a return-addressed, postage-paid envelope. The instructions on the survey instrument asked that one member of the household who is eighteen years of age or older and a Township resident complete the survey. A second survey packet (identical to the first survey packet) was mailed in May 2000 to provide another opportunity for households to participate. A reminder card was mailed one week later.146 The response rate was 62.3% based upon 463 usable responses being returned. The standard margin of sampling error in this survey is plus or minus 2.75 percentage points (± 2.75%) in 95 out of 100 cases. From a statistical standpoint, this means that if a survey were conducted 100 times, the results will not vary by more than plus or minus 2.75 percentage points from the results if all Township residents had been surveyed and responded in 95 of those 100 cases. It should be noted that all surveys are subject to sources of error, such as bias in the wording of questions, timing, issue salience, etc. Every effort was made in the survey design, methodology, and timing to maximize response rate and to minimize bias. There was little reason to suspect any significant bias in the data collected, and therefore the survey results provided an accurate reflection of respondent opinion at the time.147 In terms of the results of the survey, the majority of surveys returned were completed in full. However, some surveys were not completed in full with certain questions or sections returned without responses. Incomplete surveys were included in the database, thus some questions may have more

146 Ibid, 5. 147 Ibid, 5.

83 responses than others. Due to rounding, some of the reported percentages on some questions may not exactly equal 100%.148 It is important to understand the highlights of the significant findings of the survey. These highlights are taken principally from the Executive Summary portion of the published final report that was presented to Township officials – Oxford Township Survey Project Final Report. The survey results showed that 39.8% of respondents have resided in Oxford Township for more than twenty years. Conversely, 23.4% of respondents were new to the Township, having resided there for less than five years. The vast majority of respondents were either very satisfied (47.7%) or satisfied (43.5%) with living in Oxford Township. The quality of life in Oxford Township had improved in the past five years according to 19.3% of respondents, while 53.0% of respondents indicated that the Township had stayed the same during that same time frame.149 Furthermore, respondents identified several qualities that they liked the most about Oxford Township. These positive qualities included: lifestyle qualities (57.2%), government services (15.6%), and access (proximity to work, recreation, and retail shopping) (13.9%). Conversely, respondents identified several qualities that they disliked the most about the Township. These negative qualities included: government services (29.3%), other services (lack of recycling, cable television) (14.4%), lifestyle qualities (13.8%), and nuisances (13.8%).150 Because this was an open-ended question, some respondents identified a particular quality as positive while other respondents identified that same quality as negative.

148 Ibid, 5. 149 Ibid, 2. 150 Ibid, 2.

84 A majority of residents (84.5%) also indicated that they were generally happy living in Oxford Township and would probably stay for the next five years. Staying with a future oriented outlook of five years in advance, 54.7% of respondents thought that the Township should remain the same in terms of growth.151 Respondents indicated general satisfaction with the services provided by Oxford Township. In terms of police protection, 78.1% of respondents were either very satisfied (32.6%) or satisfied (45.5%) with this service. Fire protection had similar levels of satisfaction with 24.5% being very satisfied and 41.7% being satisfied with this service. Similarly, 69.2% of respondents were either very satisfied (27.9%) or satisfied (41.3%) with EMS (emergency medical services).152 Respondents also expressed general satisfaction with street and road conditions as 20.2% indicated that they were very satisfied and 50.8% were satisfied with this service. The majority of respondents indicated that they were satisfied with street name signs, highway route signs, speed limit postings, railroad crossing signs, pothole repair, and snow and ice removal.153 From the Public Safety section of questions, the survey showed that nearly 90% of respondents either agreed (52.6%) or strongly agreed (35.4%) with the statement: “I feel safe in my neighborhood.” There was general unfamiliarity with the Township’s plans for a park on Corso Road adjacent to the historic Black Covered Bridge as 78.5% of respondents were unaware of this plan. Over half of all respondents (52.7%) were aware of the

151 Ibid, 2. 152 Ibid, 2-3. 153 Ibid, 3.

85 availability of scheduled drop-off recycling, but only 29.4% use the drop-off sites.154 In terms of Oxford Township government, the majority of respondents (76.7%) get their information about Township meetings, activities, and issues from The Oxford Press weekly newspaper. Oxford Township government’s communication of issues, policies, and operations to its residents was rated as average by 45.9% of respondents, as above average by 8.9% of respondents, and as excellent by 2.7% of respondents.155

Village of Coldwater, Ohio – 2002 The Village of Coldwater, Ohio undertook a community opinion survey project, with the assistance of the Center for Public Management and Regional Affairs, in 2002. A project team, consisting of Dr. Philip A. Russo, Jr. (Director of the Center for Public Management and Regional Affairs), and Project Manager Mark Morris serving as the lead staff person assigned to the project, was assembled to carry out this project. Village Administrator E. Thomas Ault served as the primary contact person for the Village until his resignation midway through the project. The survey was administered in February/March 2002 and the results were presented to Village officials in April 2002.156 The survey instrument was designed by the project team in consultation with the Village of Coldwater officials. The actual survey questions were designed to collect data and responses in a variety of formats, including forced choice, ranking, rating intensity questions, and open ended

154 Ibid, 3. 155 Ibid, 3. 156 Center for Public Management and Regional Affairs – Miami University. 2002. Village of Coldwater, Ohio Community Satisfaction Survey – April 2002. Oxford, OH: Center for Public Management and Regional Affairs.

86 questions that allowed the respondent to describe in their own words their answer(s) to certain questions.157 The survey contained nine sections of questions which included and appeared in the following order: Village Life, Village Services, Public Safety, Other Services, Parks and Recreation Facilities, Growth and Economic Development, Village Government, Demographics, and Additional Comments. No questions appeared in the Additional Comments section. The last page of the survey was left blank for respondents to provide, in their own words, any further comments.158 The first set of questions asked questions pertaining to Village Life in Coldwater. Residents were asked to indicate how long they have lived in the Village, their overall level of satisfaction with living in Coldwater, and whether the Village had become a better place to live, stayed about the same, or become a worse place to live in the past five years. Residents were then asked to identify the three qualities that they liked the most and the three qualities that they disliked the most about the Village. Finally, residents were asked whether they were happy living in the Village and their intentions to remain in Coldwater or move away in the next five years.159 The second set of questions gathered information on Village Services offered to residents. Residents were asked to indicate their level of satisfaction with Village services. These services included: police protection, fire protection, street & road maintenance, traffic signs, Coldwater parks, water and wastewater service, and zoning enforcement. Residents were then asked to rate the change over the past three years of a variety of street, road,

157 Ibid, 5. 158 Appendix D contains a copy of the survey instrument. 159 Center for Public Management and Regional Affairs – Miami University. Village of Coldwater, Ohio Community Satisfaction Survey – April 2002, 5.

87 and sign conditions. The items that were rated in this question were: street name signs, route signs, speed limit postings, pothole repair, street gutter & curb repair, street lighting, street cleaning, sidewalk maintenance, storm water drains, and snow & ice removal.160 Public Safety was the title of the third section of questions. Residents were asked to indicate their satisfaction with the current levels of police protection and fire protection provided by the Village. Other questions in this section specifically targeted police service as residents were asked how many times, in the past month, they had seen on-duty police officers patrolling in their neighborhood. They were also asked to indicate their level of satisfaction with several aspects of police service including: on-duty patrol, response time to requests, and general community outreach. The next question gave residents the opportunity to identify areas in which police service could improve. The choices provided were: more foot patrol, more cruiser patrol, improved response time to requests for assistance, improved general community outreach, and improved school programs and outreach. Residents were then asked to indicate how safe they felt in their neighborhood. Finally, an open-ended question allowed residents to provide any additional thoughts or comments regarding public safety in the Village of Coldwater.161 The fourth set of questions asked residents for their opinions about Other Services provided by the Village. Specifically, these questions asked residents about the cleanliness of the Village as a whole. Residents were asked to identify which public nuisances the Village has not adequately addressed. Residents could choose as many that applied from a selection

160 Ibid, 5. 161 Ibid, 5-6.

88 that included: lawn heights and vacant lots, litter, maintenance of vacant buildings, miscellaneous junk, noise, unattended pets, and unregistered vehicles. Residents were then asked to identify the three nuisances from that same list that needed the most attention from Village Council. Finally, an open-ended question allowed residents to provide any additional thoughts or comments regarding other services provided to Village of Coldwater residents.162 Parks and Recreation Facilities was the next section of questions on the survey. This battery of questions asked residents about their use of Coldwater’s parks, recreational facilities, and recreational programs. Residents were first asked how many times they typically use the facilities at the municipal parks in a month. They were then asked to indicate which facilities they had used during the past year (other than participating in organized recreational activities). The list of facilities included: ball fields, basketball courts, horseshoe pits, playground, picnic facilities, soccer fields, swimming pool, tennis courts, and volleyball court. Residents were also asked about their overall satisfaction with the recreational facilities at the municipal parks. The next two questions pertained to organized recreational programs provided by the Village. Residents were asked which activities they or their family had participated in during the past year. These activities included: youth sports league, adult sports league, Arts in the Park, gymnastics, and swim team. Residents were then given the opportunity to identify any other organized recreational programs not provided by the Village that they would like to see in the Village. Finally, an open-ended question allowed residents to provide any additional comments regarding the

162 Ibid, 6.

89 parks, recreational facilities, and recreational programs provided by the Village of Coldwater.163 A sixth set of questions addressed Growth and Economic Development issues in the Village. Residents were asked for their opinions about potential growth and economic development that could occur in the Village over the next five years. The first question asked residents to imagine Coldwater five years from now and whether the Village should pursue significant growth, pursue moderate growth, or remain the same. A follow-up question asked residents to rank order the type of development they would prefer if it were to occur in Coldwater. The choices were: residential housing, retail business, light industrial, and office buildings/professional services. Finally, an open-ended question allowed residents to provide any additional comments regarding economic growth and development in the Village of Coldwater.164 Questions pertaining to Village Government made up the seventh section of the survey. These questions asked residents about their attitudes and perceptions about Village government and were designed to measure their feelings of efficacy toward government.165 The next to last set of questions, entitled Demographics, collected a number of characteristics including: home ownership, family size and ages by category, gender, and marital status. The final page of the survey was a section for Additional Comments which allowed respondents to write-in any thoughts or opinions that they had not expressed previously on the survey.166

163 Ibid, 6. 164 Ibid, 6. 165 Ibid, 6. 166 Ibid, 6.

90 The methodology of the survey was such that it was conducted via United States Postal Service mail with a survey instrument mailed to all households in the Village of Coldwater. The mailing list of all Village households was generated from the Village utility billing list which accounted for 1,541 households. Surveys were mailed to these households in February 2002. Each survey packet mailed to each household was identical and contained a survey instrument along with a return-addressed, postage-paid envelope. The instructions on the survey instrument asked that one member of the household who is eighteen years of age or older and a Village resident complete the survey. Approximately two weeks after the initial mailing of the survey packets, a reminder card was sent to each household. A second survey packet (identical to the first survey packet) was mailed approximately thirty days after the first mailing to provide another opportunity for households to participate. A second reminder card was mailed two weeks later.167 The response rate was 64.6% based on 968 usable responses being returned. The standard margin of sampling error in this survey is plus or minus two percentage points (± 2%) in 95 out of 100 cases. From a statistical standpoint, this means that if a survey were conducted 100 times, the results will not vary by more than plus or minus two percentage points from the results if all Village residents had been surveyed and responded in 95 of those 100 cases. It should be noted that all surveys are subject to sources of error, such as bias in the wording of questions, timing, issue salience, etc. Every effort was made in the survey design, methodology, and timing to maximize response rate and to minimize bias. There was little reason to suspect any significant bias in the data collected, and therefore the

167 Ibid, 7.

survey results provided an accurate reflection of respondent opinion at the time.168 In terms of the results of the survey, the majority of surveys returned were completed in full; some, however, were returned with certain questions or sections left unanswered. Incomplete surveys were included in the database; thus, some questions may have more responses than others. Due to rounding, the reported percentages on some questions may not exactly equal 100%.169

The highlights of the survey’s significant findings are summarized below. These highlights are taken principally from the Executive Summary portion of the published final report that was presented to Village officials – Village of Coldwater, Ohio Community Satisfaction Survey – April 2002. From a demographic standpoint, more than six out of ten respondents (63.5%) had lived in the Village of Coldwater for more than twenty years. Nine out of ten respondents (89.4%) owned their own home, while 45.0% of respondents were male and 55.0% were female. The most frequently cited marital status was married, with 73.4% of respondents choosing this option.170 Overall, 93.5% of respondents were very satisfied (34.4%) or satisfied (59.1%) with living in the Village.171 The majority of respondents (78.0%) indicated that they were happy living in the Village and would probably stay in Coldwater for the next five

168 Ibid, 7-8. 169 Ibid, 8. 170 Ibid, 3. 171 Ibid, 3.

92 years. Moreover, 70.5% of respondents thought that the Village had stayed about the same in the past five years.172 Respondents were generally satisfied with police protection and fire protection provided by the Village as 70.4% and 93.4% were very satisfied or satisfied with these services respectively. In terms of street, road, and sign conditions, respondents most frequently identified pothole repair (37.3%) and street gutter & curb repair (21.7%) as having become worse over the past three years. Furthermore, slightly more than nine out of ten respondents (93.3%) rated the Village as very clean (38.9%) or clean (54.3%). Satisfaction levels with Coldwater’s recreational facilities and organized recreational activities both topped out at over ninety percent with 95.2% and 92.1% of respondents indicating that they were very satisfied or satisfied with the Village’s recreational facilities and the Village’s organized recreational activities respectively.173 In terms of future growth for the Village, 68.3% of respondents thought that Coldwater should pursue moderate growth when imagining the Village five years into the future. The type of development most preferred by respondents was light industrial development (42.9%). The least preferred type of development by respondents was residential housing (36.8%). Finally, the most frequently cited public nuisance not adequately addressed by the Village was unattended pets with 30.3% of respondents making this selection.174

Hanover Township (Butler County), Ohio – 2003 Hanover Township (Butler County), Ohio undertook a community opinion survey project, with the assistance of the Center for Public

172 Ibid, 3. 173 Ibid, 3-4. 174 Ibid, 4.

93 Management and Regional Affairs, in 2003. A project team, consisting of Dr. Philip A. Russo, Jr. (Director of the Center for Public Management and Regional Affairs), six Undergraduate Research Associates, and Senior Project Manager Andrew Dudas serving as the lead staff person assigned to the project, was assembled to carry out this project. Trustee Timothy Derickson served as the primary contact person for the project from Hanover Township. The survey was administered in October 2003 and the results were presented to Township officials in February 2004.175 The survey instrument was designed by the project team in consultation with Hanover Township officials. The actual survey questions were designed to collect data and responses in a variety of formats, including forced choice, ranking, rating intensity questions, and open ended questions that allowed the respondent to describe in their own words their answer(s) to certain questions.176 The survey contained nine sections of questions which included and appeared in the following order: Township Life, Township Services, Public Safety, Zoning and Enforcement, Parks and Recreation Facilities, Township Government, Township Communication, Demographics, and Additional Comments. No questions appeared in the Additional Comments section. The last page of the survey was left blank for respondents to provide, in their own words, any further comments.177 The Township Life section asked residents to provide information about several aspects of what it is like to live in Hanover Township. Residents were asked how long they have lived in Hanover Township, their

175 Center for Public Management and Regional Affairs – Miami University. 2003. Hanover Township Community Survey: October/November 2003. Oxford, OH: Center for Public Management and Regional Affairs. 9. 176 Ibid, 6-7. 177 Appendix E contains a copy of the survey instrument.

94 overall satisfaction with living in the Township, and how living in the Township had changed in the past five years. Residents were then asked to list “the three qualities that you like the most” and “the three qualities that you dislike the most” about living in Hanover Township. Looking to the future, residents were asked whether they intended to remain living in Hanover Township or might move away in the next five years. Keeping with a futuristic tone, residents were also asked to envision Hanover Township five years from now and provide their opinions on how much and what types of growth should be pursued by the Township.178 The second section of questions, entitled Township Services, was designed to gather information on a variety of services offered to Hanover Township residents. Residents were asked to rate whether service delivery had become better, stayed about the same, or become worse in the past three years in regards to: police protection, fire protection, emergency medical services, street and road conditions, zoning enforcement, and cemetery maintenance. The second question in this section asked residents to rate whether a variety of street, road, and sign conditions had become better, stayed about the same, or become worse in the past three years. The items that were included in this question were: street name signs, highway route signs, speed limit postings, railroad crossing signs, pothole repair, drains and ditches, and snow & ice removal.179 Public Safety was the third set of questions on the survey and this battery of questions focused specifically on issues related to police, fire, and emergency medical services provided to Hanover Township residents. The survey asked residents to indicate their current levels of satisfaction with

178 Center for Public Management and Regional Affairs – Miami University. Hanover Township Community Survey: October/November 2003, 7. 179 Ibid, 7.

95 each of these services. Specifically, police protection was evaluated on several dimensions, including: on-duty patrol, response time to requests, and general community outreach. Residents were also asked to indicate whether they thought the Township should consider the creation of a paid, full-time police department and a paid, full-time fire/emergency medical services department within the next three to five years.180 The next set of questions dealt with zoning and enforcement in Hanover Township. Residents were asked for their opinion as to whether the Township should adopt its own zoning code. Additionally residents were asked to identify which public nuisances that have not been adequately addressed by the Township. Finally, residents were asked to indicate their satisfaction with the enforcement of the Butler County zoning code in Hanover Township.181 The fifth section of questions was related to issues involving the parks and recreation facilities provided by the Township. Residents were first asked how often they (or their family) use the facilities at the Hanover School Memorial Park in an average year. Furthermore, they were asked to indicate how satisfied they were with the facilities there. Next, residents were asked their opinion as to whether Hanover Township should provide more public parks and recreation facilities. They were then asked to identify which specific recreational facilities they would like to see developed within the Township.182 A sixth set of questions asked residents about their attitudes and perceptions about Township government designed to measure their feelings of efficacy toward government. Closely related to this section is the seventh

180 Ibid, 8. 181 Ibid, 8. 182 Ibid, 8.

96 section of questions which focused on the Township’s communication efforts with residents. Residents were asked about their attendance at Township Trustee meetings in the past two years. Residents were then asked to identify from which sources they would like to receive information concerning Township news, meetings, and events.183 The final set of questions was put in place on the survey to collect information about the demographic characteristics of Hanover Township residents. These characteristics included: home ownership, family size and age groupings, gender, marital status, and the year of birth for both the respondent and their spouse, if applicable. The final page of the survey was a section for Additional Comments which allowed respondents to write-in any thoughts or opinions that they had not expressed previously on the survey.184 According to the 2000 United States Census, the total population of Hanover Township is 7,878 accounted for by 2,809 households. Portions of the incorporated Village of Millville are located within the Township borders; households within the Township borders were included in the total household population for the community survey project.185 In terms of methodology, the survey was conducted via United States Postal Service mail with a survey instrument mailed to all households in Hanover Township. The Township Trustees facilitated the acquisition of a mailing list for all Township households. The ultimate mailing list was created from a database of service addresses from the two water utilities providing service to Township residents. The databases from the Southwest Regional Water District and the Butler County Department of

183 Ibid, 8-9. 184 Ibid, 9. 185 Ibid, 6.

Environmental Services accounted for 2,889 households. Surveys were mailed to these households in October 2003. Each survey packet was identical and contained a survey instrument along with a return-addressed, postage-paid envelope. The instructions on the survey instrument asked that one member of the household who was eighteen years of age or older and a Township resident complete the survey. Approximately one week after the initial mailing of the survey packets, a reminder card was sent to each household. A second survey packet (identical to the first) was mailed approximately three weeks after the first mailing to provide another opportunity for households to participate. A second reminder card was mailed one week later.186

The response rate was 48.0%, with 1,330 usable responses returned. The standard margin of sampling error in this survey is plus or minus two percentage points (± 2%) in 95 out of 100 cases. From a statistical standpoint, this means that if the survey were conducted 100 times, in 95 of those 100 cases the results would not vary by more than plus or minus two percentage points from the results that would have been obtained if all Township residents had been surveyed and responded. It should be noted that all surveys are subject to sources of error, such as bias in the wording of questions, timing, issue salience, etc. Every effort was made in the survey design, methodology, and timing to maximize the response rate and to minimize bias. There was little reason to suspect any significant bias in the data collected, and therefore the survey results provided an accurate reflection of respondent opinion at the time.187
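The ± 2% figure reported for both the Coldwater and the Hanover Township surveys is consistent with the conventional formula for the maximum margin of sampling error at the 95% confidence level, adjusted by a finite population correction because questionnaires were mailed to every household rather than to a sample. As a rough check, assuming the standard formula (and not necessarily the exact calculation performed in the published reports), the Hanover figures of n = 1,330 usable responses out of N = 2,889 households give:

MOE = z \sqrt{\frac{p(1-p)}{n}} \sqrt{\frac{N-n}{N-1}} = 1.96 \sqrt{\frac{(0.5)(0.5)}{1330}} \sqrt{\frac{2889-1330}{2889-1}} \approx 0.027 \times 0.735 \approx 0.02

The same calculation applied to the Coldwater figures (n = 968, N = 1,541) likewise yields approximately two percentage points, in line with the margins reported in both final reports.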

186 Ibid, 9-10. 187 Ibid, 10.

In terms of the results of the survey, the majority of surveys returned were completed in full; some, however, were returned with certain questions or sections left unanswered. Incomplete surveys were included in the database; thus, some questions may have more responses than others. Due to rounding, the reported percentages on some questions may not exactly equal 100%.188

The highlights of the survey’s significant findings are summarized below. These highlights are taken principally from the Executive Summary portion of the published final report that was presented to Township officials – Hanover Township Community Survey: October/November 2003. Demographically speaking, the results showed that the gender of respondents slightly favored females (52.5%) over males (47.5%). Nearly eighty percent of respondents (79.3%) indicated that they were married. An overwhelming majority of respondents (97.9%) indicated that they owned their home. A plurality of respondents (46.0%) had lived in Hanover Township for more than twenty years.189 In terms of quality of life issues, 96.5% of the respondents said that they were very satisfied (47.7%) or satisfied (48.8%) with living in Hanover Township. Moreover, 79.5% of the respondents indicated they were “happy here and will probably stay for the next five years.” Asked to think about the Township in the past five years, 66.3% of respondents thought that Hanover Township had “stayed about the same.”190 Street and road conditions, zoning enforcement, and cemetery maintenance have “stayed about the same” according to 52.9%, 43.7%, and

188 Ibid, 10. 189 Ibid, 4. 190 Ibid, 4.

99 42.2% of respondents respectively. This question asked respondents to measure the change of these services over the past three years.191 The results pertaining to the provision of public safety services in Hanover Township showed solid satisfaction levels with police protection, fire protection, and emergency medical services. Specifically, 60.0%, 51.3%, and 48.2% of respondents felt that these services have “stayed about the same” in the past three years respectively. Over 75% of respondents either strongly agreed (24.1%) or agreed (52.0%) that they were satisfied with the current level of police protection provided to the Township by the Butler County Sheriff’s Office. The same question was applied to fire protection and yielded 70.5% of respondents either strongly agreeing (18.8%) or agreeing (51.7%) that they were satisfied with the current level of fire protection provided by the Hanover Township Volunteer Fire Department. The current level of emergency medical services (EMS) was rated satisfactory as 19.9% of respondents strongly agreed and 47.8% of respondents agreed with a similar statement when asked.192 Slightly less than 40% of respondents indicated that they thought that Hanover Township should adopt its own zoning code. The survey then asked respondents to identify, from a provided list, which public nuisances had not been adequately addressed by the Township. The most frequently cited nuisances were: junked cars (32.1%), vegetation height (weeds and brush) (22.9%), and miscellaneous junk (21.5%). Overall, 57.2% of respondents indicated that they were either very satisfied (7.8%) or satisfied (49.4%)

191 Ibid, 4. 192 Ibid, 4-5.

100 with the enforcement of the Butler County zoning code that applies to Hanover Township.193 The section of questions that was designed to collect information on the parks and recreation facilities in the Township showed that 53.5% of respondents would like to see more public parks and recreational facilities developed within the Township. The most commonly cited recreational facility was a fitness trail/hiking path/walking path as it was cited by 42.0% of respondents.194 Finally, data collected on the Township’s communication efforts showed that nearly three-fourths of respondents preferred to receive information concerning Township news, meetings, and events from a Township newsletter.195

193 Ibid, 5. 194 Ibid, 5. 195 Ibid, 5.

4 TRANSLATING SURVEY RESULTS INTO DECISIONS

Each of the four case illustrations presented in the previous chapter represents a jurisdiction that undertook a community opinion survey project. In order to measure effectively how those survey results were translated into public policy and policy evaluation, personal interviews were conducted with either elected or administrative officials in each of the four jurisdictions to ascertain a variety of information pertaining to this research. The interview questions were designed to elicit the reason(s) for conducting a community opinion survey; which questions on the survey were designed to evaluate public policy already in place; which questions were designed to generate policy options or collect opinions on policy issues that were currently on the agenda of the community; and which questions led the community to implement public policy in new areas in response to the results of the survey. The case illustrations are presented in chronological order, based on the year in which the community opinion survey was conducted in each jurisdiction.
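These four lines of questioning lend themselves to a common record for each case. The sketch below, in Python, shows one hypothetical way such interview data could be organized for comparison across jurisdictions; the field names and the partial Williamsburg example are illustrative shorthand only and are not part of the interview instrument used in this research.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CaseInterview:
    # Hypothetical record for one jurisdiction's interview; names are illustrative only.
    jurisdiction: str
    survey_year: int
    primary_reason: str                                              # why the survey was commissioned
    secondary_reasons: List[str] = field(default_factory=list)
    evaluative_questions: List[str] = field(default_factory=list)    # evaluate policy already in place
    agenda_questions: List[str] = field(default_factory=list)        # opinions on issues under deliberation
    new_policy_outcomes: List[str] = field(default_factory=list)     # policy implemented in response

# A partial example, paraphrasing the Williamsburg discussion that follows:
williamsburg = CaseInterview(
    jurisdiction="Village of Williamsburg, Ohio",
    survey_year=1998,
    primary_reason="an instrumental part of a more traditional strategic planning process",
    secondary_reasons=["a new Village Administrator seeking baseline information",
                       "preliminary discussions of a Batavia-Williamsburg Hike Bike Trail"],
)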

Village of Williamsburg, Ohio – 1998 The Village of Williamsburg, Ohio undertook a community opinion survey project, with the assistance of the Center for Public Management and Regional Affairs, in 1998. The survey was administered in July 1998 and the results were presented to Village officials in October 1998. Current Village Mayor Mary Ann Lefker was interviewed for this research in 2005 to ascertain how the survey results were used by the Village of Williamsburg. At the time of the community opinion survey project, Lefker was serving as Vice-Mayor and was elevated to Mayor in 1999 after the current Mayor

resigned. Lefker was subsequently elected Mayor in the 1999 election and re-elected in the 2003 election.196 Lefker was asked: what was the primary reason for conducting a community survey as part of a strategic planning process? From a personal perspective, she explained that she had been re-elected to Village Council in the 1997 election and became Vice-Mayor at the start of her second term on Council, beginning in 1998. Lefker was a strong proponent of the community opinion survey project on Village Council. She felt that the Village had been “going in circles” during her first term in office and wished that the Village would “focus on what we wanted in the Village.”197 Lefker indicated that she was interested in pursuing a community opinion survey to help define this focus, and the Village approached the Center for Public Management and Regional Affairs (CPMRA) at Miami University to assist with such a project because the Village had used the CPMRA’s services in the past and felt that it could trust that it would receive quality assistance.198 In order to help define this focus, Lefker said that she spearheaded an effort among the Village Council members to embrace a more traditional strategic planning process, of which the survey would be an instrumental part. After becoming Mayor in 1999, Lefker established an Economic Development Committee. The Committee was initially charged with reviewing the survey results and making any necessary recommendations for action to Council.199

196 Lefker, Mary Ann. Personal Interview. 28 April 2005. 197 Ibid. 198 Ibid. 199 Ibid.

103 Furthermore, the Village conducted an internal visioning study in tandem with the community opinion survey. The internal visioning study was designed to develop common goals and perspectives among Village officials concerning the future of the Village.200 This visioning study included an inventory of assets and strengths at the disposal of the Village as well as potential problems and weaknesses facing the Village. Additionally, the visioning discussion ultimately led to the Village developing and agreeing on an official logo to represent the Village.201 Once the internal visioning study was completed, the Village now had a solid idea of what Village officials wanted to focus on for the future of the Village. That data was then combined with the external data collected by the community opinion survey for a more complete picture of what the Village needed to be focusing on in the immediate future.202 The completion of both of these projects became important when the Village was asked to participate in an initiative of the Clermont County Department of Community Planning and Development. In 2000, the SR 32 Corridor Vision Plan was being conducted by Clermont County. The study was commissioned because: “Since 1970, Clermont County, Ohio's population has nearly doubled and the numbers continue to grow. A great deal of this community growth is concentrated along the SR 32 corridor…”203 The Village of Williamsburg was one of six focus areas along State Route 32 in Clermont County that were the focus of this plan.204

200 Ibid. 201 Ibid. 202 Ibid. 203 SR 32 Corridor Vision Plan. Clermont County Department of Community Planning and Development. 10 June 2005 . 204 Ibid.

104 Naturally, the Village was invited to participate in the Vision Plan. According to Lefker, when preliminary meetings were held for the Vision Plan, Village officials had baseline data from the internal vision study and the community opinion survey to contribute to the discussions. The Village was the only jurisdiction in Clermont County that had this type of data and information available. Some of the most important data from the community opinion survey for the SR 32 Corridor Vision Plan came from a question that asked respondents to imagine Williamsburg five years from now and evaluate Village boundaries, number of industries, number of retail stores, number of service providers, single-family housing units, and multi-family housing units. Respondents were asked whether or not the Village should remain the same, pursue moderate growth, or pursue significant growth in terms of these items.205 Having data on this issue was a critical contribution from the Village of Williamsburg for having their interests represented on the SR 32 Corridor Vision Plan. Lefker felt that the combination of the visioning study and survey results enabled the Village of Williamsburg to be a “bigger player in County politics” specifically during the SR 32 Corridor Vision Plan.206 From her perspective, the strategic planning process that the Village undertook had important ramifications for how the Village would be represented in the County’s SR 32 Corridor Vision Plan. The survey results (as part of a larger process) were used in essence to enable the Village of Williamsburg to be better positioned externally for the growth and development that was naturally occurring in Clermont County.207

205 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 12-13. 206 Lefker, Mary Ann. Personal Interview. 28 April 2005. 207 Ibid.

105 The next question asked of Lefker was: what, if any, were the secondary reason(s) for conducting a community survey as part of a strategic planning process? Lefker identified two secondary reasons that the Village pursued a community opinion survey. First, the Village had recently had a change in administrative leadership. Tom Ryther had recently been hired as the new Village Administrator and, according to Lefker, was “gung-ho to collect as much information as possible” about the Village.208 Another reason for conducting a survey came about due to the Village’s involvement in a potential economic development project. The Village wanted to collect baseline demographic data about their residents as they were involved in preliminary discussions for a Batavia-Williamsburg Hike Bike Trail. Data collected by the survey would be useful as this potential project was discussed and the feasibility of such a trail was debated.209 These were the secondary reasons for the Village of Williamsburg deciding to undertake a community opinion survey project. The next interview question asked Lefker to identify questions on the survey that were designed to inform or educate the residents on decisions/policies that had already been made/implemented by the Village. Lefker indicated that a battery of parks and recreation questions were included in the survey for the purpose of informing and educating residents about the Village’s two-phase plan for developing the Williamsburg Community Park.210 Respondents were first asked to identify the statement that best described their familiarity with the two-phase park plan. Respondents were then given a list of the recreational facilities planned for the park under the plan and asked how often they expected to use these

208 Ibid. 209 Ibid. 210 Ibid.

106 facilities in an average year. Another question in this section asked respondents to identify other facilities that had been discussed for the park that they would like to see added to the park plan. Finally, a follow-up question asked respondents to identify their top three choices of possible facilities to be added to the park plan.211 The results of these questions demonstrated general unfamiliarity with the two-phase park plan as 41.3% of respondents said that they were unfamiliar with the Park while another 44.4% of respondents were generally familiar with the Park but unfamiliar with the specific plans for its use and facilities.212 Lefker pointed out that this battery of questions did help to inform and educate residents of the Park and the plans for its facilities as the Village had intended. As a response to the general unfamiliarity with the Park and the plans for its facilities, Village officials did install Park signage directing people to the Park as well as improving the signage at the entrance to the Park in an attempt to call more attention to the Park’s location.213 In terms of facilities that respondents identified as those that would be used the most in an average year, the most heavily used facility was the walking trail. Slightly over forty percent (40.3%) of respondents indicated that they would use the walking trail more than ten times per year. Other facilities that had high responses in terms of frequency of use were: playground area (26.8% would use it more than ten times per year) and basketball court (12.8% would use it more than ten times per year).214

211 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 17. 212 Ibid, 17-18. 213 Lefker, Mary Ann. Personal Interview. 28 April 2005. 214 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 18.

107 Having this information available made it possible for the Village to re-visit the Park plan to ensure that it matched respondents’ preferences for facilities. The two-phase Park plan is still in the implementation stage as not every facility originally planned for the Park has been installed or completed yet. From the original plans, the following facilities have been installed: shelter/picnic area, basketball court, and ball diamond. The Village was able to leverage volunteers from civic organizations (Rotary Club and Jaycees) to finish the basketball court. The Village employees installed the remaining facilities. Additionally, the Village has installed a frolf215 (Frisbee golf) course at the Park. While still planned for the Park, the Village has yet to install a playground area or a volleyball court.216 Lefker was then asked which questions appeared on the survey that asked residents to evaluate decisions/policies that the Village had already made/implemented. The Village was concerned generally about the maintenance and upkeep of Village streets, roads, and signs. Specifically, the Village was interested in receiving feedback concerning their street resurfacing efforts and sidewalks.217 The survey asked respondents to indicate how satisfied they were with the streets and roads in the Village. The majority of respondents were either very satisfied (9.4%) or satisfied (55.4%) with the streets and roads.218 The next question in this battery asked respondents to rate their satisfaction with sidewalks in the Village. Slightly more than 32% of respondents were satisfied with sidewalks. Conversely, 24.7% of respondents were dissatisfied with sidewalks.219 A third related

215 For an example of the use of the term of ‘frolf’ in the popular culture lexicon, see Episode #822 of the NBC television show entitled “The Summer of George” originally airing on May 15, 1997. 216 Lefker, Mary Ann. Personal Interview. 28 April 2005. 217 Ibid. 218 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 10. 219 Ibid, 10.

question sought to provide feedback as to whether citizens felt that the “tree lawn area (the strip of land between the sidewalk and the street)” should be maintained as grass. The majority of respondents (57.6%) felt that the tree lawn area should be maintained as grass. According to Lefker, taking these three questions together, the Village sought to address its street resurfacing program and sidewalk improvement and installation. The Village has adopted a policy that any street resurfaced in the Village must also be curbed and have sidewalks installed. Bids are solicited for all street resurfacing projects with these stipulations in the bid specifications. Furthermore, two streets had been resurfaced just before this policy was enacted by Village Council and did not have sidewalks installed. The Village went back to these two streets and installed new sidewalks.220 In terms of the tree lawn area, the Village felt that there was enough support among citizens to maintain that area as grass. In an effort to improve the tree lawn area around the Village, the Village is in the process of applying for recognition by the Tree City USA program.221 This program is sponsored by the National Arbor Day Foundation and “…provides direction, technical assistance, public attention, and national recognition for urban and community forestry programs in thousands of towns and cities…”222 The Village has established a Tree Board to work with citizens to plant and replace trees around the Village as necessary. The Village has gone so far as to create a small nursery of sorts near the Village reservoir to grow the trees that will eventually be planted around the

220 Lefker, Mary Ann. Personal Interview. 28 April 2005. 221 Ibid. 222 National Arbor Day Foundation Programs – Tree City USA. The National Arbor Day Foundation. 10 June 2005 .

Village.223 All of this is being done to beautify the Village and to respond to citizens’ desires that the tree lawn area be maintained as grass with trees. Lefker was then asked whether questions appeared on the survey that asked residents for their opinions on decisions/policies that the Village was currently deliberating, discussing, or considering implementing. Lefker responded that there were several areas in which the Village was seeking input from residents concerning courses of action under consideration. The first such issue dealt with problems that some residents were having with water distribution in the Village.224 Survey respondents were asked to identify the three qualities that they most disliked about the Village. The third most frequently identified quality was generally categorized as “water problems,” with 16.4% of respondents identifying this problem. Included within the general category of “water problems” were a lack of water pressure and complaints about the quality (taste) of the water supplied by the Village.225 According to Lefker, this response was not unexpected, as Village officials were aware that a section of the Village was having ongoing problems with low water pressure. Village officials were already considering options for a plan to improve water service in the Village that would alleviate the water pressure problems. Ultimately, the Village built a new water tower, which solved the distribution problem that was causing low water pressure for a portion of Village residents. Moreover, the Village eventually decided that it was no longer in

223 Lefker, Mary Ann. Personal Interview. 28 April 2005. 224 Ibid. 225 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 8.

its best interest to supply water on its own, in part because of growing dissatisfaction with the quality of the water being provided. Instead, the Village decided to contract with the Clermont County Water and Sewer District to buy the water that is distributed to Village residents. The Village of Williamsburg is still distributing the water that is supplied by the county district, and the Village feels that the water being provided by the county is of a higher quality. The contractual arrangement has been working well for the Village thus far. In fact, the Village is currently negotiating with the Clermont County Water and Sewer District to have the District take over the distribution of water in addition to supplying it.226 Another policy area in which the Village of Williamsburg was deliberating, discussing, or considering policy changes dealt broadly with issues of zoning and land use. While a plurality of respondents indicated that they felt that the Village’s enforcement of zoning codes concerning residential property (46.5%) and commercial property (40.9%) had stayed about the same over the past two years, 38.8% of respondents expressed a concern for “improved maintenance of homes” in the Village.227 Village officials interpreted these results as an indication that enforcement had remained consistent over the past two years, but needed to be improved, particularly in regards to the upkeep of residential homes.228 A two-pronged approach was developed to address these zoning issues. First, the Village hired a consultant to assist with a comprehensive review of the Village’s zoning code. The zoning code was updated, condensed, and made easier to

226 Lefker, Mary Ann. Personal Interview. 28 April 2005. 227 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 14-15. 228 Lefker, Mary Ann. Personal Interview. 28 April 2005.

111 understand for residents.229 The second prong of the approach taken to improve zoning in the Village was to place a greater emphasis on enforcement of the zoning code. Toward that end, the Village began to utilize the Village Police Department to assist with enforcement. Village Police Officers became more involved in identifying zoning code violations in the Village and serving notices to property owners who were in violation.230 In regards to specific zoning code issues, the survey sought to ascertain respondents’ opinions on what they thought were public nuisances in the Village. Respondents could choose any that applied from a list that included: animal control, noise, debris and junk in developed lots, weed and lawn height, debris and junk in vacant lots, abandoned buildings, junked cars, storage of recreational vehicles/boats, and other (a space where respondents could write in their own nuisance if it was not included in the list). The most frequently cited public nuisance in the Village was animal control with 41.1% of respondents selecting this item.231 Lefker said that the Village has tried to improve their efforts at reducing animal control problems in the Village, realizing that it is probably an issue that cannot be completely eradicated. For one, wild animals such as raccoons and possums live naturally in wooded areas in the Village. These types of animals are difficult to control because of their wild habitat. The other problematic animal in the Village is cats, both stray as well as those whose owners allow them to roam free. In order to reduce the number of cats roaming free in the Village, officials have implemented a program designed at capturing cats in

229 Ibid. 230 Ibid. 231 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 16.

112 traps. Unfortunately, the traps have only had limited success at alleviating the number of cats in the Village. For one, a number of the traps have been stolen in a series of, as yet unsolved, thefts. Secondly, the Village is charged a fee of $50 per cat for disposal by the county.232 Both of these issues have made it problematic for the Village to make any significant headway in dealing with the issue of animal control as a public nuisance affecting the Village. Finally, the Village had been discussing a revamping of their economic development strategy. As previously mentioned, the survey collected information on respondents’ opinions on a variety of zoning, land use, and economic development issues as they were asked to imagine Williamsburg five years into the future. The same data that helped the Village position itself in regards to the SR 32 Corridor Vision Plan, also helped the Village to redesign their strategy toward economic development. That new strategy focused specifically on the businesses located on Main Street which is the primary commercial area of the Village. Specifically, the Village emphasized their Main Street program which sought to keep Main Street storefronts (from Third Street to the Williamsburg Bridge) at full occupancy. Also included in this program was an emphasis on eliminating blight and improving the appearance of the storefronts. While there has been turnover among the storefronts, the Village has been successful at quickly replacing businesses that close. According to Lefker, much of the success can be attributed to a combination of the emphasis on the Main Street program, hard work on the part of Village officials, and an attitude on the

232 Lefker, Mary Ann. Personal Interview. 28 April 2005.

part of Village officials that encourages business owners to locate on Main Street in the Village of Williamsburg.233 Finally, Lefker was asked whether the survey findings led the Village to make decisions or implement policy in new areas because of the opinions expressed. There were two specific instances of decisions made by Village Council as a direct result of the survey’s findings. The first such decision came ‘out of the blue,’ according to Lefker, as many survey respondents wrote comments on the survey wishing that the Village had a convenience store.234 The Village had a BP gas station on Main Street, which was sold in January 1999, shortly after the survey results were returned to the Village. Village officials met with the new owner (Crazy Cruizin Corporation) and used the survey results to convince the new owner to expand the gas station to include a convenience store with a greater selection of snacks and beverages. Village officials have been very pleased with the results, as has the new owner of the BP Williamsburg station.235 The expansion of the station to include a convenience store thus came about as a direct result of the feedback provided by the community opinion survey. The other area in which the Village implemented new policy was its communication efforts with residents. The survey asked respondents to rate how well the Village government communicates its policies and operations to the citizens. A plurality of respondents (37.2%) rated the Village’s communication efforts as “average.”236 In an effort to improve communication and get information to Village

233 Ibid. 234 Ibid. 235 Ibid. 236 Center for Public Management and Regional Affairs – Miami University. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998, 21.

114 residents in a more timely fashion, the Village implemented a quarterly newsletter that is distributed via the United States Postal Service to all residents of the Village. The newsletter regularly contains a message from the Mayor, community notes, news from the Village, and an upcoming schedule of meetings and events in and around the Village. Lefker believes that the effort that the Village has put into the quarterly newsletter (compiling all of the necessary information and formatting it) has been well worth it and is convinced that the Village has adequately responded to the survey results which indicated that improvement could be made in the area of the Village’s communication efforts with residents.237

Analysis Based on the data collected in the personal interview, where does the Village of Williamsburg community opinion survey appear on the typology presented in Chapter 2? Of primary concern to selecting the appropriate categorization is the primary reason that the Village conducted the survey. As previously noted, Lefker was able to convince Village officials to embrace a more traditional strategic planning process. The community opinion survey was an integral part of that process. To complement the community opinion survey, the Village engaged in a visioning process, and assigned a committee to review the survey results and implement strategies and plans to help the Village achieve the preferences for growth and development expressed in the survey and in the visioning process. Furthermore, the survey results played an important role externally to the Village as the results were used to influence how Clermont County would help the Village meet their goals for the future through the SR 32 Corridor

237 Lefker, Mary Ann. Personal Interview. 28 April 2005.

Vision Plan. Data from the survey proved invaluable when it came time for the Village of Williamsburg to be represented in this plan. In terms of specific decisions made by the Village based on the survey results, Lefker noted that the Village implemented long-range strategies or plans to help meet the preferences expressed in the survey. Specifically, the Village revamped its economic development strategy by placing more emphasis on the Main Street program. Clearly, a revamped economic development strategy was not a decision that would have an immediate impact on the Village, but one that would take time to implement and show results. Strategies of this kind are indicative of issues placed on the discussion agenda, because they take time to develop. Based on these factors (the primary reason for conducting the survey and the implementation of plans and strategies), it seems that the Village of Williamsburg community opinion survey best falls into the category of a Strategic Survey. The survey was part of a larger, more formalized strategic planning process and resulted in the implementation of plans and strategies. According to Lefker, the greatest impact of the survey was that the results enabled the Village to be well represented externally, particularly with Clermont County.238 For these reasons, the Village of Williamsburg community opinion survey is categorized as a Strategic Survey. It must be noted that, as a secondary categorization, a case can be made that the survey could also be considered a Decisionistic Survey. This is due to the sheer number of discrete decisions that resulted from the community opinion survey. The Village used the survey results to evaluate its street resurfacing and

238 Ibid.

116 sidewalk programs. Buoyed by the survey results, the Village placed a greater emphasis on planting trees in the Village, particularly in regards to the tree lawn area. The Village had been discussing and deliberating policy options in terms of the Village’s water distribution system as well as their zoning code and its enforcement. It did not come as a surprise to Village officials that these issues would have to be dealt with, but the specific policy options that were considered and eventually chosen were shaped by the survey results. The Village would ultimately decide to divest themselves of their water distribution system, turning it over to the county. The zoning code would be rewritten and greater emphasis was placed on enforcement of the code. Additionally, the Village made decisions in areas that were new to the Village’s policy agenda. The Village made it a priority to attract a convenience store to the Village based on the results of the survey. Also, the Village implemented a quarterly newsletter in response to concerns about the communication efforts of the Village with its residents. These decisions could warrant justification for the Village of Williamsburg survey being classified as a Decisionistic Survey.

Oxford Township (Butler County), Ohio – 2000 Oxford Township (Butler County), Ohio undertook a community opinion survey project, with the assistance of the Center for Public Management and Regional Affairs, in 2000. The survey was administered in April 2000 and the results were presented to Township officials in July 2000. Township Trustee George Simonds was interviewed for this research in 2005 to ascertain how the survey results have been used in the approximately five years since the survey was administered.

117 Simonds was asked to identify the primary reason for conducting a community survey as part of a strategic planning process. He responded that the Township was interested in gathering information on two levels. First and foremost, the township wanted to collect some baseline data and information on the image of township government from the perspective of residents. Simonds elaborated on this point, stating that there was concern among township elected officials as to how township residents perceived the township and its operations. Secondly, Oxford Township officials were interested in finding out what residents wanted in terms of services, programs, and activities going forward.239 It is this future oriented outlook that Oxford Township Trustee Jim McDonough commented on while the Trustees were reviewing the survey results. McDonough told The Oxford Press that the “…report offers us guidelines for long-range planning in varied areas of concern.”240 Next, Simonds was asked: what, if any, were the secondary reason(s) for conducting a community survey as part of a strategic planning process? Simonds indicated that, beyond the two aforementioned reasons, there were no other reasons discussed by township officials for conducting the survey.241 Simonds was then asked to identify questions on the survey that were designed to inform or educate the residents on decisions/policies that had already been made/implemented by Oxford Township. Simonds noted that the community opinion survey afforded the township a unique opportunity to communicate with residents about several issue areas in which the

239 Simonds, George. Personal Interview. 16 May 2005. 240 White, Bob. “Oxford Twp. trustees study survey results.” The Oxford Press. Thursday, February 22, 2001. A3. 241 Simonds, George. Personal Interview. 16 May 2005.

118 township had recently made some decisions. The first such issue dealt with the installation of a tornado warning system in the township. The township had entered into a contract to install five sirens throughout the township as a safety mechanism to warn residents of imminent severe weather and tornadic events. Simonds explained that the township had already decided to install the sirens before the survey was designed, but that the survey provided an opportunity for the township to communicate this public safety improvement by asking residents whether they were aware of this decision.242 The results for this question showed that only 7.6% of respondents were aware of the plans to install such a warning system. Many respondents, however, commented on the survey that they were pleased to know about these plans.243 It would seem that the results for this question underscored the concern that township officials had about residents being unaware of the township’s course of action on this issue. Oxford Township officials were also concerned that residents might be unaware of the plans for the development of new park within the township. Township officials had a plan in place that would establish a park area on Corso Road adjacent to the historic Black Covered Bridge. The park would consist of small areas for parking and picnicking. More survey respondents were aware of these plans (21.5%) than were aware of the plans for the tornado warning system.244 Once again, this question seemed to serve the intended purpose of informing a sizable portion of survey respondents about the plans for the new park.

242 Ibid. 243 Center for Public Management and Regional Affairs – Miami University. Oxford Township Survey Project Final Report, 13. 244 Ibid, 13.

119 Finally, a third issue that the Township sought to communicate with residents about was that of recycling. Oxford Township officials sought to inform residents about the drop-off recycling program offered by Butler County within the Township. Drop-off recycling for township residents was offered on the first Saturday of the month at the Oxford Wal-Mart location and on the third Saturday of the month at Cook Field on the Miami University campus. Over half of the respondents (52.7%) were aware of the availability of this service.245 Furthermore, a related question asked respondents to indicate how often they used either of these two drop-off sites to recycle. The vast majority of respondents (70.6%) indicated that they “never” use these sites to recycle.246 For township officials, the survey results indicated that there were reasons other than lack of information about these drop-off sites for recycling that were causing residents not to use them. The next question asked of Simonds sought to ascertain which questions appeared on the survey that asked residents to evaluate decisions/policies that the Township had already made/implemented. According to Simonds, there were no questions on the survey designed to specifically collect data on the evaluation of recent township policy decisions.247 Simonds was then asked to identify questions that appeared on the survey that asked residents for their opinions on decisions/policies that Oxford Township was currently deliberating, discussing, or considering implementing. At the time of the survey, the Oxford Township Trustees were concerned about their overall communication efforts with residents. The three aforementioned questions that appeared on the survey which

245 Ibid, 14-15. 246 Ibid, 15. 247 Simonds, George. Personal Interview. 16 May 2005.

120 sought to inform/educate residents about decisions made by the township were indicative of this concern. The results of these questions showed that this concern was generally well-founded on the part of the Trustees. According to Simonds, township officials had discussed several options for improving communication efforts with residents, but had focused primarily on the possibility of implementing a periodic township newsletter that would be mailed to all township residents.248 The survey asked residents a series of questions pertaining to communication designed to collect some information to help determine an appropriate course of action for the township to pursue on this issue. The first such question asked respondents: “In general, where do you get your information about Township meetings, activities, and issues?” Respondents could check all options that applied from among the following selections: The Oxford Press, Hamilton Journal-News, The Cincinnati Enquirer, television, radio, word- of-mouth, Internet, public notices/bulletin boards (e.g. the Post Office), and other. The most frequently selected option was The Oxford Press with 76.7% of respondents indicating that they receive information about the Township from this source. The only other options that were selected by over thirty percent of respondents were word-of-mouth (39.0%) and the Hamilton Journal-News (31.9%).249 Two additional questions in this section were designed to measure other issues related to Oxford Township’s communication efforts with citizens. First, the survey asked respondents to provide their opinion on how well the Township government communicates its issues, policies, and operations to the residents of Oxford Township. The majority of

248 Ibid. 249 Center for Public Management and Regional Affairs – Miami University. Oxford Township Survey Project Final Report, 17.

121 respondents (45.9%) rated Oxford Township communication efforts as average. Slightly more than ten percent indicated that the Township’s communication efforts were excellent (2.7%) or above average (8.9%). Conversely, over one quarter of respondents rated these efforts as either below average (15.5%) or poor (11.1%).250 Furthermore, respondents were asked to identify from what sources they would prefer to receive information concerning Township news, meetings, and events. Several options were provided to select from, including: a dedicated column in local newspapers, cable television public access channel, Township newsletter, Township Internet home page, and other. The most frequently cited selection was a dedicated column in local newspapers with nearly 60% of respondents choosing this option. Slightly more than half of respondents (51.9%) wanted a Township newsletter to be implemented, while only 16.7% of respondents were interested in a Township Internet home page.251 Based on these responses, Simonds explained that Township officials realized that they needed to improve their communication efforts with residents. Two local newspapers (The Oxford Press and the Hamilton Journal-News) were the main sources of information for residents currently. Given some options about where township residents wanted to get their information from (local newspapers and a Township newsletter), the Township Trustees debated these various options in order to develop a strategy to improve communication efforts overall. Simonds explained that the Trustees debated the issue and had quick consensus for focusing on local newspapers and increasing the amount of coverage of Township news and

250 Ibid, 17. 251 Ibid, 18.

122 information. While the Township has been unable to get either The Oxford Press or the Hamilton Journal-News to commit to a dedicated column appearing in either paper on a regular basis, Simonds feels that the Township has effectively implemented subtle strategies and efforts to increase coverage of Township news and information in both of these papers. The Township regularly invites reporters to attend Township meetings and sends out press releases whenever necessary all in an effort to increase media coverage.252 The option of implementing a periodic Township newsletter to improve communication efforts with residents was discussed extensively by Township officials, according to Simonds. He described this issue as being a ‘hot topic’ for some time after the survey results were presented and analyzed. However, in the end, Township officials ultimately decided not to pursue implementing a Township newsletter. Simonds indicated that Township officials felt that there was not enough Township news and information to justify the expense of publishing and mailing a newsletter to all township residents.253 It should also be noted that in the approximately five years since the survey was conducted, the Township has developed an Internet presence with an official home page serving as a source of information to residents.254 Simonds made it clear that the idea of establishing a Township home page was not seriously considered while examining the survey results in 2000, but was a decision made well afterwards.255

252 Simonds, George. Personal Interview. 16 May 2005. 253 Ibid. 254 The official Internet home page of Oxford Township (Butler County), Ohio can be accessed at: http://www.oxfordtwpohio.org/ 255 Simonds, George. Personal Interview. 16 May 2005.

123 The final interview question sought to ascertain whether the survey findings led the Township to make decisions or implement policy in new areas because of the opinions expressed. Simonds indicated that the results, overall, showed that Township residents were generally pleased with the level of service being provided by the Township. With positive satisfaction levels in the eyes of Township officials, there was no need to delve into new areas based on the survey findings. Simonds did note, however, that the Township did re-instate twenty-four hours per day police coverage by the Township Police Department, but it was not as a result of the survey findings. Simonds explained that the Township previously provided twenty- four hours per day police coverage until that coverage was reduced in a budget cutting maneuver. Simonds indicated that Township officials promised to re-instate police coverage once the Township’s financial standing improved, a promise which has since been kept.256

Analysis

Based on the data collected in the personal interview, where does the Oxford Township community opinion survey appear on the typology presented in Chapter 2? The primary reason that the Township conducted the survey, according to Simonds, was to collect baseline data on how residents perceived Township government and operations. Based on this reason, it would seem that the Township was interested in collecting benchmark data about Township services. The second reason espoused for conducting the survey was to enable the Township to pursue long-range planning initiatives.

256 Ibid.

In terms of specific decisions made by the Township based on the survey results, Simonds cited very few instances where decisions were made as a direct result of the survey. There were several decisions and policies that the Township had already implemented that the survey served to inform and educate residents about. The tornado warning siren system, the plans for a Township park, and the recycling options available to residents within the township were all issues that the survey informed residents about. The survey did not ask residents to evaluate specific policies or decisions that had already been made. In terms of issues that were being discussed or deliberated, Township communication efforts with residents were enhanced as the Township implemented a strategy to increase the coverage that Township news, activities, and events received in the local newspapers. Simonds indicated that there were no decisions made in new areas due to the survey results. Based on these factors (the primary reason for conducting the survey and the very few decisions made as a result of the survey), it seems that the Oxford Township community opinion survey best falls into the category of an Informational Survey. The stated goal of collecting benchmark data was achieved; the Township now had a baseline of citizen satisfaction levels for the provision of services. Furthermore, the Township used the community opinion survey to communicate policies and decisions in three different instances. From the standpoint of decision-making, the survey resulted in only one decision, and that decision was a strategy that required the participation of another party (the media) in order for it to be effective. The survey provided benchmark data but had little real impact on decision-making or on the Township’s policy agenda.

Village of Coldwater, Ohio – 2002

The Village of Coldwater, Ohio undertook a community opinion survey project, with the assistance of the Center for Public Management and Regional Affairs (CPMRA), in 2002. The survey was administered in March 2002 and the results were presented to Village officials in April 2002. Village Administrator Eric Thomas was interviewed for this research in 2005 to ascertain how the survey results have been used in the approximately three years since the survey was administered. Thomas was asked: what was the primary reason for conducting a community survey as part of a strategic planning process? Thomas indicated that some background information (pre-dating his employment with the Village) needed to be explained. This was the second such survey project conducted by the Village of Coldwater. The Village had previously commissioned the CPMRA to conduct a community opinion survey in 1996, shortly after hiring Joe Sansone as the new Village Manager. However, during the execution of the 1996 survey project, Sansone suddenly resigned, and the results of the 1996 survey were presented to Village Council while the Village Manager position was vacant. According to Thomas, several members of Village Council (who still held elected office in the Village in 2001) were familiar with the process from the 1996 project and were pleased with the survey administration and its results. These Council members sought to have another survey conducted to measure changes in attitudes, opinions, and service delivery benchmarks over the intervening five years.257 In late 2001, the Village Manager at the time, Tom Ault, contacted the CPMRA to assist with conducting another community opinion survey, in order to collect another set of data to benchmark the Village’s service

257 Thomas, Eric. Personal Interview. 2 June 2005.

delivery performance. Ironically, Ault would also resign from his position with the Village during the execution of the 2002 survey. Thomas was appointed as Ault’s successor in April 2002 and has served in that position since. Later that month, the final results of the 2002 Village of Coldwater community opinion survey were presented to Thomas. The primary reason that the Village of Coldwater conducted the community opinion survey in 2002 was that the results would serve as a “good thermometer of the community’s needs.” One of Thomas’ first assignments from Village Council was to review the survey results, along with the Village Planning Commission, and make any policy recommendations to Council as warranted by the results. From a benchmarking perspective, the review of the results demonstrated that the vast majority of residents expressed overall satisfaction with the services provided by the Village.258 Respondents were asked to indicate how satisfied they were with police protection, fire protection, street and road maintenance, traffic signs, Coldwater parks, water and wastewater service, and zoning enforcement. Over 50% of respondents indicated that they were either satisfied or very satisfied with each of these services, with the lone exception of street and road maintenance (only 48.1% were either satisfied or very satisfied).259 Furthermore, Thomas indicated that his hiring as Village Manager represented a change in administrative attitude compared to previous managers. Thomas strives to be more proactive in dealing with issues and responding to citizen complaints and inquiries than previous managers were. Having recent survey results so early in his tenure as

258 Ibid. 259 Center for Public Management and Regional Affairs – Miami University. Village of Coldwater, Ohio Community Satisfaction Survey – April 2002, 11.

Village Manager helped him to prioritize issues that were important to the citizens.260 The survey results enabled Thomas to review citizen opinions on Village affairs and services and to focus his initial efforts in his new position on issues brought to light by the survey. Next, Thomas was asked: what, if any, were the secondary reasons for conducting a community survey as part of a strategic planning process? Thomas was unaware of any secondary reasons for conducting the survey. However, he admitted that the survey provided him with a snapshot of the Village. Thomas was able to use the demographic data collected by the survey to get a better sense of the makeup of the residents of the Village, which was helpful in his first days and months in his new position. Thomas indicated this would have been another good reason for conducting a survey – to provide baseline data and information to a new Village Manager.261 However, it was certainly not a reason expressed by the Village under the circumstances at the time the survey was commissioned. Thomas was then asked to identify questions on the survey that were designed to inform or educate the residents on decisions/policies that had already been made/implemented by the Village. Thomas indicated that no questions appearing on the survey were designed specifically to inform or educate residents about Village policies, activities, or programs.262 The next question asked of Thomas sought to ascertain which questions appeared on the survey that asked residents to evaluate decisions/policies that the Village had already made/implemented. Thomas indicated that the results of the 1996 survey showed that there was some

260 Thomas, Eric. Personal Interview. 2 June 2005. 261 Ibid. 262 Ibid.

dissatisfaction with sidewalk repair in the Village.263 The results of the 1996 survey showed that 19.0% of respondents were either dissatisfied or very dissatisfied with sidewalk repair in the Village.264 In response to this dissatisfaction, the Village implemented a Sidewalk Improvement Program designed to repair existing sidewalks and install sidewalks in newly developing areas of the Village. The Village was divided into six sections, and Village officials went section by section identifying sidewalks that were in need of repair. Residents were notified if their sidewalks were not up to standard and were given the opportunity to make any necessary repairs. If they chose not to make the necessary repairs on their own, the Village would repair the sidewalks for the residents and levy an assessment on the resident’s property tax bill to pay for the expenses. Thomas said that Village officials were hopeful that the results of the 2002 survey would show increased levels of satisfaction with sidewalks in the Village based upon the implementation of the Sidewalk Improvement Program.265 On the 2002 survey, residents were asked to evaluate whether sidewalk maintenance (as part of a list of street, road, and sign conditions) had become better, stayed about the same, or become worse over the past three years. Thirty-six percent of respondents indicated that sidewalk maintenance had become better over the past three years; 52.3% of respondents indicated that it had stayed about the same, while only 6.4% of respondents indicated that it had become worse.266 Thomas said that he and

263 Ibid. 264 Center for Public Management and Regional Affairs – Miami University. Village of Coldwater, Ohio Community Satisfaction Survey – April 2002, 20. 265 Thomas, Eric. Personal Interview. 2 June 2005. 266 Center for Public Management and Regional Affairs – Miami University. Village of Coldwater, Ohio Community Satisfaction Survey – April 2002, 12.

other Village officials were pleased with these results as evidence of the success of the Sidewalk Improvement Program. The program remains in place and, from the perspective of Village officials, is an important service provided by the Village. Thomas indicated that a significant number of Village residents are interested in walking as both a form of exercise and a recreational activity, making this a particularly valuable program for those residents. Furthermore, the Village is about to undertake the installation of additional sidewalks at the Village’s Memorial Park after receiving a $35,000 anonymous donation specifically earmarked for this purpose, further evidence of the importance of walking and sidewalks to Village residents.267 More generally, the Village sought to address two specific service areas, namely water and wastewater services and street and road maintenance, because the survey results showed the greatest levels of dissatisfaction with these services among citizens. Slightly more than twenty percent (20.5%) of the survey respondents were either dissatisfied or very dissatisfied with water and wastewater services, while 27.9% of survey respondents were either dissatisfied or very dissatisfied with street and road maintenance.268 Dissatisfaction with water and wastewater services came as no surprise to Village officials, as they were painfully aware of a variety of problems with the water distribution system in the Village. These results were a contributing factor to the Village undertaking a significant capital improvement project to construct a new water plant and distribution system. The Village is spending upwards of $3,000,000 to upgrade the water system

267 Thomas, Eric. Personal Interview. 2 June 2005. 268 Center for Public Management and Regional Affairs – Miami University. Village of Coldwater, Ohio Community Satisfaction Survey – April 2002, 11.

in the Village, which should alleviate most of the water system problems identified in the survey.269 Street and road maintenance was another service that Village officials sought to improve based upon the survey results. Thomas indicated that most of the dissatisfaction with this service revolved around the lack of timely pothole repairs and streets and alleys that were overdue to be resurfaced. According to Thomas, street repairs and resurfacing projects had been routinely delayed or cancelled due to budgetary cutbacks in the years immediately prior to the survey. After reviewing the results of the survey in regard to street and road maintenance, Village officials made improving this service a priority. First, the budgetary issues needed to be resolved before a strategy could be devised to improve the service. Initially, the Village began to tap into its financial reserves to allocate more money to the Public Works Department. Additionally, as Village revenue streams improved, the Public Works Department was allocated even more money.270 The strategy employed by the Village, once additional money had been allocated for street and road maintenance, was to begin to repair and resurface the most utilized streets in the Village. These high-visibility streets included Main Street and Second Street, as well as several other portions of streets that serve as connectors to the highways running through the Village.271 Village officials made a concerted effort to address the most heavily traveled streets first. While many of these streets were the most in need of repair, Village officials also received the ancillary benefit of Village residents seeing progress and improvement being made with this service. According to Thomas, since 2002, the Village has caught up on the repair

269 Thomas, Eric. Personal Interview. 2 June 2005. 270 Ibid. 271 Ibid.

and resurfacing of the streets and alleys most in need, and is back on track adhering to its normal repair and resurfacing schedule.272 The survey also asked respondents to identify which public nuisances, if any, they believed the Village had not adequately addressed. The survey results showed that unattended pets were the most frequently cited public nuisance that needed to be better addressed by the Village. Thomas indicated that the Village has attempted to increase enforcement of the Village ordinance on unattended pets by placing cages, when practical, around the Village as necessary to capture animals. Thomas admitted that this is a problem that the Village is unlikely to completely solve, but one on which it could certainly demonstrate improvement to its residents.273 The Village also took action on two other matters related to pets. The Village started an education campaign to increase awareness and disseminate information to residents concerning the Village ordinance requiring owners to clean up fecal matter deposited by their pets. According to Thomas, emphasizing the requirements of this ordinance has helped the Village cut down on the problems associated with pet waste. Another pet-related issue addressed since the 2002 survey is that the Village has updated its animal ordinance to place restrictions on the keeping of exotic or vicious animals as pets due to safety concerns.274 While the survey results showed that lawn heights and vacant lots were not identified near the top of the list of nuisances not being adequately addressed by the Village, the Village still sought to take action on this issue. Village officials have been concerned about the appearance of the Village and have wanted property owners to keep their property maintained in terms

272 Ibid. 273 Ibid. 274 Ibid.

of weed and lawn height. The Village has increased enforcement of its current ordinance regarding weed and lawn height and has begun to mow property if the owners do not comply with the ordinance.275 Thomas was also asked to shed some light on whether questions appeared on the survey that asked residents for their opinions on decisions/policies that the Village was currently deliberating, discussing, or considering implementing. Thomas indicated that when he took over as Village Manager, there were no specific issues being discussed by the Village Council that were impacted by the survey results. However, there was a broader ongoing issue that the survey results could have impacted.276 The survey collected data on residents’ preferences for growth and development within the Village. Nearly seventy percent (68.3%) of respondents wanted the Village to “pursue moderate growth” over the next five years.277 No specific strategies or decisions needed to be implemented in order for Village officials to meet this desire expressed in the survey results. According to Thomas, the Village had already adopted a comprehensive land use plan that stressed measured and controlled growth. Thomas indicated that Village Council instructed him to review the Village’s comprehensive land use plan to make sure he understood the type of growth and development that the Village was interested in pursuing. Since that time, the survey results on the issue of growth and development have served as a reminder to Village officials

275 Ibid. 276 Ibid. 277 Center for Public Management and Regional Affairs – Miami University. Village of Coldwater, Ohio Community Satisfaction Survey – April 2002, 18.

whenever these types of issues have been brought to the attention of the Village.278 The final interview question sought to ascertain whether the survey findings led the Village to make decisions or implement policy in new areas because of the opinions expressed. One issue area that the Village of Coldwater began to explore based on the results of the survey was cable television service within the Village. The Village has had a cable television franchise agreement in place with Adelphia Cable. Thomas noted that, while residents were not asked to specifically rate cable television service on the survey, a few respondents took the time to write in responses related to the cable television service provided by Adelphia Cable. This served to put Village Council on notice that there might be concerns with the service provided and the rates charged by Adelphia.279 In the three years since the survey was conducted, Thomas noted, there has been an increasing number of complaints from residents about Adelphia Cable, so much so that Village Council began to pursue strategies designed to encourage competition for this service within the Village. To that end, the Village enticed a start-up cable television company (Hometown Cable, Inc.) to come to the Village so that residents would have an alternative to Adelphia Cable for their television service. While there was no commitment or guarantee of funds from the Village to Hometown Cable, the Village did assist the cable start-up by leasing it office space as well as land on which to erect an antenna for broadcasting television signals.280 While Hometown Cable has not yet begun to deliver services, it is expected to begin broadcasting and connecting homes before the

278 Ibid. 279 Ibid. 280 Ibid.

end of 2005 and will be negotiating a franchise agreement with the Village of Coldwater.281

Analysis

Based on the data collected in the personal interview, where does the Village of Coldwater community opinion survey appear on the typology presented in Chapter 2? Of primary concern in selecting the appropriate categorization is the primary reason that the Village conducted the survey. For the Village of Coldwater, the purpose of conducting a community opinion survey was to serve as the second observation in an ongoing benchmarking exercise. Additionally, the survey was conducted to identify the needs of the community, which became particularly salient given the change in administrative leadership that occurred during the community survey project. The survey afforded the new Village Administrator the unique opportunity to have data available to help set the Village’s policy agenda. In terms of specific decisions made by the Village of Coldwater based on the survey results, Thomas identified several decisions made by the Village as a direct result of the survey. The Village asked residents to evaluate several current policies and programs within the Village. Due to the results of the survey, many of these policies and programs were redesigned or adjusted. The Village had implemented a Sidewalk Improvement Program, and the feedback on the Program showed that residents were generally positive about it. The Village also received evaluative data on its water distribution system. The survey data showed

281 Lawrence, Betty and Janie Southard. "Coldwater, St. Henry To Get Cable Competitor." The Daily Standard. March 29, 2005.

that residents had some significant problems with the system, and this dissatisfaction was a driving force behind the Village taking on a multi-million-dollar project to upgrade the system. Furthermore, the data returned on street and road maintenance also showed some dissatisfaction with street resurfacing. The Village had fallen behind in its resurfacing schedule and made it a priority to catch up on the resurfacing that was overdue. Village Council also made some changes to deal with problems identified with the zoning code. Increased enforcement, along with some adjustments to the Village’s pet ordinance, were the strategies used to deal with these problems. Additionally, the survey results spurred Village Council to deal more aggressively with property on which the weed and lawn height exceeded what was allowable under the zoning code. While it did not result in any formal action by Council, the Village’s comprehensive land use plan was reviewed to ensure that it was in line with residents’ opinions towards growth and development in the Village. Finally, problems with the only cable television provider in the Village led to the Village working to attract, and helping to establish, a start-up cable company to provide competition in the Village. This was a wholly new policy area that the Village had not expected to be placed on its policy agenda as a result of the survey. Based on these factors (the primary reason for conducting the survey and the multiple decisions made to alter existing policy), it seems that the Village of Coldwater community opinion survey best falls into the category of a Decisionistic Survey. The survey enabled Village officials to evaluate existing policies and activities and take immediate action to alter them to better meet the desires and needs of their residents. Multiple decisions were made in a variety of issue areas. For these

reasons, the Village of Coldwater community opinion survey is categorized as a Decisionistic Survey.

Hanover Township (Butler County), Ohio – 2003

Hanover Township (Butler County), Ohio undertook a community opinion survey project, with the assistance of the Center for Public Management and Regional Affairs, in 2003. The survey was administered in October 2003 and the results were presented to Township officials in February 2004. Township Trustee Timothy Derickson was interviewed for this research in 2005 to ascertain how the survey results have been used in the approximately eighteen months since the survey was administered. Initially, Derickson was asked to identify the primary reason for conducting a community survey as part of a strategic planning process. Derickson responded that the primary reason that Hanover Township pursued a community opinion survey project was that the Trustees agreed that they needed “to put the finger on the pulse of the residents.”282 Derickson elaborated on this reason, stating that the Trustees wanted to collect the opinions of township residents to see if they matched the opinions of the three elected trustees. Having this data on hand was very important to Derickson, who wanted to make sure that township officials had some direction and a sense of what areas the Township should be committing time, money, and resources to. In order to consider the community opinion survey project successful, Derickson felt that it was important that all three of the elected Trustees be committed to following through on using the

282 Derickson, Timothy. Personal Interview. 19 April 2005.

survey results to inform short-term decision making as well as long-term planning.283 Derickson expressed a similar sentiment as the survey was being conducted back in 2003. Derickson stated: “The objective of the survey is to find out what residents like and dislike about township life, because, of course, they ultimately fund all services.”284 He went on to add: “…we must be frugal and ensure we are spending money in areas that our residents feel are important.”285 Clearly, Hanover Township officials were interested in having data at hand to plan and expend township resources on policies, programs, and activities that were identified as important by the survey respondents. It should also be noted that Hanover Township had previously conducted a community opinion survey project in 1992 with the assistance of the Center for Public Management and Regional Affairs. Only one of the three current Township Trustees (Michael Mignery) has remained in office since the first community opinion survey project. Derickson said that Mignery spoke very highly of the 1992 survey project and the utility of the data generated by that survey. He encouraged the trustees to conduct another such survey, particularly in light of how drastically the Township had changed in the intervening eleven years.286 As for secondary reasons for Hanover Township to pursue a community opinion survey project, Derickson indicated that he was personally interested in conducting a survey because he was serving his first term in office as an elected Trustee. Derickson was first elected to office in

283 Ibid. 284 Young, Nancy. "Hanover Township taking the pulse of its residents." The Cincinnati Enquirer. Thursday, October 23, 2003. 285 Ibid. 286 Derickson, Timothy. Personal Interview. 19 April 2005.

2000, and being relatively new to the position of Trustee, he wanted to have some baseline data on the township in order to help guide his thinking on issues and what he should be focusing on as a Trustee.287 Derickson pointed out that while he had personal reasons to collect this baseline data, he felt that the data would be useful to the Township in general, as it would provide some benchmarks for assessing service delivery. It is the goal of the Trustees to periodically collect survey data in order to compare benchmarks over time. It is Derickson’s hope that the 2003 survey will be the first in a series of benchmark observations collected over the coming years.288 Furthermore, Derickson indicated the trustees were also partially motivated by a single issue that had been debated and discussed by Township officials, namely significant changes to services provided by the Township. Hanover Township currently contracts with the Hanover Township Volunteer Fire Department for fire protection and emergency medical services for the township. There had been recent discussion of the potential need in the future to increase the amount of fire protection and emergency medical services due to growing demands from township residents. Much of that discussion centered on the probable need to shift from a volunteer-provided service to a Fire Department staffed by full-time employees. The thinking on the part of Township officials was that such a significant change in the method of service delivery could be quite costly. One option to fund such a change in service delivery would be to ask township voters to pass an operating levy specifically for fire and

287 Ibid. 288 Ibid.

emergency medical services.289 The survey, then, would be a good vehicle for circulating this issue among residents. The next question asked Derickson which questions appeared on the survey that were designed to inform or educate residents on decisions already made by Hanover Township. Derickson indicated that there were no questions on the survey that fit the criteria of decisions already made by the township. However, there were questions on the survey that were designed to be informational in nature on issues that the Township might have to face in the next several years as the Township continues to grow and change.290 The first such question was the aforementioned issue related to fire protection and emergency medical services. While this was not an issue that would need to be decided in the near future, the Trustees thought the survey could serve to educate and inform residents of the potential for increasing and improving fire service. The specific question asked respondents: “Do you think the Township should consider creating a paid, full-time fire/emergency medical services department in the next three to five years?” Over one-third of respondents (36.2%) thought that the Township should consider this change to the way fire/emergency medical services are provided by the Township, but 42.1% did not think this issue should be considered in that timeframe.291 A similar question appeared on the survey in regard to police protection. Currently, Hanover Township contracts with the Butler County Sheriff’s Office for police protection, but the Trustees wanted to gauge the possible interest in the Township creating a paid, full-time police department

289 Ibid. 290 Ibid. 291 Center for Public Management and Regional Affairs – Miami University. Hanover Township Community Survey: October/November 2003, 20-21.

in the next three to five years. The majority of respondents (55.1%) did not think the Township should consider this issue in the next three to five years, while 25.5% were in favor of considering a paid, full-time police department.292 This finding seemed consistent with the data on respondents’ overall satisfaction levels with the police protection currently provided by the Butler County Sheriff’s Office. Respondents were asked to indicate their level of agreement with the statement: “I am satisfied with the current level of police protection provided to the Township.” Over three quarters of respondents either agreed (52.0%) or strongly agreed (24.1%) with the statement.293 Satisfaction levels that high may well dampen a respondent’s interest in changing the way in which police protection is currently provided. A third issue area in which a question served an informational purpose was zoning. The survey informed residents that under state law, townships could adopt their own zoning code instead of following their county’s zoning code. Hanover Township currently uses the Butler County zoning code, but the Trustees wanted to gauge respondents’ opinions on whether they thought Hanover Township should adopt its own zoning code. Derickson admitted that this was not an issue that the Township Trustees had previously considered (at least during his tenure in office). However, it was an issue about which they were interested in informing residents, particularly with regard to Ohio law on the matter.294 The results showed that 39.3% of respondents were in favor of the Township adopting its own zoning code, while 32.1% were not in favor of that course of action. The

292 Ibid, page 20. 293 Ibid, page 18. 294 Derickson, Timothy. Personal Interview. 19 April 2005.

remaining 28.5% of respondents expressed no opinion on the issue.295 While the Township Trustees were not ready to act on any of these three issues, these were important questions to ask of residents in order to educate and inform the public on issues that the Township may be facing in upcoming years. The next question asked Derickson to identify which questions appeared on the survey that asked residents to evaluate decisions/policies that the Township had already made/implemented. Derickson identified three areas on the survey that were designed to evaluate decisions/policies already in place in the Township. The first such area related to the Hanover School Memorial Park. The Trustees had recently enhanced the playground equipment at the Park and wanted to measure respondents’ satisfaction levels with the Park.296 The survey asked: “In general, how satisfied are you with the recreational facilities at the Hanover School Memorial Park?” While the question did not specifically address the new playground equipment, Township officials wanted to get a broader sense of satisfaction with the Park in general. The survey findings showed that over half of the survey respondents were either very satisfied (24.9%) or satisfied (29.5%) with the recreational facilities at the Park. However, it should be noted that a previous question on the survey, designed to gauge average usage of the Park, showed that nearly 70% of respondents indicated “none” when asked: “On average, how many times in a month do you or your family use facilities at the Hanover School Memorial Park?”297 While there was satisfaction with the recreational facilities at the Park, that data was

295 Center for Public Management and Regional Affairs – Miami University. Hanover Township Community Survey: October/November 2003, 21. 296 Derickson, Timothy. Personal Interview. 19 April 2005. 297 Center for Public Management and Regional Affairs – Miami University. Hanover Township Community Survey: October/November 2003, 23.

somewhat muted by the fact that a majority of respondents do not regularly use those facilities. Another question that was designed for evaluative purposes focused on the sources from which respondents would prefer to receive information concerning Township news, meetings, and events. The vast majority of respondents (72.5%) indicated that they would prefer to receive a Township newsletter to inform them about Township news, meetings, and events.298 Derickson noted that the Trustees had published one newsletter since his election to office. The survey results were a clear indication that residents would like to see more newsletters published by the Township. Shortly after the survey results were presented to the Trustees, a newsletter summarizing the survey results was mailed via the United States Postal Service to all Township residents in April 2004. The Trustees are currently working on a 2005 edition of the Township newsletter.299 A final area of policy evaluation that the survey provided information about was police and fire/emergency medical services. Derickson noted that the Township contracts for these services with the Butler County Sheriff’s Office and the Hanover Township Volunteer Fire Department, respectively. Both of these contracts had been recently renewed, and Township officials were interested in gathering some service delivery benchmarks for each of these services to help inform future contract negotiations with each entity.300 The survey results, as previously noted, showed that over three quarters of respondents either agreed (52.0%) or strongly agreed (24.1%) with the statement that they were satisfied with the

298 Ibid, 26-27. 299 Derickson, Timothy. Personal Interview. 19 April 2005. 300 Ibid.

current level of police protection provided to the Township.301 Likewise, just over 70% of respondents either agreed (51.7%) or strongly agreed (18.8%) with the statement that they were satisfied with the current level of fire protection provided by the Township.302 A similar share of respondents expressed satisfaction with emergency medical services, as 47.8% of respondents agreed and 18.8% of respondents strongly agreed with the same statement applied to emergency medical services.303 Derickson indicated that the Trustees were pleased with satisfaction levels being this high for all of these services.304 The next question asked Derickson whether questions appeared on the survey that asked residents for their opinions on decisions/policies that the Township was currently deliberating, discussing, or considering implementing. Derickson identified two such issues that were impacted by the survey results: parks and recreational facilities and public nuisances (specifically junked cars). Hanover Township officials had been discussing expanding the amount of land dedicated to parks and recreational facilities in the Township. There was some sentiment among the Trustees that they should be focusing their efforts on acquiring land for a park or parks at new locations in the Township rather than expanding the Hanover School Memorial Park. Derickson described the idea as one of establishing small pocket or neighborhood parks in various locations throughout the Township. This would entail land acquisition on the part of the Township

301 Center for Public Management and Regional Affairs – Miami University. Hanover Township Community Survey: October/November 2003, 18. 302 Ibid, 18. 303 Ibid, 19. 304 Derickson, Timothy. Personal Interview. 19 April 2005.

and the Trustees wanted to make sure there was support for such expenditures before agreeing to purchase any land.305 The survey results, however, painted a much different picture than what Township officials were originally considering. The survey asked: “Would you like to see more public parks and recreational facilities developed within the Township?” The results showed that over half of all respondents (53.5%) wanted to see more public parks and recreational facilities in the Township.306 That seemingly answered the first part of the question for the Trustees: there was support for additional park land acquisition. However, a follow-up question asked respondents: “If you answered ‘yes’ to Question #23, what facilities would you like to see developed within the Township?” The two most frequently cited choices were: fitness trail/hiking path/walking path (42.0%) and bike path (33.5%).307 These two choices were facilities that would require a sizable tract of park land to be procured. According to Derickson, these results effectively eliminated the idea that pocket or neighborhood parks would suffice to meet the recreational demands expressed on the survey. Instead, Township officials turned their attention to securing larger tracts of land that would be more suitable for the trails and paths that respondents indicated they would like to see developed within the Township.308 Derickson went on to describe the course of action that Township officials ultimately chose to pursue in regard to additional parks and recreational facilities. According to Derickson, the Township sought to

305 Ibid. 306 Center for Public Management and Regional Affairs – Miami University. Hanover Township Community Survey: October/November 2003, 24. 307 Ibid, 24. 308 Derickson, Timothy. Personal Interview. 19 April 2005.

acquire a tract of land large enough to accommodate trails and paths, and this decision was “one-hundred percent driven by the survey results.”309 In late 2004, the Township was offered the opportunity to purchase farm land adjacent to the Hanover School Memorial Park. When the offer was made, the Trustees made it a priority to pursue the acquisition of this land based on the results of the survey on this issue. The Trustees were able to strike a deal and purchased a total of fourteen acres adjacent to the current Park. The fourteen acres were split into two tracts: one of six acres and one of eight acres.310 As Derickson went on to explain, acquiring land was only a small step toward developing the trails and paths requested on the survey. Since the land acquisition, the Township has hired an architectural firm to develop a master park plan covering the entire fourteen acres of land. The centerpiece of the master park plan is the inclusion of a pathway for bikes and walking/hiking. Also included in the plan will be multiple fitness stations located at various points along the path. While the plan is being developed by the architectural firm, the Township is actively seeking grant funding to help pay for the implementation of the first phase of the master park plan, which would amount to the establishment of the path and landscaping. The Township is currently seeking a grant from the Ohio Department of Natural Resources (ODNR) to help offset the costs associated with the new park facilities. Derickson reiterated that this was a policy area in which the Township relied heavily on the results of the survey in order to

309 Ibid. 310 Ibid.

formulate its plan for expanding parks and recreational facilities in the Township.311 Another issue that the Township was currently deliberating, discussing, or considering, and to which the survey brought attention, fell within the broad context of zoning: junked cars as a public nuisance. A survey question asked respondents to identify: “Which of the following public nuisances, if any, do you believe Hanover Township has not adequately addressed?” The most frequently cited public nuisance was “junked cars,” with 32.1% of respondents identifying it.312 Derickson explained that junked cars are covered under the Butler County Zoning Code, but that the junked car issue is a matter of enforcement. According to Derickson, the ordinance is only as good as the enforcement behind it, which means identifying the junked cars and notifying the owners of the violations. Hanover Township decided to take a more proactive strategy to rid the Township of junked cars through more aggressive enforcement. Derickson indicated that all three of the elected Trustees now have a heightened awareness of junked cars and are on the lookout for these nuisance vehicles. Furthermore, the Trustees have contacted Butler County officials in an attempt to make junked cars a priority issue to be addressed swiftly by the County.313 While the Township had been trying to deal with the issue of junked cars for some time, the survey results framed the issue as one that needed the immediate attention of Township officials.

311 Ibid. 312 Center for Public Management and Regional Affairs – Miami University. Hanover Township Community Survey: October/November 2003, 22. 313 Derickson, Timothy. Personal Interview. 19 April 2005.

Finally, Derickson was asked whether the survey findings led the Township to make decisions or implement policy in new areas because of the opinions expressed. One of these new issue areas revolved around the Township’s communication efforts with residents and the sources from which respondents would prefer to receive information concerning Township news, meetings, and events. The third most frequently selected source of information was a Township Internet web site, which was identified by 23.2% of respondents.314 Derickson said that he had personally considered implementing a policy to establish an official Township web site, but had not brought the issue up with the other Trustees until the survey results were tabulated. Derickson explained that the survey results pushed the need for a Township web site to the forefront of his mind and prompted him to bring the item up for discussion as an agenda item at a Trustee meeting.315 Derickson said that the Trustees fully supported the establishment of an official Township web site, and www.hanovertownship.net came into existence in September 2004. At this point, the web site contains only static content, as Township officials are just becoming comfortable with the site and deciding how best to use it as a tool to communicate with residents.316 It should be noted that the development of an official Township web site was a project in which the Township received assistance from the Center for Public Management and Regional Affairs (CPMRA). The CPMRA helped design the web site and currently hosts the site on its server as part of the services the CPMRA offers to townships at no cost. It must be mentioned

314 Center for Public Management and Regional Affairs – Miami University. Hanover Township Community Survey: October/November 2003, 26-27. 315 Derickson, Timothy. Personal Interview. 19 April 2005. 316 Ibid.

that had this free service not been available, Hanover Township might not have decided to pursue this matter when it did.

Analysis

Based on the data collected in the personal interview, where does the Hanover Township community opinion survey appear on the typology presented in Chapter 2? In order to select the appropriate categorization, the primary reason that the Township conducted the survey must be identified. For the officials of Hanover Township, it seemed that the purpose of conducting a community opinion survey was to collect information about what residents wanted from the Township, so that Township officials would know where and how to focus their resources and efforts. This was particularly important considering that Derickson was relatively new to elected office. It afforded him an opportunity to better understand the opinions of Township residents. As for secondary reasons to conduct the survey, the Township was interested in collecting benchmark data on service delivery as well as in addressing a single issue (the possible request for a fire levy in the future). In terms of specific decisions made by the Township based on the survey results, Derickson identified several decisions made by the Trustees as a direct result of the survey. The Township had been discussing and deliberating various options for increasing parks and recreation land in the Township. The survey results showed that residents’ opinions on additional parks and recreational facilities were different from the options that the Trustees had been discussing. The survey data refocused the issue for the Trustees, and they sought to expand the acreage of the current park to accommodate the facilities requested by residents. Furthermore, the

Trustees took action to deal more effectively with the public nuisance of junked cars. The decision was to identify violations more aggressively and to enforce the zoning code provisions on junked cars. Finally, the Trustees took action in a wholly new area by developing an official Township web site as a means of improving Hanover Township’s communication efforts with its residents. Based on these factors (the primary reason for conducting the survey and the several discrete decisions that were made), it seems that the Hanover Township community opinion survey best falls into the category of a Decisionistic Survey. The survey enabled Township Trustees to take immediate action on several issues and helped to clarify the policy options in those issue areas. For these reasons, the Hanover Township community opinion survey is categorized as a Decisionistic Survey.

5 CONCLUSION

As the concept of strategic planning becomes more and more important to local government officials, so does the use of community opinion surveys as a technique to gather data from citizens to inform this process. At their essence, community opinion surveys are an integral part of decision-making, both in terms of immediate, short-term decisions and in terms of long-range planning for the future. The data returned by such surveys are particularly critical to local government officials from small, non-metropolitan communities who must serve a dual role as both legislator and policy analyst. As policy analysts, they must make sense of the data and devise policy options that fit the survey findings. Ultimately, these are the same options that elected officials will consider when asked to pass legislation. The research presented here attempts to provide a typology that can be used to categorize or classify community opinion surveys in terms of how the data collected by the survey is ultimately utilized by the elected and appointed officials in the community. The typology includes four types of surveys: Informational, Strategic, Decisionistic, and Symbolic. Each of the four types of surveys has its own characteristics that serve to define the categorization. The characteristics are based upon what decisions are made as a direct result of the survey. The research question was an attempt to identify the factor or factors that determine how the results of a community opinion survey may or may not be used by local governments. The first factor to identify and understand is the primary reason for conducting the community opinion survey. In each of the four case illustrations, the original intent of the survey was carried out based on the kinds of decisions that resulted. The

only variation from original intent occurred with the Village of Coldwater community opinion survey. The intent was to carry out a second benchmarking observation along with identifying needs within the community. That variation can be explained, in large part, by the change in administrative leadership that occurred in the Village of Coldwater during the survey project. Another factor revolves around the types of questions that appear on the community opinion survey. Are questions designed from a policy evaluation perspective, asking for opinions on past decisions? Are questions designed to measure awareness and/or inform citizens about past decisions? Does the survey contain questions that collect information on issues that are currently being discussed by the community? Does the survey contain questions that ask for opinions on the future of the community? The survey questions, and the purpose behind each question, are an important factor in determining how the results of the survey will be used. From the perspective of the public policy cycle, the various types of questions relate to the various types of policy agendas that exist. Some policies (those already implemented and being evaluated) may have to be redesigned or terminated based on the survey results. Issues that are currently on the action agenda (being discussed and deliberated) may be clarified or ultimately decided and moved along to the implementation stage. The survey results might also serve as the driving force to place entirely new issues onto the action agenda. Finally, data collected on long-range planning issues would result in issues being placed on the discussion agenda – the place for issues to be planned for but not necessarily decided upon immediately.

After the survey has been conducted, the actual decisions that result from the survey can be used to further identify the appropriate category for the survey. Are decisions made that redesign or terminate existing policies? Are final decisions rendered on issues that were being discussed and deliberated during the survey process? Are decisions made on new issues that were brought to the forefront because of the survey results? Finally, did the survey result in any decisions being made at all? The survey may have been conducted for the purpose of collecting benchmark data on service delivery or to evaluate existing policies that were found to be satisfactory based on the survey results. In either of these cases, the result would show no decisions being made. In these cases, the community opinion survey can be said to have resulted in the local government staying with the status quo. The first hypothesis was that the primary reason for conducting a community opinion survey as part of a strategic planning process is the main determinant in how survey results are ultimately used. From the case illustrations presented here, this hypothesis held true. In each of the four case illustrations, the stated primary reason for conducting the community opinion survey was borne out by how the survey results were utilized and the types of decisions that resulted. Only in the case of the Village of Coldwater community opinion survey was there minor variation in how the survey results were used as compared to the original intent behind the survey. In this case illustration, the intervening variable of a change in administrative leadership is what caused the variation. The second hypothesis asserted that community opinion surveys containing questions designed to elicit opinions pertaining to long-range growth and development would lead to the survey results being used to create a long-range strategic implementation plan. While each of the surveys in the

four case illustrations presented here contained questions that asked for residents’ opinions on the future growth and development of their community, only one of the four local governments actually implemented a formalized strategic plan. The data collected from the Village of Williamsburg’s community opinion survey became an integral component of the Village’s long-range strategic planning initiative. Moreover, the survey results also became part of Clermont County’s long-range strategic plan for the SR 32 Corridor. In this case illustration, the creation of the long-range strategic plan was the primary reason to conduct the survey. In the other three case illustrations, a formalized long-range strategic plan was not the primary reason to conduct the survey and was not formulated after the survey results were tabulated. At best, these remaining three case illustrations show that the results helped public officials to begin to think about and discuss the future of their communities and to plan accordingly, but without utilizing a formalized strategic planning process. Therefore, this hypothesis did not hold true. There are some limitations to the research presented here that must be addressed. One of the characteristics of an Informational Survey is that the survey data that is collected can be used to establish service delivery benchmarks for the community. It can be argued that all surveys provide the capacity for benchmarking because they capture current attitudes and opinions about essential services. Not all local governments will use a community opinion survey in a traditional benchmarking endeavor, primarily because they do not collect follow-up data and therefore do not have multiple observation points of satisfaction levels with services. Even so, all surveys that collect benchmark information could technically be classified as an Informational Survey.

Another limitation to this research is that it is possible for some surveys to meet the criteria of multiple categories in the typology. As was demonstrated in the Village of Williamsburg community opinion survey case illustration, some surveys can demonstrate characteristics for which a case could logically be made for placing that survey into more than one category. The fact that surveys can serve multiple purposes should not come as a surprise. The opportunity to collect survey opinion data from citizens does not come along often, and it is not advisable to survey citizens repeatedly within a short timeframe. Otherwise, citizens may develop survey fatigue and lose interest in responding to surveys. The time, effort, and resources involved in conducting a community opinion survey (particularly in communities with scarce resources) also force communities to make the most of the opportunity to collect data and information. So when local governments do decide to survey citizens, it is in their best interest to collect as much data as possible and therefore maximize the utilization of that data. The push to collect as much data as possible, because another chance to do so will not come soon, results in surveys serving multiple purposes. If the results of any survey can be used in a variety of ways to influence a variety of decisions, what is the final arbiter for selecting one category in the typology over another? The answer would seem to point toward the original intent of the survey. Ascertaining what local government officials cited as the primary reason for conducting the survey can be the determining factor when assigning a particular survey to a category. As this research showed (albeit with only four case illustrations), the way in which community opinion surveys are used tends to stay true to the intended purpose.
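To make this classification heuristic concrete, the sketch below shows one way it could be operationalized in code. It is a minimal illustration only, not part of the original research design: the category rules, the decision counts assigned to each case, and the threshold used for a Decisionistic classification are assumptions introduced here for exposition.

```python
# A minimal sketch (illustrative only) of the typology's classification heuristic:
# the stated primary reason for the survey, checked against the kinds of decisions
# that actually resulted. All counts and rules below are assumptions for exposition.

from dataclasses import dataclass


@dataclass
class SurveyCase:
    name: str
    primary_reason: str              # e.g., "benchmarking", "long-range planning", "needs identification"
    decisions_existing_policy: int   # decisions that redesigned or terminated existing policies
    decisions_pending_issues: int    # decisions on issues already being deliberated
    decisions_new_issues: int        # decisions in wholly new policy areas
    feeds_formal_strategic_plan: bool = False


def classify(case: SurveyCase) -> str:
    """Assign one of the four typology categories using the heuristic sketched above."""
    total_decisions = (case.decisions_existing_policy
                       + case.decisions_pending_issues
                       + case.decisions_new_issues)

    # Results that feed a formalized long-range plan mark a Strategic survey.
    if case.primary_reason == "long-range planning" and case.feeds_formal_strategic_plan:
        return "Strategic"
    # Multiple concrete decisions across issue areas mark a Decisionistic survey
    # (the threshold of two is an assumption, not drawn from the dissertation).
    if total_decisions >= 2:
        return "Decisionistic"
    # Benchmark data collected with little or no decision impact marks an Informational survey.
    if case.primary_reason == "benchmarking":
        return "Informational"
    # A survey conducted with no real intention of acting on the results is Symbolic.
    return "Symbolic"


if __name__ == "__main__":
    cases = [
        SurveyCase("Village of Williamsburg", "long-range planning", 0, 0, 0,
                   feeds_formal_strategic_plan=True),
        SurveyCase("Oxford Township 2000", "benchmarking", 0, 1, 0),
        SurveyCase("Village of Coldwater 2002", "benchmarking", 4, 0, 1),
        SurveyCase("Hanover Township 2003", "needs identification", 0, 2, 1),
    ]
    for case in cases:
        print(f"{case.name}: {classify(case)}")
```

Under these assumed counts, the sketch assigns the three cases analyzed in this chapter to the same categories reached qualitatively above, and treats the Village of Williamsburg survey as Strategic on the strength of its tie to a formalized long-range plan. Its only purpose is to show that the heuristic reduces to a small, ordered set of decision rules in which the stated primary reason and the decisions that result jointly determine the category.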

Based on the research presented here, it would seem that future research could be done to extend this typology and apply it to a wider sampling of community opinion surveys. However, it is the dual role that certain elected officials must fill, serving as both policy-maker and policy analyst, that is of particular interest. Without much in the way of formalized policy analysis training, how do elected officials collect and process data and information regarding policies, programs, and activities? How do they generate the policy options that are considered by the legislative body of which they themselves are members? And how do they select the policy option that is eventually chosen as the final decision to be implemented? Developing answers to these questions would go a long way towards a better understanding of how local government officials approach and fulfill this dual role.

APPENDIX A

Andrew M. Dudas
Department of Political Science
2 Harrison Hall - Miami University
Telephone: 513.529.6959 / Fax: 513.529.6939

PERSONAL INTERVIEW QUESTIONNAIRE

Within the past 5 years, the Center for Public Management and Regional Affairs at Miami University was asked by your jurisdiction to conduct a survey of township households to assess a variety of issues and services that affect residents as part of a strategic planning initiative in your community. I am conducting follow-up research for my doctoral dissertation using that community survey project as a case illustration to examine how survey results are translated into decisions. Specifically, I would like to collect information about decisions that have been made by your jurisdiction since the completion of the survey project and about any policies that have been implemented as a direct result of the survey findings. With your permission, I would like to ask you several questions as part of this research endeavor.

1) What was the primary reason for conducting a community survey as part of a strategic planning process?

2) What, if any, were the secondary reasons for conducting a community survey as part of a strategic planning process?

3) Did questions appear on the survey that were designed to inform or educate the residents on decisions/policies that had already been made/implemented by your jurisdiction? If so, what were those questions/issues?

4) Did questions appear on the survey that asked residents to evaluate (provide their opinions) on decisions/policies that your jurisdiction had already made/implemented? If so, what were those questions/issues?

5) Did questions appear on the survey that asked residents for their opinions on decisions/policies that your jurisdiction was currently deliberating, discussing, or considering implementing? If so, what were those questions/issues?

6) What specific courses of action did your jurisdiction take on those issues that were currently being deliberated, discussed, or considered for implementation at the time of the survey? How did the survey findings help to inform those decisions?

7) Did the survey findings lead your jurisdiction to make decisions or implement policy in new areas because of the opinions expressed? If so, what were those decisions?

Thank you for your time!

APPENDIX B

VILLAGE OF WILLIAMSBURG SURVEY INSTRUMENT


APPENDIX C

OXFORD TOWNSHIP SURVEY INSTRUMENT


APPENDIX D

VILLAGE OF COLDWATER SURVEY INSTRUMENT


APPENDIX E

HANOVER TOWNSHIP SURVEY INSTRUMENT


BIBLIOGRAPHY

Anderson, Lisa. 2003. Pursuing Truth, Exercising Power: Social Science and Public Policy in the Twenty-First Century. New York, NY: Columbia University Press.

Beckman, Norman. 1977. “Policy Analysis in Government: Alternatives to Muddling Through.” Public Administration Review 37 (3): 221-253.

Benest, Frank J. 1999. “Reconnecting Citizens with Citizens: What is the Role of Local Government?” Public Management 81 (2): 6-11.

Brewer, Garry D. and Peter DeLeon. 1983. The Foundations of Policy Analysis. Homewood, IL: The Dorsey Press.

Brody, Samuel D., Godschalk, David R., and Raymond J. Burby. 2003. “Mandating Citizen Participation in Plan Making: Six Strategic Planning Choices.” APA Journal 20 (2): 245-263.

Bryson, John M. 2004. Strategic Planning for Public and Nonprofit Organizations: A Guide to Strengthening and Sustaining Organizational Achievement. San Francisco, CA: Jossey-Bass Publishers.

______. 1996. Creating and Implementing Your Strategic Plan: A Workbook for Public and Non-Profit Organizations. San Francisco, CA: Jossey-Bass Publishers.

Bryson, John M. and William D. Roering. 1996. “Strategic Planning Options for the Public Sector.” Handbook of Public Administration. Ed. James L. Perry. San Francisco, CA: Jossey-Bass Publishers.

Burby, Raymond J. 2003. “Making Plans That Matter: Citizen Involvement and Government Action.” APA Journal 69 (1): 33-49.

Center for Public Management and Regional Affairs – Miami University. 2003. Hanover Township Community Survey: October/November 2003. Oxford, OH: Center for Public Management and Regional Affairs.

______. 2002. Ohio Municipal League – Mayors’ Association of Ohio, Leadership Training Academy Module V – Goal Setting and Team Building. Oxford, OH: Center for Public Management and Regional Affairs.

______. 2002. Village of Coldwater, Ohio Community Satisfaction Survey – April 2002. Oxford, OH: Center for Public Management and Regional Affairs.

______. 2000. Oxford Township Survey Project Final Report. Oxford, OH: Center for Public Management and Regional Affairs.

______. 1998. Village of Williamsburg, Ohio Community Survey Project – October 8, 1998. Oxford, OH: Center for Public Management and Regional Affairs.

CPMRA Internet Site: About the Center. Center for Public Management and Regional Affairs – Miami University. 29 July 2004.

Clark, Doug. 1998. “Healthy Cities: A Model for Community Improvement.” Public Management 80 (11): 4-8.

Deming, W. Edwards. 1986. Out of the Crisis. Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study.

Derickson, Timothy. Personal Interview. 19 April 2005.

Dluhy, Milan. 1981. “Policy Advice-Givers: Advocates? Technicians? Or Pragmatists?” New Strategic Perspectives on Social Policy. Ed. J. Tropman, M. Dluhy, and R. Lind. Elmsford, NY: Pergamon Press.

Dunn, William. 1994. Public Policy Analysis: An Introduction. Englewood Cliffs, NJ: Prentice Hall.

Dye, Thomas. 2005. Understanding Public Policy. Upper Saddle River, NJ: Pearson Prentice Hall.

Easton, David. 1965. A Systems Analysis of Political Life. New York: John Wiley and Sons.

Etzioni, Amitai. 1967. “Mixed Scanning: A ‘Third’ Approach to Decision-Making.” Public Administration Review 27 (5): 385-392.

Feldman, Martha S. 1989. Order Without Design: Information Production and Policy Design. Palo Alto, CA: Stanford University Press.

Fink, Arlene. 2003. The Survey Handbook. Thousand Oaks, CA: Sage Publications, Inc.

Fink, Arlene and Jacqueline Kosecoff. 1998. How to Conduct Surveys: A Step-By-Step Guide. Thousand Oaks, CA: Sage Publications, Inc.

Fitzgerald, Michael R. and Robert F. Durant. 1982. “Citizen Evaluations and Urban Management: Service Delivery in an Era of Protest.” Reaching Decisions in Public Policy and Administration. Ed. Richard D. Bingham and Marcus E. Ethridge. New York, NY: Longman, Inc.

Floden, Robert E. and Stephen S. Weiner. 1978. “Rationality to Ritual: The Multiple Roles of Evaluation in Governmental Processes.” Policy Sciences 9 (1): 9-18.

Fowler, Floyd J., Jr. 2001. Survey Research Methods. Thousand Oaks, CA: Sage Publications, Inc.

Gordon, Gerald L. 1993. Strategic Planning for Local Government. Washington, D.C.: International City/County Management Association.

Gupta, Dipak K. 2001. Analyzing Public Policy: Concepts, Tools, and Techniques. Washington, D.C.: Congressional Quarterly Press.

Hastak, Manoj, Mazis, Michael B., and Louis A. Morris. 2001. “The Role of Consumer Surveys in Public Policy Decision-Making.” Journal of Public Policy and Marketing 20 (2): 170-185.

Hatry, Harry P., Blair, Louis H., Fish, Donald M., Greiner, John M., Hall, John R., and Philip S. Schaenman. 1992. How Effective Are Your Community Services? Procedures for Measuring Their Quality. Washington, D.C.: The Urban Institute and International City/County Management Association.

Hedlund, Ronald D. 1982. “Survey Research in the Study of Administrative and Policy Problems.” Reaching Decisions in Public Policy and Administration. Ed. Richard D. Bingham and Marcus E. Ethridge. New York, NY: Longman, Inc.

Henry, Nicholas. 2004. Public Administration and Public Affairs. Upper Saddle River, NJ: Pearson / Prentice Hall Publishing.

Kretzmann, John P. and John L. McKnight. 1993. Building Communities From the Inside Out: A Path Toward Finding and Mobilizing a Community’s Assets. Evanston, IL: The Asset-Based Community Development Institute, Institute for Policy Research, Northwestern University.

Lawrence, Betty and Janie Southard. “Coldwater, St. Henry To Get Cable Competitor.” The Daily Standard. March 29, 2005.

Lefker, Mary Ann. Personal Interview. 28 April 2005.

Licari, Michael, McLean, William, and Tom W. Rice. 2005. “The Condition of Community Streets and Parks: A Comparison of Resident and Non-Resident Evaluations.” Public Administration Review 65 (3): 360-368.

Lindblom, Charles. 1959. “The Science of Muddling Through.” Public Administration Review 19 (2): 79-88.

Lindblom, Charles E. and David K. Cohen. 1979. Usable Knowledge: Social Science and Social Problem Solving. New Haven, CT: Yale University Press.

Malbin, Michael. 1980. Unelected Representatives: Congressional Staff and the Future of Representative Government. New York, NY: Basic Books.

Melkers, Julia and John Clayton Thomas. 1998. “What Do Administrators Think Citizens Think? Administrator Predictions as an Adjunct to Citizen Surveys.” Public Administration Review 58 (4): 327-334.

Meltsner, Arnold J. 1976. Policy Analysts in the Bureaucracy. Berkeley, CA: University of California Press.

Miller, Thomas I. and Michelle Miller Kobayashi. 2001. “The Voice of the Public: Why Citizen Surveys Work.” Public Management 83 (4): 6-9.

______. 2000. Citizen Surveys: How to Do Them, How to Use Them, What They Mean. 2nd Edition. Washington, D.C.: International City/County Management Association.

National Arbor Day Foundation Programs – Tree City USA. The National Arbor Day Foundation. 10 June 2005.

Patton, Carl V. and David S. Sawicki. 1986. Basic Methods of Policy Analysis and Planning. Englewood Cliffs, NJ: Prentice Hall.

Paul, Ellen Frankel and Philip A. Russo, Jr., Eds. 1982. Public Policy: Issues, Analysis, and Ideology. Chatham, NJ: Chatham House.

Peters, B. Guy. 1986. American Public Policy: Promise and Performance. Chatham, NJ: Chatham House Publishers.

Peterson, Robert A. 2000. Constructing Effective Questionnaires. Thousand Oaks, CA: Sage Publications, Inc.

Punch, Keith F. 2003. Survey Research: The Basics. Thousand Oaks, CA: Sage Publications, Inc.

Radin, Beryl. 2000. Beyond Machiavelli: Policy Analysis Comes of Age. Washington, D.C.: Georgetown University Press.

Rhoads, Steven E. 1978. “Economists and Public Policy Analysis.” Public Administration Review 38 (2): 112-120.

Shulock, Nancy. 1999. “The Paradox of Policy Analysis: If It Is Not Used, Why Do We Produce So Much of It?” Journal of Policy Analysis and Management 18 (2): 226-244.

Simonds, George. Personal Interview. 16 May 2005.

SR 32 Corridor Vision Plan. Clermont County Department of Community Planning and Development. 10 June 2005.

Stokey, Edith and Richard Zeckhauser. 1978. A Primer for Policy Analysis. New York, NY: WW Norton.

Stone, Deborah. 2002. Policy Paradox: The Art of Political Decision Making. New York, NY: WW Norton.

Svara, James H. 1985. “Dichotomy and Duality: Reconceptualizing the Relationship Between Policy and Administration in Council-Manager Cities.” Public Administration Review 45 (1): 221-232.

Szanton, Peter. 1981. Not Well Advised. New York: Russell Sage Foundation and The Ford Foundation.

“The Summer of George.” Seinfeld. By Larry David and Jerry Seinfeld. Perf. Jerry Seinfeld, Jason Alexander, Michael Richards, and Julia Louis-Dreyfus. NBC. 15 May 1997.

Thomas, Eric. Personal Interview. 2 June 2005.

Weimer, David L. and Aidan R. Vining. 2005. Policy Analysis: Concepts and Practice, 4th Ed. Upper Saddle River, NJ: Pearson Prentice Hall.

Weiss, Carol. 1989. “Congressional Committees as Users of Analysis.” Journal of Policy Analysis and Management 8 (3): 411-431.

______. 1972. “Utilization of Evaluation: Toward Comparative Study.” Evaluating Action Programs: Readings in Social Action and Education. Ed. Carol Weiss. Boston, MA: Allyn & Bacon.

White, Bob. “Oxford Twp. trustees study survey results.” The Oxford Press. Thursday, February 22, 2001. A3.

Whiteman, David. 1985. “The Fate of Policy Analysis in Congressional Decision-Making: Three Types of Use in Committees.” Western Political Quarterly 38 (2): 294-311.

______. 1995. Communication in Congress: Members, Staff, and the Search for Information. Lawrence, KS: University Press of Kansas.

Wildavsky, Aaron. 1979. Speaking Truth to Power: The Art and Craft of Policy Analysis. Boston, MA: Little, Brown.

Williams, Walter. 1971. Social Policy Research and Analysis. New York, NY: American Elsevier.

Wray, Lyle and Jody Hauer. 1997. “Performance Measurement To Achieve Quality of Life: Adding Value Through Citizens.” Public Management 79 (8): 4-8.

Young, Nancy. “Hanover Township taking the pulse of its residents.” The Cincinnati Enquirer. Thursday, October 23, 2003.
