Impact and Process Evaluation of the U.S. Department of Energy's Wind Powering America Initiative
Prepared for: Department of Energy, Office of Energy Efficiency and Renewable Energy
Prepared by: Navigant Consulting, Inc.
FINAL REPORT
May 2013
DOE/EE-0897

Acknowledgements

This study has benefited from the contributions of many individuals. Jeff Dowd of the U.S. Department of Energy’s (DOE) Office of Energy Efficiency and Renewable Energy (EERE) initiated the study. Navigant performed the evaluation under subcontract to Lawrence Berkeley National Laboratory (LBNL). Ed Vine and Yaw Agyeman, LBNL’s study managers, provided invaluable oversight and critical guidance that helped keep the project on track.

The Navigant study team comprised Frank Stern, principal investigator; Charlie Bloch, principal analyst; and a diverse team of analysts, researchers, writers, and advisors, including Lindsay Battenberg, Mark Bielecki, Rachel Dickenson, Terese Decker, Jennifer Hampton, Jan Harris, Jane Hummer, Julianne Meurice, Bill Provencher, Stuart Smoller, Rebecca Stoecklein, and Dan Violette.

Various industry stakeholders and current and former Wind Powering America (WPA) staff provided valuable guidance and input during the initial evaluation planning and data collection stages, including Jim Ahlgrimm, Dwight Bailey, Ian Baring-Gould, Stan Calvert, Phil Dougherty, Michele DesAutels, Larry Flowers, Randy Manion, Walt Musial, Brian Parsons, Brian Smith, Jennifer States, Amanda Vanega, and Ryan Wiser. The team particularly thanks Heather Rhoads-Weaver of eFormative Options for providing detailed small-wind capacity installation data, and the North Carolina Solar Center at North Carolina State University and the Interstate Renewable Energy Council for maintaining the Database of State Incentives for Renewables and Efficiency (DSIRE) website.

The following reviewers provided valuable input during the study. Those who reviewed the evaluation design and draft report offered advice that greatly helped the study team address the challenges of the research subject and evaluation design. The study team is grateful for their comments.

External reviewers:
Eric Blank, President and Co-Founder, Community Energy
Edgar DeMeo, President, Renewable Energy Consulting Services, Inc.
Tom Feiler, Director of Business Development, AMSC
Kenneth M. Keating, PhD, Consultant

Internal reviewers:
Jeff Dowd, Program Evaluation Lead, Policy & Analysis Team, Strategic Programs, DOE/EERE
Yaw Agyeman, Project Manager, Lawrence Berkeley National Laboratory
Ed Vine, Project Manager, Lawrence Berkeley National Laboratory
Jonathan Bartlett, Strategic Marketing & Outreach Manager, Wind & Water Power Program, DOE/EERE
Ian Baring-Gould, WPA Technical Lead, National Renewable Energy Laboratory
Ben Chicoski, Senior Analyst, Energetics Incorporated
Michele DesAutels, former DOE National Coordinator of WPA (2009-2012), U.S. Coast Guard

Notice

This document was prepared as an account of work sponsored by an agency of the United States government. Neither the United States government nor any agency thereof, nor any of their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights.
Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States government or any agency thereof. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States government or any agency thereof.

This report is being disseminated by the Department of Energy. As such, this document was prepared in compliance with Section 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554) and information quality guidelines issued by the Department of Energy. [Further, this report could be “influential scientific information” as that term is defined in the Office of Management and Budget’s Information Quality Bulletin for Peer Review (Bulletin). This report has been peer reviewed pursuant to Section II.2 of the Bulletin.]

Table of Contents

Executive Summary
1. Introduction
  1.1 WPA Initiative and Objectives
    1.1.1 WPA Design and Approach
    1.1.2 Initiative Objectives
    1.1.3 WPA State-Based Activities
    1.1.4 WPA Timeline and Funding
  1.2 Evaluation Objectives
  1.3 Overview of this Report
2. WPA Logic Model
  2.1 WPA Logic Model Diagram
  2.2 Market Barriers
  2.3 Stakeholders
  2.4 WPA Inputs and External Influences
  2.5 WPA Strategies and Activities
  2.6 WPA Initiative Outputs and Outcomes
3. Evaluation Methodology
  3.1 Researchable Questions and Metrics
  3.2 Research Design
    3.2.1 Historical Tracing Approach
    3.2.2 Expert Judging Approach
    3.2.3 Modified Delphi Process (Seeking Consensus and Bounding Uncertainty)
    3.2.4 Data Analysis and Derivation of Capacity-Equivalent Estimates of WPA’s Influence
    3.2.5 Limitations of Data and Analysis
    3.2.6 Statistical Approaches Considered
  3.3 Sample Development
  3.4 Final Disposition of Interviews and Response Rates in Sampled States
4. WPA Impact Findings
  4.1 WPA’s Influence on Wind Capacity Additions in Targeted States
    4.1.1 Market Factors’ Perceived Share of Influence on Wind Capacity Additions
    4.1.2 WPA’s Estimated Influence on Wind Capacity Additions in Targeted States
  4.2 WPA’s Influence on Wind Capacity Additions in Non-Targeted States
    4.2.1 Market Factors’ Perceived Share of Influence on Capacity Additions
    4.2.2 WPA’s Estimated Influence on Wind Capacity Additions in Sampled Non-Target States
  4.3 Wind Working Group’s Ability to Leverage WPA Funding
    4.3.1 Availability of WPA Budget Data
    4.3.2 WPA Approach to Initiative Funding
    4.3.3 Sources and Importance of Leveraged Funding
    4.3.4 Coordinating Organizations’ Ability to Secure Additional Funding
    4.3.5 Timing of Leveraged Funds
  4.4 Replication of WWG Activities