
Behavioral Perspectives on Home Energy Audits:

The Role of Auditors, Labels, Reports, and Audit Tools on Homeowner Decision-Making

Aaron Ingle, Mithra Moezzi, Loren Lutzenhiser, and Zac Hathaway
Portland State University

Susan Lutzenhiser, Joe Van Clock, and Jane Peters
Research Into Action

Rebecca Smith and David Heslam
Earth Advantage Institute

Rick Diamond
Lawrence Berkeley National Laboratory

July 2012

DISCLAIMER

This document was prepared as an account of work sponsored by the United States Government. While this document is believed to contain correct information, neither the United States Government nor any agency thereof, nor The Regents of the University of California, nor any of their employees, makes any warranty, express or implied, or assumes any legal responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by its trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof, or The Regents of the University of California. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof or The Regents of the University of California.

This work was supported by the Assistant Secretary for Energy Efficiency and Renewable Energy, Building Technologies Program, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

Acknowledgements

The authors would like to thank Joan Glickman, Christa McDermott, and David Lee at the U.S. Department of Energy; Oradoña Landgrebe and Andrew Gibb at Seattle City Light; Sara Stiltner at the City of Seattle; Ammen Jordan at Home Performance Collaborative; Kristen Nice and Dave Henson at Puget Sound Energy; and Amber Johnson, Debi Elliot, and Tara Horn of Portland State University. We would also like to thank Tom Sanquist at Pacific Northwest National Laboratory and Alan Sanstad and Evan Mills at the Lawrence Berkeley National Laboratory for providing thoughtful review of a draft of this report.


Contents

List of Tables ...... 4
List of Figures ...... 5
Acronyms and Abbreviations ...... 8
Executive Summary ...... 9
1. Introduction and Background ...... 21
2. Approach ...... 31
3. Industry Actors’ Opinions and Perceptions—Auditors & Realtors ...... 47
4. Homeowner Decision-Making Related to Energy Upgrades, Audit Reports, and Labels ...... 71
5. Post-Retrofit Assessment and Retrofit Quality Verification ...... 118
6. Home Energy Audit and Assessment Modeling Tool Comparison ...... 125
7. Conclusions ...... 148
References ...... 159

Appendix A: EPS Scorecard ...... 163
Appendix B: EPS Energy Analysis Report ...... 165
Appendix C: Sample Home Energy Score Report ...... 179
Appendix D: Auditor Interview Guide ...... 182
Appendix E: Auditor Interviews Results – Presentation to Team, 3/2/11 ...... 186
Appendix F: Real Estate Professionals Interview Guide ...... 191
Appendix G: Real Estate Professionals Interviews Presentation 8/16/11 ...... 195
Appendix H: Homeowner Pre-Audit Interview Guide ...... 200
Appendix I: Homeowner Pre-Audit Interview Closed-Ended Responses ...... 207
Appendix J: Homeowner Post-Audit Interview Guide ...... 211
Appendix K: Homeowner Post-Audit Interview Closed-Ended Responses ...... 222
Appendix L: Post-Audit Survey Questions and Responses ...... 228
Appendix M: Retrofit Survey Questions and Responses ...... 277
Appendix N: Details of Modeling Data Processing and Preparation ...... 323
Appendix O: EPS Auditor, Home Energy Scoring Tool, and HESPro Feature Comparison ...... 330
Appendix P: Comparison of Asset-Based Energy Use Estimates ...... 334
Appendix Q: Comparison of Model Estimates to Utility-Reported Energy Usage ...... 345
Appendix R: Assessment of Impact of Missing Home Energy Scoring Tool Inputs ...... 368


Appendix S: Comparison of Recommendations—EPS Auditor and Home Energy Scoring Tool ...... 371
Appendix T: Model Assumptions of “Standard” Operation Compared to Reported Behaviors ...... 383

List of Tables

Table 1: Overview of research data streams ...... 11
Table 2: Overview of research data streams ...... 31
Table 3: House and household sample ...... 35
Table 4: Characteristics of audited households as compared to SCL Customer Survey ...... 36
Table 5: Demographic characteristics of survey respondents compared to the overall ...... 38
Table 6: Number of completed homeowner interviews and surveys by category ...... 41
Table 7: Mapping of house and homeowner data streams to completed analyses ...... 44
Table 8: Typical homeowner-initiated post-audit call-back topics ...... 56
Table 9: Factors considered and their impact on recommendations ...... 58
Table 10: Auditor use of “General Notes” field in audit report ...... 59
Table 11: Summary of number of completed homeowner interviews and surveys ...... 71
Table 12: Reasons cited by Pre-Audit Interviewees for seeking an audit (multiple responses allowed) ...... 74
Table 13: What part of the audit process was particularly helpful in convincing household members ...... 78
Table 14: Responses on the relative importance of Energy Score versus Carbon Score ...... 81
Table 15: Reasons given for having undertaken audit (Post-Audit Survey; n=134) ...... 86
Table 16: After your energy audit, have you discussed your Energy Score ...... 88
Table 17: Completed or intended upgrades cited more often after the audit ...... 89
Table 18: Completed or intended upgrades cited less often after the audit ...... 90
Table 19: How much Retrofit Survey respondents reported learning ...... 92
Table 20: Percentage of audited households receiving “standardized” recommendations ...... 95
Table 21: Considerations in priority of upgrades (Post-Audit Interviews; n=12) ...... 96
Table 22: Number of recommendations that survey respondents reported having completed ...... 99
Table 23: Energy upgrade recommendations received and reported completed ...... 100
Table 24: Reasons homeowners gave for choosing the retrofits they completed ...... 102
Table 25: Reasons homeowners gave for delaying planned retrofits (Retrofit Survey; n=95) ...... 102
Table 26: Reasons given for deciding against particular recommendations (Retrofit Survey; n=61) ...... 103
Table 27: Who performed the upgrades that were completed? (Retrofit Survey; n=82) ...... 103
Table 28: Rating of likelihood of incentive affecting upgrade decision (Pre-Audit Interview; n=33) ...... 104
Table 29: Post-Audit Interview responses to whether it would be “worth” taking out a loan ...... 105
Table 30: Total estimated expenditures for all energy upgrades performed for the households ...... 107
Table 31: Homeowner perceptions of savings from recommended upgrades they undertook ...... 108
Table 32: Reported changes in energy use practices after the audit or after upgrades ...... 112
Table 33: Summary of upgrade quality assessment, by upgrade type (n=50 houses) ...... 120
Table 34: Similarities and differences between EPS Auditor, Home Energy Scoring Tool, and HESPro ...... 128
Table 35: Data sets utilized for the model comparison ...... 130


Table 36: Modeling tools and emulators used in the analyses ...... 131
Table 37: Six model comparison analyses completed ...... 132
Table 38: All Home Energy Scoring Tool recommendations for 31 homes ...... 135
Table 39: All EPS Auditor tool recommendations for 31 homes ...... 135
Table 40: Thermostat setbacks reported in the energy use behavior survey ...... 140
Table 41: Percentage of homes where EPS asset model estimated usage was substantially lower ...... 142
Table 42: Percentage of homes where Home Energy Scoring Tool asset model estimated ...... 142
Table 43: Percentage of homes where HESPro house+weather+behavior model estimated usage ...... 144
Table 44: Findings from the SCL Home Energy Audit study compared to the literature ...... 149
Table 45: Missing or ambiguous data informing Home Energy Scoring Tool inputs ...... 325
Table 46: Similarities and Differences between EPS, Home Energy Scoring Tool, and HESPro ...... 330
Table 47: Summary of all EPS and Home Energy Scoring Tool Comparison Runs ...... 338
Table 48: Total Site Energy (kWheq/yr)—Comparison of Asset, Asset+Weather ...... 348
Table 49: Total Source Energy (MBTU/yr)—Comparison of Asset, Asset+Weather ...... 352
Table 50: Electricity Usage (kWhr/yr)—Comparison of Asset, Asset+Weather ...... 355
Table 51: Natural Gas Usage (therms/yr)—Comparison of Asset, Asset+Weather ...... 359
Table 52: Percentage of homes where EPS asset model estimated usage was substantially lower ...... 362
Table 53: Percentage of homes where Home Energy Scoring Tool asset model estimated usage ...... 362
Table 54: Percentage of homes where HESPro “house+weather+behavior” model ...... 363
Table 55: 15 Houses with Post-Retrofit Assessment Additional Information vs. Home Energy Score ...... 369
Table 56: All Home Energy Scoring Tool recommendations for 31 homes ...... 373
Table 57: All EPS report “standardized” recommendations for 31 homes ...... 373
Table 58: Summary of air sealing recommendations made in EPS Auditor ...... 375
Table 59: Details of the 10 air sealing recommendations that coincided ...... 375
Table 60: Summary of attic insulation recommendations ...... 377
Table 61: Details of the 11 attic insulation recommendations that coincided ...... 377
Table 62: Summary of floor or basement insulation recommendations ...... 378
Table 63: Details of the 7 floor or basement insulation recommendations that coincided ...... 378
Table 64: Summary of duct insulation recommendations ...... 380
Table 65: Details of the 5 duct insulation recommendations that coincided ...... 380
Table 66: Thermostat Setting Assumptions Used in Modeling Tools ...... 384
Table 67: Self-reported thermostat settings from survey respondents ...... 384
Table 68: Number of respondents reporting using a particular supplemental heating source ...... 386
Table 69: Large appliance use model assumptions ...... 389

List of Figures

Figure 1: Overview of the home energy audit process and associated actors included in the research ...... 12
Figure 2: Overview of the Seattle City Light home energy audit process ...... 29
Figure 3: House and household data streams collected in this research ...... 35
Figure 4: Home energy audit process and data collection activities ...... 40


Figure 5: Blower door test photo ...... 51
Figure 6: Example of EPS savings estimates ...... 55
Figure 7: Example of costs reported for recommended energy upgrades from the EPS Report ...... 56
Figure 8: Home labeling and real estate market demand for energy efficient houses ...... 61
Figure 9: Four current home label options ...... 63
Figure 10: Ways labeling differentiates new and existing homes ...... 64
Figure 11: Potential drawbacks to labeling new and existing homes ...... 65
Figure 12: Sample EPS Scorecard ...... 80
Figure 13: Changes in infiltration between values recorded ...... 122
Figure 14: Major sets of activities for model comparison ...... 129
Figure 15: Estimated total site energy use for Home Energy Scoring Tool compared to EPS ...... 133
Figure 16: Summary of comparison of upgrade recommendations ...... 136
Figure 17: Self-reported and model estimated occupancy vs number of bedrooms in home ...... 139
Figure 18: Histogram of fractional change in model estimated total site energy use of a home ...... 141
Figure 19: Major data collection, processing, and preparation activities for model comparison ...... 323
Figure 20: Percent difference in estimated total source energy use ...... 327
Figure 21: Percent difference in estimated total source energy use ...... 328
Figure 22: Scatter Plots Comparing EPS and Home Energy Scoring Tool Total Energy Use Estimates ...... 339
Figure 23: Distribution of the Percent Difference Between EPS and Home Energy Scoring Tool ...... 340
Figure 24: Scatter Plots Comparing EPS and Home Energy Scoring Tool Total Electricity and Nat Gas ...... 341
Figure 25: Distribution of Percent Difference Between EPS and Home Energy Scoring Tool Electricity ...... 341
Figure 26: Scatter Plots Comparing EPS and Home Energy Scoring Tool Space Heating ...... 342
Figure 27: Distribution of the Percent Difference Between EPS and Home Energy Scoring Tool ...... 343
Figure 28: Scatter Plots of Model Estimated Total Site Energy vs Utility Reported Usage ...... 349
Figure 29: Distribution of the Percent Difference Between Model Estimated Total Site Energy ...... 350
Figure 30: Scatter Plots of Model Estimated Total Source Energy vs. Utility Reported Usage ...... 353
Figure 31: Distribution of the Percent Difference Between Model Estimated Total Source Energy ...... 354
Figure 32: Scatter Plots of Model Estimated Electricity Usage vs. Utility Reported Usage ...... 356
Figure 33: Distribution of the Percent Difference Between Model Estimated Electricity Usage ...... 357
Figure 34: Scatter Plots of Model Estimated Natural Gas Usage vs. Utility Reported Usage ...... 360
Figure 35: Distribution of the Percent Difference Between Model Estimated Natural Gas Usage ...... 361
Figure 36: HESPro+Weather+Behavior model estimated electricity use versus ...... 365
Figure 37: Residuals plotted as a function of the regression predicted value for HESPro ...... 366
Figure 38: Asset model estimated heating use (site kWheq/yr): EPS vs Home Energy Scoring Tool ...... 372
Figure 39: Asset model estimated air sealing savings: EPS vs Home Energy Scoring Tool ...... 376
Figure 40: Asset model estimated attic insulation savings: EPS vs Home Energy Scoring Tool ...... 377
Figure 41: Asset model estimated floor or basement insulation savings ...... 379
Figure 42: Asset model estimated duct insulation savings: EPS vs Home Energy Scoring Tool ...... 380
Figure 43: Self-reported supplemental heating use ...... 386
Figure 44: Self-reported and model estimated occupancy versus number of bedrooms in home ...... 387
Figure 45: Response to question about whether someone is home during the day ...... 388
Figure 46: Dishwasher use as reported by energy use behavior survey respondents ...... 390


Figure 47: Energy use behavior response on typical washing machine cycle setting ...... 390
Figure 48: Washing machine use reported by energy use behavior survey respondents ...... 391
Figure 49: Clothes dryer use reported by energy use behavior survey respondents ...... 391
Figure 50: Measured hot water temperatures from EPS home energy audits ...... 393
Figure 51: Histogram of fractional change in model estimated total site energy use of a home ...... 394


Acronyms and Abbreviations

ACHn  Natural air changes per hour
BPI  Building Performance Institute, Inc.
CFL  Compact fluorescent lamp
CFM  Cubic feet per minute
CFM50  Airflow needed to create a change in building pressure of 50 Pascals
DOE  U.S. Department of Energy
EPS  Energy Performance Score
HERS  Home Energy Rating Systems
HESPro  Home Energy Saver Pro
kWh  Kilowatt hour
kWheq  Kilowatt hour equivalent
LBNL  Lawrence Berkeley National Laboratory
LEED  Leadership in Energy and Environmental Design rating system
MPG  Miles per gallon
NAR  National Association of Realtors
NREL  National Renewable Energy Laboratory
REPs  Real estate professionals
SCL  Seattle City Light
S.T.A.R.  Earth Advantage Institute’s Sustainability Training for Real Estate Professionals educational course


Executive Summary

Overview

During 2010 and 2011, thousands of news articles promoted the potential benefits of home energy audits and of subsequent energy efficiency improvements, signaling a resurgence of interest in motivating homeowners to assess and upgrade their homes. An increasing number of home energy audit programs were already underway in the United States, and auditors, utilities, and others have continued to learn from these experiences and to design more effective audits, pitches, and programs, as they have since the advent of home energy audits in the 1970s (e.g., Hirst et al. 1981; Van de Grift and Schauer 2010).

These home energy audits have generally been designed to overcome what the industry commonly considers “barriers” to increasing home energy efficiency, in particular the perception that homeowners do not have enough information about what to do to increase the energy efficiency of their homes or to solve performance problems, nor about the prospective advantages of these actions. Home energy audits are furthermore sometimes used to generate an energy score to inform potential home buyers about energy efficiency and thereby influence the overall housing stock, to qualify homes for energy efficiency financing, mortgages, loans, or incentives, and to diagnose home energy performance problems.

Despite success stories (e.g., Van de Grift and Schauer 2010), the results of home energy audit programs overall have often been considered disappointing: relatively few households undertake audits, and when they do, upgrade recommendations are often not acted upon (Frondel and Vance 2011; Fuller et al. 2010; Palmer et al. 2011). And though there is clearly remaining technical potential, little has been proven about the overall energy savings that result from audits and resulting upgrades, nor the extent to which actual social potential can reach technical potential. Rather than continue to ask why homeowners do not act as theory suggests they should, this research turns the lens more toward homeowners’ views and experiences. A clearer, more open understanding of these views and experiences may help expectations for home energy audits better align with what is likely to be achieved.

Our study focused on the perspective of homeowner decision-making in response to home energy audits, combined with attention to the quality of the recommendations that homeowners receive, as well as the perspectives of some key industry actors on auditing and home energy labels. Unlike a program evaluation, the research was not designed to answer detailed questions about program effectiveness in terms of costs, savings, or process, nor was it designed to provide direct answers to questions of how to get people to do more audits or more retrofits. Rather it “steps back” toward a better understanding of more basic questions about what audits provide and what homeowners seem to want, for the case of one particular program that we expect has parallels to many others.

In this report, we present the results of a study for the U.S. Department of Energy, applied to an existing home energy audit program offered by Seattle City Light, the electric utility for the City of Seattle. Portland State University, Research Into Action, and Earth Advantage Institute worked together with Seattle City Light and the Lawrence Berkeley National Laboratory to complete the research project. From mid-2010 to late 2011, approximately 1,350 home energy audits were completed in Seattle as part of Seattle City Light’s program. These audits, and the homeowners who received them, are the subject of our report.

The research reported here was designed to advance the field’s knowledge on what homeowners want and get from home energy audits. It did so by simultaneously studying multiple dimensions of these audits, including: physical characteristics of the houses audited, the energy use estimates and upgrade recommendations these audits offered to homeowners, actual energy use data, self-reported retrofit activity and energy use behaviors, physical assessment of the quality of the retrofits undertaken, viewpoints of both auditors and realtors on various key program elements, and—centrally in tying these streams together—homeowner motivations and reactions to the audits, what they consequently changed, and what they thought about the results. These data were used to address gaps in knowledge about home energy efficiency upgrades and audits, including:

• Homeowner decision-making processes in planning and undertaking energy retrofits, reactions to home energy performance scores, and satisfaction with the audits performed;

• Differences and similarities between home energy assessment and retrofit recommendation tools;

• Importance of household energy behaviors relative to house- and equipment-based assessments of home energy performance and upgrade recommendations; and

• Industry views on the current and potential future use, and usefulness, of home energy performance ratings in general.

While we collected as much as 18 months of utility consumption data for these households, the post-retrofit period was typically only 6 months, which was not sufficient to statistically test the degree of energy savings associated with audits and upgrades.

Seattle City Light’s Home Energy Audit Program

In 2010, Seattle City Light began a program offering Seattle City Light customers owning single-family homes a $400 home energy audit for a discounted rate of $95, with an objective of reaching 5,000 households. These home energy audits used BPI-certified auditors, diagnostic testing, asset-based energy modeling, and an auditor-customizable report featuring an asset rating of the house’s energy and carbon use, upgrade recommendations, and the energy performance details of the house. Earth Advantage Institute’s EPS Auditor modeling and reporting software tool was used for these audits.

The Seattle City Light program provided an assessment of the whole house, including attic, walls, windows, foundations, water heating, ducts, and heating and cooling system(s), as well as measurements of air leakage, combustion safety checks, and, when feasible, infrared thermal imaging.


The in-home portion of these audits took 3-4 hours,¹ during which time the audit recipient had to be at home. Upon completing the technical measurements, auditors typically talked with homeowners about initial findings, following up later with an e-mailed Energy Performance Score (EPS) Energy and Carbon use Scorecard and a report with detailed findings from the audit. The report included a standardized set of recommendations, selected by the auditor from a list provided in the EPS Auditor tool, which gave homeowners a range of estimated costs and savings and covered a fixed set of categories including air sealing, duct sealing, insulation (ceiling/attic, floor, wall, and ducts), appliances and lighting, heating and cooling systems, water heating, windows, and others. Additionally, Seattle City Light encouraged auditors to customize the report with information on the current conditions of the home and with customized recommendations for the homeowner; 75% of reports contained these auditor-customized recommendations, and in most of these, the recommendations were prioritized.

The Seattle City Light Home Energy Audit Program was not designed to maximize participation, was not heavily marketed, and was not intended to closely tie in various wrap-around elements that are known to help boost upgrade completion, such as financial incentives, direct links to contractors, or providing “energy advisors” or “energy advocates” to help homeowners through the process (see, for example, Van de Grift and Schauer 2010). Rather, the Seattle City Light program represents a fairly evolved home energy audit program that provides homeowners with a lot of information, a detailed assessment of the energy characteristics of their home from an asset perspective, and high quality, well-trained, and often quite experienced auditors. Thus the research goals and the program itself complement each other quite well.

Description of Research Methods

To inform this research, we talked to nearly 300 different households that had received the Seattle City Light home energy audits. We collected 12-18 months of utility data, both gas and electric, for approximately 250 homes; extensive data on the technical characteristics of all 1,355 houses receiving an audit between June 2010 and October 2011; and self-reported data on energy behavior from 346 homes. We also obtained a follow-up set of data on the technical characteristics and upgrade measures for 50 homes where upgrades had been completed. In addition, we completed in-depth interviews with two key groups of stakeholders: home energy auditors and real estate professionals in Seattle. These data streams are detailed in Table 1 below.

Table 1: Overview of research data streams

Source: Homeowners and households
Method(s) of data collection: Phone interviews and surveys; mail and web-based energy use behavior surveys; electric and natural gas billing data provided by utilities
Contribution: Homeowner perspectives on audits and upgrades; self-reported energy use behavior data, used in combination with house technical data to evaluate the importance of behavior relative to house and equipment factors in home energy performance; utility-reported energy use data for many homes

Source: House
Method(s) of data collection: Technical house data collected by home energy auditors during the site visit; re-measurement (after upgrades were completed) of technical house data, and inspection of completed upgrades
Contribution: Technical characteristics of the house; re-measurement of many technical characteristics of the house; details on the quality of completed upgrades

Source: Home energy auditors
Method(s) of data collection: Phone interviews
Contribution: Auditor perspective on homeowner decision-making, audit process, audit modeling tool, and program

Source: Real estate professionals
Method(s) of data collection: Phone interviews
Contribution: Real estate professional perspective on how home energy performance currently fits into the buying/selling process and how it could fit in the future

Source: Home energy modeling tools
Method(s) of data collection: Model runs for a subset of audited Seattle homes, using the Home Energy Scoring Tool and Home Energy Saver Pro, as well as emulators replicating these tools; results from the original EPS Auditor modeling that was completed for the audit reports
Contribution: Scores, energy use estimates (total, by utility, and by end use), and upgrade recommendations from each of the tools

¹ It typically takes the auditor about an hour to collect the measurements, other than the blower door test results, that are needed for the EPS score calculation (Earth Advantage Institute and Conservation Services Group 2009).

Summary of Major Findings

Figure 1 diagrams the home energy audit process and actors involved in this program and in our research.²

Figure 1: Overview of the home energy audit process and associated actors included in the research


Major findings are presented topically below, approximately following the organization in the diagram.

² We did not investigate real estate transactions directly; rather, we asked real estate professionals about their views on the current and prospective role of energy and carbon scores, and energy efficiency more generally, in residential real estate transactions.

Homeowner Circumstances and Motivation

Survey respondents represented a variety of demographic circumstances, but overall were wealthier, more educated, and older than is typical for Seattle. The program was not widely advertised; less than 1% of eligible customers completed an audit. This underscores our overall impression, from talking to homeowners and auditors, that participants were often particularly interested in energy consumption or energy efficiency. The characteristics of the homeowners surveyed are consistent with reported participation in previous home energy audit programs (Sanquist et al. 2010). The extent to which results from studying participant populations should be extrapolated to non-participants is a typical challenge of program-oriented research.

Overall, these homeowners appeared to be quite motivated and oriented towards completing upgrades when entering the program, though some may have been more “curious” than predisposed to complete upgrades. When asked, after their audits, about their motivation for signing up for the audit, homeowners expressed a variety of reasons for undertaking the audit that indicated a predisposition to upgrade, including a desire to reduce energy costs (26%), improve energy efficiency (23%), address comfort problems (13%), or contribute to environmental sustainability (10%). However, a substantial portion of homeowners (23%) appeared to be largely motivated by curiosity about how their home worked or by the discount on the audit cost; these homeowners may not have been oriented to making a big investment. Nearly three-fourths of the homeowners had already done some energy upgrades to their home; many of these were looking for what to do next, though some wanted to figure out why they were not saving as much from those upgrades as they expected.

For the mostly middle-to-upper income households receiving these audits, “financing” was not stated as a major consideration. Interviews early in the audit process indicated that most households would pay out of pocket for any upgrades they would complete. Surveys after retrofits had been completed bore this out: few respondents reported taking out a loan to pay for the energy upgrades they had completed. Rather, most (83%) said they used cash or savings, and most of the rest said they paid for the work with credit. Those who took out a loan said that they had spent at least $10,000, some much more, while some paid cash even for jobs over $10,000.

Audit Site Visit—The Homeowner Experience

The auditor was integral to the homeowner’s overall reaction to the audit. Many homeowners identified the importance of face-to-face discussions with the auditor, and appeared to be energized by the auditor’s enthusiasm. Homeowners frequently reported that the discussion with the auditor at the end of their visit was the most informative part of the whole process. The audit site visit was designed to allow time for these interactions. Conversely, a few households felt that their auditor was not sufficiently oriented to their own circumstances. For many households, the home energy audit may be more about interactions with the auditor than the report or detailed calculations. However, the auditor’s role typically ended with the delivery of the audit report and response to any follow-up questions—auditors reported limited post-audit follow-up, in line with the Seattle City Light program design.


Blower door testing and infrared thermography seemed particularly compelling to the homeowner, and may have motivated higher levels of weatherization and local air sealing, and possibly led some homeowners to think of energy use in their homes in a different and more sophisticated way. These diagnostic techniques make invisible energy flows and leakage problems tangible and offer a good spectacle. Also, after upgrades are completed, blower door and combustion safety testing provide useful checks to make sure that upgrades resolve—and do not create—health and safety problems.

How Scores Were Perceived

While program-participant Seattle homeowners found their home energy performance scores to be interesting, only a fraction of the audit participants appeared to be drawn by the opportunity to get an energy or carbon rating for their home. Less than half of the interview respondents said that they knew they would receive such a rating, and the rating may not have played a strong role in the marketing of these audits. The homeowners we talked to did not necessarily represent typical consumers of home ratings or scores, as the majority were not planning on selling their home in the near future. Instead, they tended to be more interested in how their home actually uses energy rather than seeing energy efficiency as a more abstract attribute of the house. Still, auditors thought the scores helped the homeowner make sense of their report. Seattle City Light’s program design was a deliberate balance of providing a comparable asset-based home energy score with a report providing more tailored recommendations for upgrades.

When asked to imagine that they were buying or selling a home, Seattle respondents familiar with the scores, after receiving an audit, consistently expressed an interest in seeing the scores of homes they were thinking about buying (95%). Respondents also indicated that they would be willing to share their home’s score with potential buyers (82%). However, these homeowners were rarely currently involved in selling their home or buying a new one, so the real estate context for this use of the scores was hypothetical.

When asked about the home purchase or sales process, real estate professionals saw qualified potential for ratings to provide information useful for the home buying decision. The real estate professionals we interviewed expressed concern that efficiency is not generally an advantage for existing homes compared to new homes, and also distinguished between marketing the energy efficiency features of a home and marketing the efficiency of the home as a whole.

How Reports and Recommendations Were Perceived by Homeowners

Auditors often provided extensively customized upgrade recommendations and other audit report content, and were encouraged to do so by Seattle City Light and by the audit report format. Homeowners appreciated the customization, and those who received the auditor-customized recommendations were more likely to complete more upgrades than those who did not. Auditors frequently developed this separate list of recommendations, tailored to what they perceived as the needs of the homeowner, and often included a personal note to the homeowners. Of respondents to one of the surveys, 75% received this type of list, and in two-thirds of these lists the recommendations were prioritized.


Still, many homeowners expressed the desire for more information and guidance from the reports. Some homeowners expressed an interest in getting more recommendations, particularly achievable, low-cost suggestions, or a greater level of detail on their home’s issues, such as inclusion of infrared photos and other testing results. Some homeowners asked for more practical support on how to complete recommended upgrades, or more information on what upgrade subsidies or other financing incentives are available. A few households felt that the recommendations they received were too standardized, or not actionable for various reasons. Conversely, some homeowners said that the report was too detailed—indicating that not all homeowners necessarily wanted the same information from their audit.

The audit reports that homeowners received often gave very broad ranges for cost and savings estimates, which led to frustration for some auditors and homeowners. On the other hand, broad ranges may be more reflective of real modeling uncertainty than point estimates would be. In some cases the ranges provided to Seattle homeowners may have been unnecessarily broad (e.g., encompassing saving nothing and saving all end use costs). Still, both auditors and homeowners often found the estimates useful as a starting point, and doing without them does not seem to be a credible option. Typically, the EPS audit report provides a single number for the estimated fuel use after upgrades and for the approximate annual savings in dollars. However, the audit report implemented for Seattle City Light provided ranges instead of a single number for estimated fuel costs after upgrades and for the approximate annual savings in dollars after upgrades.

Upgrades Completed after the Home Energy Audit

Homeowners seemed to prefer actionable, do-it-yourself, problem-solving, interesting, and lower-cost upgrades, though some households did bigger upgrades, too. In particular, homeowner action seemed focused on solving problems—such as plugging leaks or fixing safety problems—as compared to more abstract, less-visible energy efficiency upgrades. In addition, these types of problem-solving upgrades may be easier to complete for many households. Of the homeowners we talked to at our latest survey point—an average of 8 months after the audit—43% said that they had not yet completed any recommendations, although many of these (67%) and many of those who had completed at least one recommendation (52%) still had plans to do upgrades in the future. Homeowners often cited costs, weather, family schedules, and waiting to bundle the job with other improvements as reasons for waiting, as well as, especially for white goods, waiting until appliance end of life. Nearly three in every four homeowners said that they had already completed upgrades prior to their audit—possibly leading to a reduced set of compelling retrofits remaining for auditors to recommend and homeowners to complete.

The blower door and infrared thermal imaging offered to participants seemed to make air sealing recommendations particularly appealing, based on our discussions with survey participants, while insulation upgrades, particularly wall insulation, were completed less frequently, often due to expense, disruption, and technical challenges. Appliance upgrades appeared to be more a function of whether the appliance otherwise needed replacement. The most common recommendation provided to homeowners was air sealing (89%), followed by attic insulation (71%). Sixty percent received a recommendation to replace their dishwasher, refrigerator, or washing machine. Over half of audited households also received recommendations to insulate walls (61%) or install a high efficiency or tankless water heater (54%). Less frequent, but still common, were recommendations to seal or insulate ducts, to replace the heating system, to insulate the floor, or to add storm windows or high efficiency windows.

We conducted a post-retrofit assessment of 50 households in order to assess the quality of the upgrades they had completed. Based on this sample, the upgrades that homeowners completed generally appeared to be of good quality. Where quality issues were found, these mostly reflected a lack of attention to detail or missed opportunities in air sealing or insulation, some attributed to contractors and some to do-it-yourself upgrades. For these homes, the potential benefits from upgrades—energy, comfort, or otherwise—do not appear to have been substantially degraded by implementation. However, two safety issues were found after upgrades: one home where replacement of suspect equipment failed to resolve the combustion safety issue found in the original audit, and another home whose rate of air leakage dropped below 0.35 ACHn (natural air changes per hour) after upgrades. These two cases (4%), while unusual in this sample, highlight that there are risks inherent to completing energy upgrades that can potentially be mitigated by “test-out” diagnostic assessments (after retrofitting work has been completed).
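For context on the 0.35 ACHn figure, the sketch below shows one common way a blower door reading can be converted to natural air changes per hour: CFM50 is converted to air changes per hour at 50 Pascals using the house volume, then divided by an LBL-style "N-factor" that depends on climate, building height, and shielding. This is a minimal illustration under assumed values, not the procedure or software used in the Seattle audits; the N-factor of 17 and the example readings are placeholders.

```python
# Illustrative sketch (assumed values, not the Seattle program's method):
# convert a blower door reading to natural air changes per hour (ACHn)
# and flag a post-retrofit house that may now be under-ventilated.

def ach50(cfm50: float, house_volume_ft3: float) -> float:
    """Air changes per hour at 50 Pa: airflow (ft3/min) * 60 min/hr / volume (ft3)."""
    return cfm50 * 60.0 / house_volume_ft3

def achn(cfm50: float, house_volume_ft3: float, n_factor: float = 17.0) -> float:
    """Approximate natural air changes per hour using an LBL-style N-factor.
    The N-factor (assumed 17 here) varies with climate, house height, and shielding."""
    return ach50(cfm50, house_volume_ft3) / n_factor

# Example "test-out" reading after air sealing (hypothetical numbers).
post_retrofit_achn = achn(cfm50=1100, house_volume_ft3=14000)
if post_retrofit_achn < 0.35:  # minimum-ventilation guideline cited in the report
    print(f"ACHn = {post_retrofit_achn:.2f}: below 0.35, mechanical ventilation may be needed")
else:
    print(f"ACHn = {post_retrofit_achn:.2f}: above the 0.35 guideline")
```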

The primary tangible short-term benefits homeowners received from completed upgrades may often differ from their stated motivations for completing upgrades. At the time of the Retrofit Survey, three-quarters of the homeowners who had completed upgrades expressed satisfaction with them, and reported improved comfort in their house along with the expectation of long-term cost and energy savings, a sense of accomplishment, and occasional co-benefits such as reduced street noise, reduced dust, and improved protection against pests. Less than half of these households indicated that they had noticed some (30%) or a lot (10%) of energy savings, while others reported that it was too early to tell (11%). A few expressed disappointment with the lack of energy savings. This differs somewhat from the reasons homeowners gave for completing the upgrades they did—which focused first on cost and energy savings (67%), and secondarily on comfort (28%) and other motivations. For most households, the study period covered only 6 to 12 months after the audit, so we could assess only short-term rather than longer-term reactions to the audits and upgrades. Gathering follow-up utility data would enable assessment of the energy savings from these upgrades.

Changes in Energy Use Behaviors

Though the audits as formally designed addressed homeowner behavior only peripherally, a quarter of respondents reported that their household changed their energy use behaviors in reaction to the audits. For these asset-based audits, behavior change was not a major objective of the audit, though some recommendations for behavior change were included in “no or low-cost strategies” portions of the home energy audit report. In addition to conservation-oriented changes in household energy management, a second type of behavior change could potentially occur in response to technical changes in the home, e.g., increasing thermostat set-points once a more energy-efficient furnace is installed. Few homeowners mentioned this kind of behavior change, including take-back effects, though accurate tracking of behavior change is notoriously difficult to achieve.


Modeling Tools—Scores and Recommendations

Using a circumscribed set of model inputs, we found that EPS Auditor and the Home Energy Scoring Tool gave energy use estimates that were reasonably consistent with each other for a set of homes modeled in both tools. This finding provides support for the idea that different asset tools can generate comparable asset scores or ratings for houses, especially when using a similar scope of inputs and similar assumptions regarding occupancy and energy use behaviors. There is no consensus on what energy uses are “in scope” or what occupancy and energy use behavior assumptions to use in asset modeling and scoring tools. While the assumptions used to represent “standard occupant behavior” differed somewhat between EPS Auditor and the Home Energy Scoring Tool at the time of the analysis, the model estimates still coincided reasonably well. Also, our analysis did not consider sources of variation in model results, such as auditor measurement and interpretation differences, and the limited model inputs we used likely made the model estimates somewhat more consistent with each other than they would have been had a full set of model inputs been available for the Home Energy Scoring Tool.

For a small set of homes, we compared the upgrade recommendations generated by auditors using EPS Auditor with recommendations we generated for the same house using the Home Energy Scoring Tool. Despite the consistency in modeled energy use estimates noted above, the upgrade recommendations, costs, and savings estimates were quite different between these two asset-based tools. That is, a homeowner would be likely to receive different recommendations from an audit using EPS Auditor than from an assessment using the Home Energy Scoring Tool. We have no basis to determine whether either approach gave “better” recommendations by any particular criteria, and assessing the source of these differences, e.g., cost-effectiveness criteria or cost assumptions, was beyond the scope of our analysis. The auditor-customized recommendations also provided with most Seattle City Light audit reports were not included in this comparison.

For 101 homes, we compared audit modeling tool estimates of total energy use to utility-reported energy consumption. While the tools that we considered estimated group-average energy use moderately closely, we found large differences between individual model estimates and reported usage for many homes, with larger differences for asset-based model estimates than for estimates that included inputs on occupancy and basic energy use behaviors. Asset-based tools, such as EPS Auditor (used for these Seattle audits) and the Home Energy Scoring Tool, do not consider occupancy and household energy use behaviors beyond applying standardized assumptions, and are not necessarily intended to reliably predict actual energy usage. However, these tools do use model-generated usage estimates to select upgrade recommendations or to calculate upgrade savings estimates. Homeowners using these savings estimates for upgrade decisions risk being misinformed if models significantly over- or under-estimate the savings they would see from completing upgrades. For EPS audits, personal interactions with the auditor and the auditor’s customization of the report may often overshadow, or qualify, this model-generated guidance.
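As a concrete illustration of how group-average agreement can coexist with large per-home differences, the sketch below computes both an aggregate and a per-home percent difference between model estimates and utility-reported usage. The numbers and the specific metric are hypothetical placeholders, not the analysis actually performed for the 101 homes.

```python
# Hypothetical illustration: model estimates can match utility data on average
# while individual homes show large percent differences in both directions.
model_est = [85, 210, 140, 60, 175]    # modeled annual site energy per home (made-up values)
utility   = [130, 150, 145, 95, 155]   # utility-reported usage for the same homes (made-up values)

group_diff = (sum(model_est) - sum(utility)) / sum(utility) * 100
per_home_diff = [(m - u) / u * 100 for m, u in zip(model_est, utility)]

print(f"Group-average difference: {group_diff:+.1f}%")   # small aggregate difference (~-0.7%)
for i, d in enumerate(per_home_diff, start=1):
    print(f"Home {i}: {d:+.1f}%")                         # individual homes off by 30-40% in either direction
```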


Conclusions

Our research on home energy audits underscored the importance of a shift toward homeowner perspectives, requiring more than a simple repackaging of energy efficiency, but rather a fuller appreciation of the position of the homeowner and the personal nature of homes. We see the potential for policy makers and the home energy audit industry to better meet the needs and desires of homeowners for comfortable, healthy, and energy-efficient homes. This shift will not be easy, but based on the findings from this project, we offer recommendations for going forward.

First, in the Seattle audits we examined, the auditor appeared to often function as an expert agent and advisor, and seemed to have an important influence on what homeowners did or did not do. Compared to presenting homeowners with standardized recommendations selected solely on estimated costs and energy savings, skilled auditors may provide critical additional value by being better able to judge the multi-dimensional nature of home energy upgrades—comfort, hassle, risk, safety, reliability of savings estimates, the present circumstances and plans of the homeowners, and so on—than software can, while perhaps offering inspiration and personal guidance to the homeowner.

Second, an asset orientation may not align with all existing homeowners’ interests and positions. The majority of homeowners we surveyed and interviewed were not planning on selling their home in the near term and were seemingly more interested in making home improvements, wanting specific upgrade advice toward improving their own living conditions rather than toward directly affecting the market value of their home. Interviews indicate that people who are motivated to make changes (or have already decided to) and just want to know what to do were less likely to be motivated by the EPS score itself than those seeking to compare their home to others. Seattle City Light anticipated as much, and for this reason designed a program that tested homeowners’ reception of an asset rating alongside an audit report with customized recommendations. The asset-based home rating was interesting enough to most homeowners, and is something that almost all claimed they would want to see when buying a home or reveal when selling a home.

Few of the homeowners we spoke to were planning on selling their home anytime soon; upgrades appeared to be mostly about improving living conditions in their home. More important, in assuming standardized usage of the home, asset-based recommendations and cost and savings estimates may be quite different from what would be recommended if how the homeowner actually used the home were considered. Household use of energy is highly variable. Good recommendations for frugal users, for example, may be much different from those for liberal users. Our modeling results indicated that taking actual use—whether through bills or through operational data—into account might lead to important changes in recommendations and savings estimates, as others have also noted (e.g., Khawaja and Koss 2007). Additionally, some homeowners seemed interested in receiving advice on their energy use behaviors and operation of the home. Operationally focused audits could provide the opportunity to offer this type of specific advice, e.g., when to use a portable heater rather than the central heater, how much they could save by turning down the heat at night, or how much energy a given appliance uses.


Third, those who undertook upgrades seemed pleased with them, although the non-energy benefits often seemed more tangible and may outweigh energy- and money-saving benefits, at least in the short term. While reductions in utility bills due to upgrades are difficult to identify positively given the natural variability in utility bills, they are equally difficult to disprove. Therefore, homeowner satisfaction is not necessarily an indicator that actual energy savings met homeowner, or program, expectations. The utility consumption data we collected was not sufficient to statistically test the degree of energy savings.

Fourth, the group of surveyed homeowners tended to be considerably higher-income and more highly educated than other homeowners in the area, but there were still substantial differences in household characteristics and in what various homeowners appeared to want from audits. While many expressed sentiments in alignment with typical program promotion, e.g., energy efficiency, many did not. Other participants seemed to be interested in diagnosing and solving concrete problems such as high bills or comfort issues, or in shaving off monthly costs. Many auditors appeared to tailor their discussions to the wants and needs of homeowners, as they perceived them.

Finally, this research drew upon a generally aware, enthusiastic, often highly-educated, and affluent slice of Seattle homeowners—those voluntarily seeking out a home energy audit and who were subsequently willing to talk to us at length. People who do not sign up for a home energy audit may behave quite differently. If home energy audits are to be expanded, there is a research need to look at the circumstances and expectations of households who do not currently participate in home energy audit programs, and to understand their goals and decision-making rationales.

By many criteria, the home energy audit program conducted in Seattle led to successful outcomes, with participants indicating that they were pleased with their involvement in the program. In many instances, these audits have led to quality upgrades that improve the condition of participants’ homes, and they may have led to lower energy consumption in upgraded homes.

What we do suggest, however, is that in planning home energy audit programs in general, it is useful to consider how much of what is being offered makes good sense to potential participants. This perspective requires taking a more home- and owner-centered view than programs appear to typically adopt. It may also mean a harder look at the influence of personal interactions with a trusted expert such as an auditor or energy advisor, at the quality, accuracy, and customization of the guidance being offered, at the cases in which diagnostic testing is warranted, and at when asset or operational assessments are most useful. Of course, homeowners do not necessarily know what is possible or what will inspire them, and certain elements—the auditors themselves, or the blower door test—may sometimes be transformative.

Our findings suggest several open questions that, if better understood, could help home energy audits and assessments better speak to households:

• How much energy are upgrades saving, and how does this compare to what energy models estimate and what households expect?


• Could research pursuing household segmentation with respect to home energy efficiency be used to refine home energy audits to better meet homeowner (and non-homeowner) needs? Are one-size-fits-all programs a suitable solution or are tailored programs more appropriate?

• Do home energy audits offer a viable opportunity to provide households with specific guidance on how to operate their home and optimize their energy use behaviors?

• How much effect does variability in auditor measurements and interpretation of homes, as well as data entry error, have on scores and on the tool-generated upgrade guidance provided to audit recipients, with what implications?

We are optimistic that research on these topics—and others—will strengthen policies and programs to help deliver the best home energy audits possible and to further support efforts to achieve homes that provide good indoor environments and reduced energy use.


1. Introduction and Background

1.1 Primary Project Objectives

As part of the American Recovery and Reinvestment Act of 2009, the U.S. Department of Energy (DOE), acting through its Office of Energy Efficiency and Renewable Energy, increased its efforts to support residential energy efficiency technologies and programs. The Lawrence Berkeley National Laboratory (LBNL) has played a supporting role in this national effort by coordinating and funding regional research opportunities related to energy efficiency.

In 2010, Earth Advantage Institute and the City of Seattle commenced a pilot project to provide subsidized home energy audits and Energy Performance Score (EPS) labels and recommendation reports to single-family homeowners in Seattle City Light’s (SCL) service territory, which includes Seattle and surrounding suburban areas. These audits, regularly priced at $400, were offered at $95 to qualifying homeowners. We refer to this discounted audit program as the SCL Home Energy Audit Program. SCL originally planned to provide 5,000 audits, though fewer were actually performed. Because the SCL Home Energy Audit Program was already underway, LBNL identified it as an opportunity to inform, in a relatively short period of time, DOE’s efforts to encourage effective audits and effective home retrofits and to establish home energy performance labels. A research collaboration was formed to:

1. Build a deeper understanding of the homeowner decision-making processes in planning and undertaking—or not undertaking—energy retrofits following home energy audits, of audit participant reactions to home energy performance labels, and of participant satisfaction with the audits performed;
2. Compare selected home energy audit modeling tools in terms of energy estimates and upgrade recommendations—the guidance provided to aid homeowner decision-making;
3. Assess what retrofits were actually undertaken and homeowners’ reactions to these retrofits;
4. Investigate the importance of household energy use behaviors relative to house- and equipment-based assessments of home energy performance and upgrade recommendations;
5. Explore real estate professional views on the current and potential future use, and usefulness, of home energy performance ratings in general; and
6. Consider aspects of the quality and uncertainty of recommendations and savings estimates in light of homeowner decision-making.

In addition to the authors’ organizations, several other organizations played an important role in this study, including:

• Seattle City Light, the local electric utility and administrator of the home energy audit program that was the subject of this study;

• Puget Sound Energy, the local natural gas provider;

• City of Seattle; and


• Home Performance Collaborative, a home performance analysis and contractor company.

Each of these organizations was an integral part of this research study, particularly in supporting contact with and data collection from the Seattle households that were the focus of the study.

1.2 Motivation for Research

Home energy audits have been offered in the United States for more than 30 years, with an increase in activity in recent years due largely to federal stimulus funds (Palmer et al. 2011). These audits have generally been designed to overcome what the industry commonly considers to be “barriers” to increasing home energy efficiency, in particular, the perception that homeowners do not have enough information about what to do to increase the energy efficiency of their homes or to solve performance problems, nor about the prospective advantages of these actions.

Despite success stories (e.g., Van de Grift and Schauer 2010), the results of home energy audit programs overall have often been considered disappointing: relatively few households undertake audits, and when they do, upgrade recommendations are often not acted upon (Frondel and Vance 2011; Fuller et al. 2010; Palmer et al. 2011). And though there is clearly remaining technical potential, little has been proven about the overall energy savings that result from audits and the upgrades that follow, or about the extent to which actual social potential can reach technical potential. Rather than continue to ask why homeowners do not act as theory suggests they should, this research is directed toward elucidating the homeowner view and experience. A better understanding here may help expectations for home energy audits better align with what is likely to be achieved.

Our study focused on the perspective of homeowner decision-making in response to home energy audits, combined with attention to the quality of the recommendations that homeowners receive and the perspectives of some key industry actors on auditing and home energy labels. Unlike a program evaluation, the research was not designed to answer detailed questions about program effectiveness in terms of costs, savings, or process, nor was it designed to provide direct answers to questions of how to get people to do more audits or more retrofits. Rather it “steps back” toward a better understanding of more basic questions about what audits provide and what homeowners seem to want, for the case of one particular program that we expect has parallels to many others.

1.3. Home Energy Audits and Tools

Home Energy Audits

Home energy audits are generally designed to inform homeowners about the energy efficiency of their home and about opportunities to improve that performance—particularly through upgrades of the building envelope and major energy systems. This assessment is typically performed by a professional energy auditor who visits the home and assesses the home’s technical characteristics, such as insulation levels and equipment efficiency. Sometimes, additional diagnostic testing is performed to measure air infiltration and identify leakage points and particular areas of heat loss or gain, as well as to evaluate whether combustion equipment is operating safely.


The auditor may or may not utilize an energy modeling tool to simulate the home’s energy usage, suggest upgrades, and calculate savings from these upgrades. Home energy modeling tools may furthermore be used to generate an energy score for a home—typically representing the “asset” performance of the home—how the house would perform under “standard occupant behavior.”

Finally, the home energy audit may or may not be embedded in a larger program structure focusing on supporting participants all the way through the upgrade process, including in some cases, advising participants on what upgrades to perform, connecting them with contractors, and providing quality assurance on completed work. Upgrades may be incentivized or financing may be provided by various mechanisms.

It is evident, therefore, that home energy audits may differ greatly from audit to audit and program to program. We are basing this research upon a single program, which included some of the elements described above, but not others.

Home Energy Ratings or Labels3

As part of this research, we touch on the role of the Energy Performance Score (EPS) home energy rating included as one element of the SCL home energy audits. In general, home energy ratings may be positioned as a key element of a home energy audit, as only one of a number of important elements, or a rating may not be provided with the audit at all. In theory, at least, a home energy rating or score can serve several possible purposes:

 To inform potential home-buyers about the energy efficiency of various homes, and thereby allow them to incorporate energy efficiency in their decision process. In theory, then, more efficient homes will become more valuable, driving greater efficiency throughout the overall housing stock as homeowners invest in energy efficiency as a means to increase the value of their home;

 To qualify homes for energy efficiency financing, mortgages, loans, or incentives;

 To provide homeowners with a score for their home’s performance which can be compared to other homes and, in theory, provide a motivation for homeowners to invest in improving the score of their home so they can do better compared to other homes; and

 To draw people to request an audit so they can find out what their score is.

This research investigates the effect of the EPS on the decision-making of these Seattle homeowners, recognizing that this rating was not heavily marketed and was only one of several interrelated elements of the audit and report. We also asked real estate professionals and audit recipients hypothetical questions about how the ratings might be helpful in the home purchase or sales process. We do not explore the use of ratings in financing, mortgage, loan, or incentive programs, as the SCL Home Energy Audit Program did not include any of these elements.

3 We use “label” to refer to the energy efficiency rating in the context of the real estate market, and “rating” to refer to the assessment more generally (e.g., in communicating overall efficiency to the homeowner).


Home Energy Modeling Tools

Energy modeling tools are often used as part of home energy assessments to 1) apply building science to analyze the physical complexities of the home; 2) generate quantitative scores and savings or payback justification for completing upgrades; and/or 3) standardize and streamline the audit process and the generation of professional reports.

There are many different home energy audit tools available (see, for example, SENTECH 2010), and these vary considerably in their characteristics along a number of dimensions, including:

 Depth/detail of audit or assessment, which may range from a self-inspection to a “walk-through” inspection by an auditor, through detailed technical measurements and diagnostic measurements such as infiltration testing;

 Latitude of auditor/assessor in influencing results;

 Level of training/expertise of auditor/assessor;

 The orientation of the assessment—focusing on the house in isolation (asset) or the house along with how it is operated (operational), or somewhere in between;

 Whether the results provide energy consumption estimates, an asset-based score, and/or upgrade recommendations;

 Whether past energy consumption data are incorporated;

 Applicability to new homes versus existing homes; and

 Whether the tool provides a means for addressing non-energy consumer concerns such as comfort, health, and safety.

The different characteristics of home energy modeling tools mirror differences in home energy audits and programs. For any particular home energy audit program, the choice to use a particular modeling tool, if at all, will depend on the audit program goals, resources, data availability, and participant interests. This study considered three particular home energy auditing tools: EPS Auditor from EAI, Home Energy Scoring Tool from the DOE and LBNL, and Home Energy Saver Pro (HESPro) from LBNL. These tools were selected for comparison on pragmatic grounds, i.e., the ability to use the existing SCL Home Energy Audit Program in Seattle, the project sponsor’s (DOE’s) interest in the Home Energy Scoring Tool and HESPro, and the opportunity to achieve a breadth of comparison covering both asset and operational modeling capabilities. These tools are each described below, with a more in-depth comparison provided in Appendix P. But first, we highlight one important factor for these tools and the outputs they generate—whether they are asset or operational in scope and application.

Operational vs. Asset Assessments

The energy efficiency industry often distinguishes between two different types of home energy assessments: asset versus operational. The definitions and distinctions are not always clear-cut. Asset assessments, like most energy efficiency definitions (e.g., for heating equipment), focus on physical qualities, with energy use assumed to roughly scale up with how much the equipment is used. For homes, asset ratings are produced using standardized assumptions about occupancy and occupant behavior to model the home under “standard operating conditions.” Resulting energy use estimates and energy efficiency upgrade recommendations are based on those standardized assumptions of use. Asset ratings are sometimes compared to “miles per gallon” ratings for automobiles, though the analogy is incomplete.4 Operational assessments may consider the actual historical billing data for the home and use this to calibrate or bound recommendations and savings estimates (BPI 2011; Khawaja and Koss 2007)—or they may include operational variables such as energy use behaviors as model inputs. Though operational ratings are sometimes defined simply as “observed energy use,” the assessment itself can be considerably more complex if it includes a detailed assessment of the technical characteristics of the house along with behavioral data, consumption data, or both.

Asset assessments can be easier to produce since they do not require utility billing data, occupancy data, or behavior data. They are applicable to new dwellings, where no operational data are available, as well as to existing dwellings, and are also user-neutral and thus well suited for comparison across houses. Asset results can therefore provide a standardized rating potentially amenable to the real estate market. Major home energy audit programs, such as DOE’s Home Energy Score program (Home Energy Saver: Engineering Documentation 2012) and the SCL Home Energy Audit Program covered here, have adopted asset ratings.
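
To make the distinction concrete, the following minimal sketch (in Python) contrasts a toy asset estimate, computed from house characteristics plus fixed standard-occupancy assumptions, with an operational calibration factor derived from billed consumption. It is purely illustrative: the functions, field names, and coefficients are invented for this sketch and do not reflect the internals of EPS Auditor, the Home Energy Scoring Tool, or HESPro.

    # Illustrative only: a toy contrast between asset and operational assessments.
    # Coefficients and field names are hypothetical, not those of any real tool.
    STANDARD_OCCUPANCY = {
        "heating_setpoint_f": 68,      # fixed assumptions used for an asset rating
        "occupants": 2.6,
        "hot_water_gal_per_day": 45,
    }

    def asset_estimate_kwh(house, assumptions=STANDARD_OCCUPANCY):
        """Toy asset model: energy use from physical characteristics plus
        standard-occupancy assumptions, so two identical houses get the same number."""
        envelope = house["floor_area_sqft"] * house["ua_per_sqft"]   # shell heat-loss proxy
        heating = envelope * (assumptions["heating_setpoint_f"] - 45) * 0.08
        hot_water = assumptions["hot_water_gal_per_day"] * 365 * 0.18
        baseload = 2500 * assumptions["occupants"] / 2.6
        return heating + hot_water + baseload

    def operational_calibration(asset_kwh, billed_kwh_12mo):
        """Toy operational step: compare the asset estimate to actual billed use,
        giving a factor that could be used to bound or scale savings estimates."""
        return billed_kwh_12mo / asset_kwh

    house = {"floor_area_sqft": 1800, "ua_per_sqft": 0.35}
    asset = asset_estimate_kwh(house)
    factor = operational_calibration(asset, billed_kwh_12mo=9500)
    print(f"asset estimate: {asset:,.0f} kWh/yr; operational calibration factor: {factor:.2f}")

The point of the contrast is that the asset number stays comparable across houses because the assumptions are fixed, while the calibration factor reflects how this particular household actually uses the home.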

The Energy Performance Score (EPS) and EPS Auditor

The EPS is an asset-based home energy rating system. Co-developed by Earth Advantage Institute and Energy Trust of Oregon, the EPS Scorecard estimates actual home energy consumption, related carbon emissions, and utility costs. The EPS Scorecard graphically illustrates a home’s energy score, which is based on total site energy use, and the home’s carbon footprint, and shows homeowners how their home compares to state averages and targets for homes. An example EPS Scorecard is provided in Appendix A. The EPS Scorecard is paired with an EPS Energy Analysis Report, which provides in-depth information on a home’s performance and provides upgrade recommendations, along with estimated costs and energy savings.

An EPS audit draws on two types of inputs to generate an EPS score and improvement recommendations, which are based on model-estimated energy savings for each house audited. First, the model includes building-specific inputs collected during the in-home audit process (the auditor collects insulation levels, air leakage rates, fuel type for heating and cooling systems, etc.). This also includes diagnostic testing: blower door infiltration testing and combustion safety testing where applicable, as well as in many cases, infrared images of various aspects of the home. Second, EPS Auditor leverages assumptions representing standard weather and occupant behaviors.5 Certain other large end uses6 are not modeled in detail in EPS Auditor, but are captured within a single input to reflect approximate energy usage in the score.

4 Whatever the unit of home use, it is more complex than a mile.
5 Assumptions are used for factors impacted by homeowner behavior like thermostat and water heater temperature settings and energy intensity factors for laundry, domestic hot water, entertainment electronics, and cooking.

By drawing on this modeling approach, EPS audits produce an asset rating based on the physical characteristics of the home. Because the rating reflects a standard set of energy use behaviors rather than the actual energy use behaviors of the occupants, it allows for a direct comparison between the estimated energy use and energy savings of the rated home and an average home. To provide Seattle customers with a comparison of their home against an average Seattle home, SCL provided Earth Advantage Institute with data on typical energy consumption for single-family detached homes.

The EPS Energy Analysis Report includes asset-based energy use and fuel cost estimates by major end use group before and after recommended upgrades, a description of the performance of ten home energy elements (e.g., ducts, water heating), and a list of recommended upgrades along with typical cost ranges and annual savings ranges. Typically, the EPS Energy Analysis Report provides a single number for the estimated fuel use after upgrades and for the approximate annual savings in dollars. However, for its Home Energy Audit Program, SCL requested that the EPS Energy Analysis Report provide ranges instead of a single number for estimated fuel costs after upgrades and for the approximate annual savings in dollars after upgrades. This range was approximately 30% more or less than the single figure calculated by EPS Auditor and sometimes led to a low-end estimate of zero for fuel costs and savings.
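
As a rough illustration of how such a presentation range can produce a low end of zero, the sketch below widens a point estimate by roughly 30% in each direction and rounds to a reporting increment. The $50 increment and the rounding rule are assumptions made here for illustration only; the actual rounding used in the SCL reports is not documented in this section.

    # Hypothetical illustration of a +/-30% presentation range around a point estimate.
    # The $50 rounding increment is an assumption; actual report rounding rules may differ.
    def presentation_range(point_estimate, band=0.30, increment=50.0):
        """Return (low, high) after widening the estimate by +/-band and rounding
        to the nearest reporting increment."""
        low = round(point_estimate * (1 - band) / increment) * increment
        high = round(point_estimate * (1 + band) / increment) * increment
        return low, high

    print(presentation_range(30))    # small savings estimate -> (0.0, 50.0), a $0 low end
    print(presentation_range(400))   # -> (300.0, 500.0)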

In practice, most SCL audit reports provided homeowners with a second, overlapping set of upgrade recommendations. Auditors were encouraged by SCL to use the “General Notes” section of the report to call out any safety issues, prioritize upgrade recommendations, and/or to bundle upgrade recommendations into feasible phases. This was a deliberate program element to maintain the integrity of a comparable asset-based home energy label (the EPS), which was assumed to be of primary interest to homeowners planning on selling their home in the near future, while also providing unique upgrade and behavioral recommendations assumed to be more relevant to homeowners seeking to make upgrades for reasons beyond home value. Throughout this report we make a distinction between these two different sets of recommendations as a means of considering the degree of customization provided by the auditor, and how that customization is or is not enabled in the audit process and by the audit tool:

 “Standardized” recommendations: As part of the audit protocol, the auditor rated the home on ten energy performance elements on a scale from “Very Poor” to “Excellent.” Auditors were trained to provide upgrade recommendations for any category where a home’s performance was rated “Poor” or “Very Poor,” as well as for any other upgrades that they determined would be of use and of interest to the homeowner. The auditor selected these upgrade recommendations from a set list provided in the EPS tool. While the auditors could use their discretion as to rating the performance of each home element, the recommendations themselves were standardized—the text of the recommendation and the model-generated cost and savings estimates could not be modified. This mechanism for defining upgrades reflects the necessity that, to utilize model-based estimates for savings and costs, the recommendations must be relatively standardized.

o Additional upgrade details and deep retrofit options: For each of the “standardized” recommendations the auditor chose to include in the report, they were encouraged to provide additional upgrade details on a dedicated page of the report. Auditors often provided information on how to go about making the upgrades, or on areas of the house that needed particular attention. In this section of the report, there was also the opportunity for the auditor to provide “deep retrofit” options for those homeowners who wanted to take additional steps to upgrade their home. Neither the additional upgrade details nor the deep retrofit options were treated in most of our analyses, which focused on the “standardized” recommendations and the “auditor customized” recommendations discussed below.

 “Auditor customized” recommendations: As part of the audit process, most auditors delivered a personalized note to homeowners, within the “General Notes” section of the report. Within this note auditors often provided specific information on upgrades that they recommended the homeowner complete. These recommendations were most often prioritized, but rarely included cost or savings numbers. This type of customization was encouraged within the SCL Home Energy Audit Program, as a means for providing SCL’s customers with detailed and specific information. While auditors were encouraged not to contradict the “standardized” recommendations list, the two sets of recommendations were not necessarily consistent.

6 The other large energy uses included in EPS Auditor are dehumidifiers, hot tubs, pool pumps, pool heating systems, sewage and well pumps, saunas, steam showers, and elevators.

The Home Energy Score and the Home Energy Scoring Tool

The Home Energy Score is an asset-based home energy rating developed by DOE and LBNL. Professional home performance auditors use the Home Energy Scoring Tool to assess a home’s major energy systems and envelope and provide an asset rating. In addition to the score, which ranges from 1 to 10, homeowners receive a summary of their home’s characteristics and recommendations for home improvements (Glickman 2011). An example Home Energy Score report is included in Appendix C. Auditors using the Home Energy Scoring Tool conduct a walk-through audit where they collect about 40 data points on the house and equipment, which takes about an hour. The Home Energy Score program has been positioned as a quick way for homeowners to get a score and upgrade recommendations for their home, and as a possible precursor to a more comprehensive home energy audit (Home Energy Score 2012).

The Home Energy Scoring Tool is a web-based home energy modeling and reporting tool that can be used only by trained and certified auditors to generate the Home Energy Score asset rating for the modeled house. This tool leverages a set of “standard occupant behavior” assumptions, along with a limited set of house measurements, to generate an asset score that can be compared across different houses. While the Home Energy Scoring Tool will accept blower door air infiltration measurements, this and other diagnostic testing is not required to use the tool and is not currently marketed as part of the Home Energy Score Program (Home Energy Score 2012). The Home Energy Scoring Tool generates scores and upgrade recommendations based on the house measurements input by the auditor, who otherwise does not influence what recommendations are generated.


This modeling tool is based on the same DOE2.1E hourly simulation platform used for the Home Energy Saver and HESPro, two other web-based tools developed by LBNL, and leverages many of the same calculations and assumptions as these other tools. According to the Home Energy Scoring Tool engineering documentation (Home Energy Saver: Engineering Documentation 2012), homeowners who have received a Home Energy Score assessment will be able to import their model results into Home Energy Saver or Home Energy Saver Pro to adjust their asset estimates and recommendations based on model inputs regarding how they operate their home.

The Home Energy Scoring Tool, as an asset tool, excludes certain other large energy uses, such as hot tubs or pool pumps, as well as appliance characteristics such as washing machine or dishwasher efficiencies. These appliance characteristics are covered by assumptions, while the tool assumes these other large energy uses are not present when calculating scores.

Home Energy Saver Pro

HESPro was developed by LBNL as a publicly available web-based tool adaptable to a variety of uses, including homeowner self-assessment as well as use by professional auditors. This tool does not generate a rating or score, but instead allows the user the flexibility to assess a home at whatever level of input detail they have available, filling in the remaining inputs with defaults based on typical house technical characteristics and typical behaviors. The tool includes inputs for technical house characteristics, including some configurability not available in the Home Energy Scoring Tool, as well as appliance efficiency inputs and inputs for a variety of basic household energy use behaviors, including appliance, lighting, thermostat settings, and small electronics usage. HESPro also includes the ability to model certain other large energy uses, viewed as external to the “asset” perspective, such as pool pumps or hot tubs. HESPro utilizes the DOE2.1E simulation platform to calculate hourly heating and cooling loads based on typical weather conditions. Tool outputs include a set of recommendations for the home, with more detailed cost, savings, and payback estimates than provided in the Home Energy Scoring Tool, as well as a wider range of possible upgrade recommendations reflecting the wider range of HESPro inputs. Additionally, HESPro generates a fairly detailed set of energy use estimates by various major end uses and includes reporting functionality, with some ability of the user to modify upgrade recommendations.

1.4 The Seattle City Light Home Energy Audit Program

To help meet Seattle’s goal of becoming the nation’s green building capital, in 2009 the city identified a number of related initiatives for energy efficiency in commercial and residential buildings. As one of these initiatives, the City of Seattle partnered with SCL to develop and implement a Home Energy Audit Program for SCL customers.

The program, still ongoing as of the publishing of this report, offers Seattle residents living in single-family homes a discounted home energy audit, including a home performance rating. Goals of the program include establishment of a standard rating for home energy performance through a relatively simple and easy to understand tool; increasing energy-related information available to home occupants; education of a wide range of stakeholders on energy efficiency and home performance rating; helping residents identify opportunities for efficiency gains; distributing free compact fluorescent lamps (CFLs) to participating homeowners; encouraging voluntary home upgrades; and facilitation of such a rating into real estate listings and appraisals. SCL chose Earth Advantage Institute’s EPS Auditor software tool and its EPS home energy label to rate homes in the pilot project.

On Earth Day 2009 (April 22), Seattle Mayor Gregory Nickels introduced his Green Building Capital Initiative to reduce greenhouse gas emissions associated with buildings and homes in Seattle and announced that as part of the program, SCL would launch the Home Energy Audit Program. Within the study period from June 2010 through October 2011, 1,355 SCL home energy audits were completed. A small number of audits were completed prior to June 2010 under a pilot process.

SCL Home Energy Audit Program—the Process

An overview of the SCL home energy audit process is presented in Figure 2.

Figure 2: Overview of the Seattle City Light home energy audit process

[Figure 2 depicts the process flow: interested homeowners sign up; an auditor conducts a home visit using the home energy audit tool; the audit outputs (score and label, house conditions, upgrade recommendations) go to the homeowner; and upgrades are completed by the homeowner, along with energy use behavior changes. The figure also shows real estate professionals, real estate transactions, and eligible households.]

To participate in the program, homeowners sign up at the SCL website. SCL issues each homeowner a coupon entitling them to a $400 audit for $95. Along with the coupon, homeowners are provided with a list of program-qualified home energy auditors and instructed to select an auditor and correspond with the auditor directly to set up an appointment for the home energy audit. At the audit appointment, the homeowner provides the auditor with payment for the $95 discounted cost of the audit.

SCL’s EPS audits require an assessment of the whole house, including attic, walls, windows, foundations, ducts, and heating and cooling system(s), as well as measurements of air leakage. Also, auditors performing audits for the SCL Home Energy Audit Program were asked to make several additional measurements for both SCL and for this research project, significantly adding to the audit time. If homeowners wished, auditors also replaced incandescent light bulbs in their home with CFLs, at no cost to the homeowner. As a result, the in-home portion of the EPS audits took three to four hours to complete; the measurements (other than blower door testing) necessary to calculate the EPS score would typically take about an hour (Earth Advantage Institute and Conservation Services Group 2009). An adult in the participating household was required to be at home during the audit to provide the auditor access to the home and show the auditor how to access areas like attics and crawl spaces. In addition, as discussed in Chapters 3 and 4, interaction during the audit process plays an important role in auditors’ communication with homeowners.


Upon completing the audit of the home, auditors often have a conversation with homeowners about initial findings and inform homeowners that in the coming days they will receive an email indicating how to download their EPS Scorecard and in-depth report. After completing the in-home portion of the audit, auditors finalize all of the audit information, include relevant pictures (such as thermal images), and add recommendations for upgrades to generate a standardized EPS Scorecard and an EPS Energy Analysis Report for the homeowner. Homeowners then receive an email that their scorecard and report are available for download on the EPS website. The technical information collected in the audit, along with model estimates, scorecard information, recommendations, and any additional comments or guidance from the auditor, was archived in a database that was subsequently accessed by the project research team. Additionally, SCL conducts regular quality assurance checks on all program EPS Scorecards and Energy Analysis Reports and works with auditors to update reports that do not meet SCL’s standards for comprehensiveness and accuracy. Earth Advantage Institute also conducts regular quality assurance checks of audit results.

As many auditors are also home performance contractors, EPS auditors can recommend their own contracting services to the homeowner for any planned upgrades. However, SCL requires that if auditors recommend their own services to the homeowner, they must also make homeowners aware that they have the option to work with other contractors. The SCL Home Energy Audit Program does not provide homeowners a list of contractors or support in finding one. Customers who heat with natural gas are given information on rebates available from Puget Sound Energy, the local gas supplier. Customers qualifying for the federally funded Community Power Works program are given that information.

In addition, the SCL Home Energy Audit Program was not designed to maximize participation, was not heavily marketed, and was not intended to closely tie in various wrap-around elements that are known to help boost upgrade completion, such as financial incentives, direct links to contractors, or providing “energy advisors” or “energy advocates” to help homeowners through the process (see, for example, Van de Grift and Schauer 2010). Rather, the SCL program represents a fairly evolved home energy audit program that provides homeowners with extensive information, a detailed assessment of the energy characteristics of their home from an asset perspective, and high quality, well-trained, and often quite experienced auditors.

Auditor Training

To qualify as a participating auditor for the SCL Home Energy Audit Program, auditors are required to show proof of current BPI (Building Performance Institute, Inc.) certification and to complete EPS Auditor training. The auditor training is conducted by Earth Advantage Institute and SCL, and consists of two full days of training: one full day of classroom training and one day of a site visit and a comprehension exam. The first training was conducted in April 2010 and then periodically after that, depending on demand.


2. Approach

2.1 Overview of Research Approach

To address the research objectives described in Chapter 1, the project team drew from existing streams of data gathered as part of Seattle City Light’s (SCL) Home Energy Audit Program and developed a variety of additional data streams. The research focused on homeowners but also included conversations with home energy auditors and real estate professionals to more fully understand audits and home energy labels. These data streams are further described in Table 2 below.

Table 2: Overview of research data streams

Homeowners and households
  Method(s) of data collection: Phone interviews and surveys; mail and web-based energy use behavior surveys; electric and natural gas billing data provided by utilities.
  Contribution: Homeowner perspectives on audits and upgrades; self-reported energy use behavior data used in combination with house technical data to evaluate the importance of behavior relative to house and equipment factors in home energy performance; utility-reported energy use data for many homes.

House
  Method(s) of data collection: Technical measurements of the house collected by home energy auditors during the site visit; audit reports; re-measurement (after upgrades were completed) of technical house data, and inspection of completed upgrades.
  Contribution: Technical characteristics of the house; re-measurement of many technical characteristics of the house; details on the quality of completed upgrades.

Home energy auditors
  Method(s) of data collection: Phone interviews.
  Contribution: Auditor perspective on homeowner decision-making, audit process, audit modeling tool, and program.

Real estate professionals
  Method(s) of data collection: Phone interviews.
  Contribution: Real estate professional perspective on how home energy performance currently fits into the buying/selling process and how it could fit in the future.

Home energy modeling tools
  Method(s) of data collection: Model runs for a subset of audited Seattle homes, using the Home Energy Scoring Tool and Home Energy Saver Pro, as well as emulators replicating these tools.
  Contribution: Scores, energy use estimates (total, by utility, and by end use), and upgrade recommendations from each of the tools; results from the original EPS Auditor modeling that was completed for the audit reports.

Our approach is distinct from most previous research efforts in that we do a relatively deep dive on the program participant population, combining discussions with participant homeowners, technical data on house characteristics, modeling results, and billing data. The study includes both surveys and interviews of home energy audit recipients in SCL’s Home Energy Audit Program at multiple points in the audit process, focusing on various issues including motivation, decision-making, retrofit completion, and energy use behavior; analysis of house characteristics from the audit results modeled in home energy audit tools (including EPS Auditor, used for SCL’s audits); utility-reported natural gas and electric use; and data from repeat auditor visits to a set of homes after retrofits were completed.

By looking across these diverse perspectives and drawing on multiple points of contact with homeowners, we assemble a more complete picture of the factors affecting homeowner motivation and decision-making, and of the program in general, than has been achieved in most previous research efforts.

All audits were conducted in the SCL service territory, and stakeholders were active in the Seattle vicinity. All data used were collected between June 2010 and October 2011, except for historical utility usage data. The following sections outline the basic data collection procedures used for each of these groups.

2.2 Home Energy Auditors—Data Collection and Sample

Auditor interviews were conducted to learn more about the auditors and the audit process. In addition to characterizing the auditors, research questions addressed during the interviews included:

 How do auditors perceive their customers? What do they see homeowners’ wants, needs, and motivations to be?

 What happens during the home energy audit?

 What do auditors think of the EPS Scorecard and Energy Analysis Report?

 Do auditors customize their recommendations to fit what they see as customer needs?

Interviews followed the guide included in Appendix D.

From an initial list of 42 auditors, we selected 16 as being the most active in the SCL program. From these 16 auditors, Research Into Action completed 12 interviews, representing 11 different companies.

2.3 Real Estate Agents—Data Collection and Sample

Real estate agent interviews were conducted to provide insight into these respondents’ perspectives on home labeling in general, how energy labels impact the home buying process, how energy efficient features are marketed during home sales, and what buyers and sellers focus on during the sales process. Interviews followed the guide included in Appendix F.

Between May and July 2011, Research Into Action interviewed 12 real estate professionals (REPs) who work in the Seattle area and had attended one of Earth Advantage Institute’s Sustainability Training for Real Estate Professionals (S.T.A.R.) courses. The majority of the agents who had attended the trainings came from large real estate companies; the sample included multiple respondents from two of these companies, each from a different office.

2.4 Homeowners, Households, and Houses—Characteristics, Data Collection, Sample, and Analysis

Overview of Methods

SCL’s Home Energy Audit Program is described in detail in Chapter 1.4. This program was voluntary and minimally marketed, was made available to the entire SCL service area, and had an objective of reaching 5,000 homes.

Participation was limited to single-family homeowners.7 By design, participation in the SCL Home Energy Audit Program was on a self-selection basis, and participating required a substantial time commitment and a modest financial commitment—making it likely that participating homeowners were quite unlike the population of Seattle as a whole.

Of the approximately 171,750 SCL customers in single-family homes, 1,355 undertook an SCL home energy audit between June 2010 and October 2011, the dates of focus of this research.8 This amounts to a program-specific participation rate of 0.8% of eligible customers for that time period. 9 For the homeowner interviews and surveys, homeowners with addresses covered by the Community Power Works program10 were excluded at the request of the City of Seattle.
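
For reference, the participation rate cited above follows directly from the reported counts; the short check below uses the 343,500-household figure from footnote 8 and the single-family share implied by the 171,750 eligible-homes estimate.

    # Quick arithmetic check of the participation figures reported in the text.
    total_scl_households = 343_500                           # SCL estimate of households in its territory (footnote 8)
    single_family_share = 171_750 / total_scl_households     # share implied by the reported figures (0.50)
    eligible_homes = total_scl_households * single_family_share
    audits_completed = 1_355                                 # SCL home energy audits, June 2010 - October 2011

    participation_rate = audits_completed / eligible_homes
    print(f"{eligible_homes:,.0f} eligible homes; participation rate ~ {participation_rate:.1%}")
    # -> 171,750 eligible homes; participation rate ~ 0.8%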

A large portion of our research activities focused on these houses and households that received an SCL home energy audit. Figure 3 provides an overview of the various house- and household-focused data streams we collected. In sequence of activity, we completed the following interview and survey steps:

 Pre-Audit Interviews of a small number of households who had signed up for the audit but who had not yet undertaken one, to capture early expectations and motivations.

7 While this was a stated requirement for receiving the audit discount from SCL, screening appeared to primarily be left to the auditor. A small number of audits appear to have been completed on non-single family homes, including townhouses, apartments, duplexes, or houseboats, based on comparison to publicly available property assessment records. It is not clear how many households who signed up for the audit were screened out.
8 The number of SCL customers in single-family homes was calculated based on SCL’s estimate of 343,500 total households in SCL territory combined with the census estimate of the percentage of these that were single-family homes (see Tachibana 2010). According to data collected by Earth Advantage Institute, several hundred SCL customers undertook audits under programs other than the SCL Home Energy Audit Program, so the total audit participation rate for utility customers is higher.
9 The program experienced a large initial interest and number of households signing up, creating a backlog that took six months or longer to clear. A number of those initially signing up did not end up getting an audit. A proportion of those signing up later, after the backlog was cleared, also did not complete audits. Because of duplicate sign-ups and this initial backlog, we cannot estimate this rate of conversion.
10 The Community Power Works program area included approximately 20% of SCL Home Energy Audit participant addresses.


 Post-Audit phone Surveys of households who had completed an audit, to capture reactions to the audit, auditor, and recommendations, and any early retrofit activity.

 Post-Audit Interviews of a small number of households, covering similar topics to the Post-Audit Surveys but more in-depth and interactive.

 Retrofit phone Surveys of households, on average eight months after the audit, for detailed information on retrofit activity and cost, homeowner assessment of the process and results, and plans for doing further work.

 Web- and paper-based Energy Use Behavior Surveys, fielded any time after the audit, collected detailed information on various energy using practices, such as thermostat setting and hot water use, for the purpose of assessing how much house operation generally, and household behaviors specifically, affect energy usage and modeling tool estimates and recommendations.

In addition to these discussions with homeowners, we incorporated in our analyses a number of other data streams on the technical aspects of the house, the audit, and energy use:

 Portfolio of data on technical measurements made on each house during the course of the audit, including basic house size and configuration, equipment details, blower door results, and dozens of other technical measurements.

 Audit findings for each house, including recommendations, savings estimates, and auditor notes.

 Electricity and natural gas utility data for a subset of houses, for up to three years prior to the audit, with some post-audit data, as described below.

 Post-retrofit assessment data, gathered through site visits of independent auditors to 50 homes reporting that they had completed upgrades, in order to assess retrofit quality and measurement uncertainty.


Figure 3: House and household data streams collected in this research

In summary, our research sample, drawn from Seattle single-family homes that received an SCL home energy audit, is described in Table 3.

Table 3: House and household sample

Sample | Number of Households
Eligible homes (single-family homes in SCL territory) | 171,750
Received an SCL home energy audit between June 2010 and October 2011 | 1,355
Technical house measurements and audit report available for analysis | 1,355
Completed at least one phone survey or interview | 286
Completed the Energy Use Behavior Survey | 346
Complete utility billing data available for analysis | 251
Post-Retrofit Assessment completed | 50

Characteristics of Houses and Households

Homes Receiving a Home Energy Audit and All Eligible Homes

Table 4 summarizes basic characteristics of houses audited between June 2010 and October 2011, and comparative data for single-family houses in the SCL service territory, based on a recent customer characterization study (Tachibana 2010). The characteristics of audited houses are roughly similar to those for the SCL service territory, though audited houses tend to be somewhat older and larger: as shown in the table, homeowners living in smaller houses (under 1,500 square feet) and those in newer houses (built 1980 or later) were less likely to undertake an audit than others. Compared to the housing stock nationwide, Seattle single-family houses tend to be older—only 19% of housing units nationwide were built before 1950,11 compared to 50% of single-family houses in the SCL survey. Also, central air conditioning is uncommon in Seattle houses—fewer than one in ten have central air conditioning.

Table 4: Characteristics of audited households as compared to SCL Customer Characteristics Survey results

Characteristic | Audited Houses12 | Seattle City Light Single Family Houses13
Age of House: Built before 1950 | 61% | 50%
Age of House: Built 1980 or later | 5% | 12%
Floor area: Under 1,500 square feet | 25% | 41%
Floor area: Over 2,500 square feet | 26% | 18%
Primary Heating Fuel: Electricity | 17% | 22%
Primary Heating Fuel: Fuel Oil | 17% | 17%
Primary Heating Fuel: Natural Gas | 63% | 58%
Secondary Heating Equipment Present | --14 | 79%
Programmable Thermostat Present | 82% | 61%
Water Heating Fuel: Electricity | 49% | 54%
Water Heating Fuel: Natural Gas | 51% | 46%
Air Conditioning Equipment | 7% | --15

Demographic Characteristics of Households Responding to Surveys

We asked survey respondents a minimal set of demographic questions. Some program participants living in areas in which another SCL-supported home energy audit program (the Community Power Works program) was being conducted were exempted from surveying, at the request of the Community Power Works program. This program covered zip codes in the south of the city with generally lower area-average incomes. The survey-eligible population was thus not a statistically representative subset of the participants in the SCL Home Energy Audit Program and may tend to omit lower-income participants.16

11 2010 American Community Survey 1-Year Estimates Table DP04, U.S. Census Bureau.
12 With the exception of the data for presence of programmable thermostat, statistics are based on audit results for 1,355 houses between June 2010 and October 2011. The programmable thermostat statistic is based on the Behavioral Survey results.
13 Survey respondent characteristics in SCL Residential Customer characterization (Tachibana 2010).
14 Some data on secondary heating systems were collected, but the data were not complete and did not include portable heating.
15 The SCL customer characterization included portable cooling equipment and fans; 14% of single-family residences reported air conditioning of some kind. The home energy audit survey data included only wall and central air conditioners.

Table 5 provides a comparison of demographic characteristics of survey respondents to U.S. Census data for Seattle overall and to respondents in a recent SCL customer characteristics study (Tachibana 2010). Overall, survey respondents are wealthier, more educated, and older than typical for Seattle. Surveyed individuals had nearly three times the rate of professional, master’s, and doctoral degrees (57% of the participant population versus 21% of Seattle), and a much lower proportion had not attained a BS (9% in the participant population versus 46% in Seattle), compared to Seattle as a whole.17 Participating households also had higher incomes than typical in the area, with over half (57%) reporting 2010 household income over $100,000, and 23% reporting 2010 household income over $150,000. Despite this overall “elite” tendency, there was clearly demographic heterogeneity in the participating population as well. For example, while 27% had children under 18 at home, 19% had at least one household member over 65, and even with the Community Power Works area participants excluded, 12% of participants reported annual income less than $50K. Sanquist et al. (2010) found, in a literature review of home energy audit programs, that high income and education levels are typical of such programs.

Pre-Audit Utility Usage Characteristics for Participating Households and for Eligible Homes

Though a detailed analysis of changes in energy use over time was beyond the scope of our project, we completed two types of simple comparisons with available natural gas and electricity billing data from homes receiving an SCL home energy audit (the participant group), as well as with electricity billing data provided by SCL for a comparison group of 1,916 homes, though we did not have details on the characteristics of the comparison group homes. First, for each participant group household for which data were available, we compared 12-month energy use over available years, separately for electricity and natural gas. Second, we compared changes in electricity usage for the participant group to those for the comparison group. This comparison was intended to help understand the extent to which there may have been distinct patterns of change in energy use among participants that might suggest shared motivations with respect to energy efficiency (e.g., if energy costs had recently spiked).

Participant Group Comparisons. For the participant group comparisons, we aggregated the data into consecutive 12-month periods with the last period ending just before the home’s audit. For most participants, this allowed three years of comparison: the 12-month period right before the audit, a 12-month period just before that (“second year prior to the audit”) and the preceding 12-month period

16 Approximately 20% of participants in the SCL Home Energy Audit Program resided in areas eligible for participation in the Community Power Works program.
17 Data for Survey Respondents and SCL Customer Study are for single-family homes, while data for the American Community Survey are for all house types, not just heads of households or single-family homes. Data reported for project survey respondents (first column) are for the Post-Audit Survey only.


Table 5: Demographic characteristics of survey respondents compared to the community overall

Category | Survey Respondents [Data Year 2010] | Seattle City Light Customer Study Survey [Data Year 2008/2009] | American Community Survey (Seattle) [Data Year 2005-2009]
INCOME
Median Household Income | $100-$150K18 | N/A | $59K (2009)
Household Income $50K or less | 12% | 27% [under $45K] | 43%
Household Income $100K or higher | 57%19 | 31% | 27%
Household Income $150K or higher | 23% | N/A | 13%
EDUCATION
Education: Less than BS | 9% | N/A | 46%20
Education: Master’s, Doctorate, or Other Professional Degree | 52% | N/A | 21%
AGE/HOUSEHOLD LIFESTAGE
Respondent Age | Median age: 50 years; 44 or younger: 29%; 65 or older: 17% | 44 or younger: 28%; 65 or older: 21% | N/A
Percent with somebody 65 or older | 19% | N/A | 14%
Percent with children under 18 in the home | 27% | N/A | 20%

(“third year prior to the audit”). This allowed two successive year-on-year comparisons of electricity usage for the participant group, with a full three years of electricity data prior to the audit available for 699 participants. Less data were available for natural gas: we had data for 146 households one year prior to the audit and for 47 households two years prior to the audit.

Electricity consumption for individual households varied markedly from year to year. The average electricity consumption for the 12-month period before each participant household’s audit was 10,833 kWh. For these participant households, comparing electricity use for the year just prior to the audit to the second year prior, 40% had electricity consumption that was more than 10% different from the year before—with 22% showing more than a 10% reduction (and thus 18% a more than 10% increase) and 10% showing more than a 20% reduction. The high year-to-year variability in many households’ electricity consumption underscores the difficulty of fairly attributing changes in energy use to retrofits, which even ideally may often result in reductions of 10% or less. This comparison does not take into account end uses or the contribution of weather to variability, but note that nearly as many households showed an increase as a decline.

18 Income data were collected as a range, to increase response rate. Of households reporting income, 38% reported incomes lower than $100K for 2010.
19 The 13% of cases in which the respondent declined to report income are excluded.
20 Note that this figure is for all adults rather than heads of households in single-family homes.

For natural gas, we had data to compare natural gas consumption for one year prior to the audit to two years prior to the audit for 47 households. Consumption in the second year averaged 757 therms versus 697 therms for the previous year, an 8% increase (pairwise t-test p=0.001), likely weather-related. No comparison group data on natural gas use are available. The degree of change in individual household consumption from one year to the next was similar to that for electricity, in that 38% had natural gas consumption more than 10% different between the two years (one year prior to the audit versus two years prior to the audit), and 15% showed differences of more than 20%.
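
The sketch below shows one way the 12-month aggregation described in this subsection could be implemented: monthly bills are summed over consecutive 12-month windows ending just before each home's audit, and year-over-year percent changes are computed from those totals. The data layout and field names are hypothetical, and this is not the code used for the study.

    import pandas as pd

    # Minimal sketch with hypothetical data: monthly billed kWh for one home,
    # plus that home's audit date. Field names are invented for illustration.
    bills = pd.DataFrame({
        "home_id": [1] * 36,
        "month":   pd.date_range("2008-07-01", periods=36, freq="MS"),
        "kwh":     [900] * 12 + [850] * 12 + [800] * 12,
    })
    audit_date = pd.Timestamp("2011-07-01")

    def annual_totals(home_bills, audit, years=3):
        """Sum billed kWh over consecutive 12-month periods ending just before the audit.
        Index 0 is the year immediately prior to the audit."""
        prior = home_bills[home_bills["month"] < audit]
        totals = []
        for y in range(years):
            end = audit - pd.DateOffset(years=y)
            start = audit - pd.DateOffset(years=y + 1)
            window = prior[(prior["month"] >= start) & (prior["month"] < end)]
            totals.append(window["kwh"].sum())
        return totals

    totals = annual_totals(bills[bills["home_id"] == 1], audit_date)
    yoy_change = (totals[0] - totals[1]) / totals[1]   # year prior vs. second year prior
    print(totals, f"{yoy_change:+.1%}")                # e.g., [9600, 10200, 10800] -5.9%

Repeating this per household and tabulating the share whose year-over-year change exceeds 10% or 20% in either direction reproduces the kind of variability summary reported above.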

Participant Group versus Comparison Group. We also looked at whether the patterns in electricity use across time—the few years prior to the audit—for the participant group were much different than those for the comparison group. The differences appear minor: there were small declines in each group’s consumption over the three years, slightly more so for the comparison group than the participant group, and, in accordance with a rate change, slightly greater increase in bills for the participant group than for the comparison group. In particular, looking at changes in electricity consumption for the participant group as a whole, a pairwise t-test indicates no statistically significant difference in electricity consumption for the year ending just prior to the audit versus the previous year (p=0.80), and a small (3%) statistically significant decline over time comparing the previous pair of years (two years vs. three years prior to the audit, p=0.001). The annual bill for the year just prior to the audit was on average $97 higher across the participant group than for the previous year (pairwise t-test, p<0.001), with no statistically significant change in bills from the third year prior to the audit to the second year prior. SCL residential electricity rates increased 14% in 2010 and 4% in 2011. At 5.6 cents/kWh on average, these rates are 40% lower than the US average.21 We also computed changes over time for the comparison group, fixing the “end” date as mid-August 2011.22 Comparison group electricity use decreased 2.6% between the third year and second year prior to the end date, and 2.2% between the second year prior and the year prior to the end date, with statistical significance in both cases based on a pairwise t-test. For the comparison group, bills increased, though by somewhat less on average than for the participant group.
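
The pairwise t-tests referenced in this subsection are standard paired comparisons of the same homes' consumption in adjacent 12-month periods. The sketch below shows the form of such a test using scipy on simulated data; the mean and sample size are loosely based on the figures above, but the data themselves are made up.

    import numpy as np
    from scipy import stats

    # Illustrative paired t-test on year-over-year electricity use for the same homes.
    # Simulated data only; the study's actual consumption data are not reproduced here.
    rng = np.random.default_rng(0)
    year_prior = rng.normal(10_833, 3_000, size=699)                   # kWh, year ending just before the audit
    second_year_prior = year_prior * rng.normal(1.00, 0.12, size=699)  # same homes, previous 12 months

    result = stats.ttest_rel(year_prior, second_year_prior)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")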

Coordination of Data Collection with the Home Energy Audit Process

The home energy audit process, the timing of which was arranged by the homeowner, drove coordination for all other data collection activities. Participating households typically progressed through the home energy audit process and the various potential contact points as depicted in Figure 4.

21 http://www.seattle.gov/light/news/issues/RateProc/Docs/Fact%20Sheet.pdf
22 The end date will affect the results. We did not produce comparisons synching the start month of the comparison group to the multiple start months for the participant group.


Figure 4: Home energy audit process and data collection activities

[Figure 4 shows the data collection activities arranged along the audit timeline, from audit sign-up through audit completion: Pre-Audit Phone Interview (N=33), Post-Audit Phone Interview (N=31), Post-Audit Phone Survey (N=134), Retrofit Phone Survey (N=159), Post-Retrofit (Test-out) Assessment (N=50), Energy Use Behavior Survey (paper or web based, N=346), and Complete Utility Data Received (N=251).]

We invited homeowners participating in SCL’s Home Energy Audit Program to voluntarily complete one or more phone surveys or interviews, as well as an Energy Use Behavior Survey administered on paper or online. The phone surveys and interviews targeted three different stages of the homeowner audit and retrofit process: before the audit (Pre-Audit Interviews); after the audit is completed (Post-Audit Interviews and Post-Audit Surveys); and later after the audits, after homeowners were given the opportunity to (but did not necessarily) complete retrofits (Retrofit Surveys). While we invited most households who completed an audit to complete the Energy Use Behavior Survey, we recruited homeowners to participate either in phone surveys or interviews, and not both.

The phone surveys included open-ended as well as closed-ended questions, and were fielded to a larger set of homeowners, while the phone interviews were more open-ended, and were designed to collect in-depth information from a smaller subset of homeowners.

Finally, we offered a Post-Retrofit Assessment to some participants who reported having completed retrofits recommended in their audit report. This test-out assessment generated an updated EPS Scorecard to reflect the work that was completed, and provided data on the extent and quality of the retrofits completed. SCL and Earth Advantage Institute provided the research team with details from the home energy audits for analysis. This audit data included the technical measurements of the audited homes, as well as upgrade recommendations, energy use and savings estimates, and other details from the audit report. The home energy auditors collected the data and generated the reports.

Data Collection Details for Homeowners and Houses

Homeowner Phone Surveys and Interviews

Research Into Action conducted the in-depth Pre-Audit and Post-Audit phone Interviews. Thirteen participants were interviewed at two points in the process: after signing up for the audit but prior to completing it (Pre-Audit Interview) and soon after the audit (Post-Audit Interview).


The Portland State University (PSU) Survey Research Lab conducted the Post-Audit and Retrofit phone Surveys. Phone surveys sampled participants at two points in the process: soon after the audit (Post- Audit Survey), and longer after the audit (termed the Retrofit Survey because the questions focused on upgrade completion and decision-making). A subset (58) of the survey respondents completed both the Post-Audit and the Retrofit Surveys. Table 6 plots out the longitudinal and single interview and survey responses, while the details of each data collection point are described in more detail below.

Table 6: Number of completed homeowner interviews and surveys by category

Phone Interviews
  Pre-audit only (Pre-Audit): 20
  Longitudinal (Pre-Audit and Post-Audit): 13
  Post-audit only (Post-Audit): 18
Phone Surveys
  Post-audit only (Post-Audit): 76
  Longitudinal (Post-Audit and Retrofit): 58
  Retrofit only (Retrofit): 101

The number of completed phone surveys and interviews was limited by the number of program participants,23 some early problems in collecting contact information,24 as well as the exclusion of households in the Community Power Works program territory.

Pre-Audit Interviews—Pre-Audit Interviews were drawn from households who had signed up for a home energy audit but had not yet received the audit. As households typically received an audit soon after signup (often in two weeks or less),25 sample contact was very time-sensitive. Pre-Audit Interview questions focused on homeowner motivations for signing up for the audit, awareness of audit features (including the EPS Scorecard), perceptions of their homes, and intent to complete upgrades.

Post-Audit Interviews and Surveys—Candidates for Post-Audit Surveys and Interviews were drawn from the same sample population—those homeowners who had completed a home energy audit and had been sent the final audit report (referred to as a ‘finalized’ audit). Post-Audit Interviews were typically completed within one month after the audit, while Post-Audit Surveys were conducted from 1 to 6 months after the audit, with a mean of 4 months. Post-audit questions focused on how the homeowner experienced and perceived the audit visit, the auditor, and the score and report they received. The surveys and interviews also asked questions about what upgrades they had completed and intended to complete.

23 Originally, the SCL Home Energy Audit Program hoped to gain 5,000 participants. As it turned out, the number of participants within the study period was far fewer.

24 For a period of time, due to a programming glitch, participants signing up for the home energy audit on the EPS web portal were not asked to enter their phone number. This issue was detected and corrected, and many phone numbers were located via a reverse look-up process at the PSU Survey Research Lab.

25 The time between signup and audit completion was typically quite short, except for a period at the beginning of the program where a large backlog was experienced due to unexpected early demand for audits.


Retrofit Surveys—Candidates for Retrofit Surveys were drawn from those homeowners who were at least three months out from their audit and who had not completed a Pre-Audit or Post-Audit Interview. Homeowners who had completed a Post-Audit Survey were eligible for the Retrofit Survey after two months had passed between surveys, to allow time to complete additional retrofits. Retrofit Surveys were completed as late as was feasible within the constraints of the research project, ranging from 3 to 12 months after the audit, with a mean of 8 months. Homeowners may, of course, have completed retrofits after they took the Retrofit Survey; its results are intended as a snapshot in time, and many homeowners indicated retrofits they definitely planned to undertake in the future (for example, when the weather was suitable), as well as ones they might do. Retrofit Survey questions focused on homeowner information seeking and decision-making after the audit, what upgrades were completed or not completed, and their experience with these completed upgrades (benefits, downsides, etc.).

Post-Retrofit Technical Assessments
Finally, a no-cost “test-out” Post-Retrofit Assessment of the house was offered to homeowners who completed the Retrofit Survey and who reported having completed at least one of the upgrades recommended in their audit. Those respondents who agreed to be contacted about this assessment were roughly screened based on the energy upgrades they reported in the survey; those reporting more upgrade work were preferred. The Home Performance Collaborative, a group of home energy auditors in Seattle, completed these assessments. To prevent conflicts of interest and assure independence, each assessment was completed by a different auditor than the one who had conducted the original audit on the home. A total of 50 Post-Retrofit Assessments were completed, providing data on the extent and quality of the retrofits completed, as well as generating an updated EPS Scorecard to reflect the work that was completed.

Energy Use Behavior Surveys
In addition to the phone surveys, we also fielded an Energy Use Behavior Survey, intended to capture basic operational factors to complement the asset characteristics gathered during the home energy audit. The data from these surveys were used in modeling (Chapter 6) to assess the effect of incorporating operational data along with asset data in home energy audit models, asking, for example, about thermostat management practices as well as about unusual energy equipment. Some do-it-yourself audit tools, such as the Home Energy Saver, allow users to incorporate energy behavior data, but professional home energy audits rarely collect or use such data. Energy Use Behavior Surveys were delivered in both paper and web-based (email invitation) formats. All households that received an audit were targeted for these surveys. Home energy auditors delivered the paper-based surveys at the point of the audit or at the post-retrofit audit, while those who had completed their audit prior to the point where we began to distribute paper surveys were instead sent an email invitation to complete the web-based survey. Email addresses were only available for a portion of audit recipients, limiting the number of surveys distributed. A total of 346 Energy Use Behavior Surveys were received, 244 of them web-based. Of the 102 paper surveys, 17 were not labeled with the necessary identifier and could not be associated with the other collected data.


Survey responses were converted into basic operational model inputs for a subset of homes, and were also compared to asset modeling assumptions for the EPS and the Home Energy Scoring tool.
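To illustrate the kind of conversion involved, the sketch below maps a few survey answers onto operational model inputs, falling back to standardized assumptions when an answer is missing. The field names, default values, and five-degree night setback are hypothetical and are not taken from the EPS, Home Energy Scoring Tool, or Home Energy Saver documentation.

```python
# Minimal sketch of converting Energy Use Behavior Survey answers into
# operational model inputs. Field names and defaults are hypothetical.

ASSET_DEFAULTS = {
    "heating_setpoint_f": 68,   # standardized assumption an asset model might use
    "night_setpoint_f": 68,     # i.e., no setback assumed
    "occupants": 3,
}

def survey_to_operational_inputs(survey):
    """Map raw survey answers onto model inputs, falling back to asset defaults."""
    inputs = dict(ASSET_DEFAULTS)
    if survey.get("winter_thermostat_f") is not None:
        inputs["heating_setpoint_f"] = survey["winter_thermostat_f"]
    if survey.get("uses_night_setback"):
        inputs["night_setpoint_f"] = survey.get(
            "night_thermostat_f", inputs["heating_setpoint_f"] - 5)
    if survey.get("household_size"):
        inputs["occupants"] = survey["household_size"]
    return inputs

# Example: a household reporting 70 F by day with a night setback to 62 F
print(survey_to_operational_inputs({"winter_thermostat_f": 70,
                                    "uses_night_setback": True,
                                    "night_thermostat_f": 62,
                                    "household_size": 2}))
```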

Natural Gas and Electric Utility Usage and Billing
Natural gas and electricity usage and billing data were also obtained for a portion of the audit recipients. We worked directly with SCL, the electricity provider, and Puget Sound Energy, the natural gas utility, to obtain this data. We mailed utility-provided release forms to audit recipients who used natural gas. In total, complete utility usage data for at least one year prior to the audit was received for 120 all-electric homes and for 131 homes using both natural gas and electricity. Homes with incomplete or discontinuous data were screened from this sample. This excluded a fair number of homes that used fuel oil as a primary heating source—17% of audited houses.
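The completeness screen described here can be illustrated with a sketch like the following; the billing-record format, the 365-day window, and the three-day gap tolerance are assumptions for illustration rather than the project’s actual screening criteria.

```python
from datetime import date, timedelta

# Hypothetical billing record: (period_start, period_end, usage), end exclusive.
def has_complete_pre_audit_year(bills, audit_date, max_gap_days=3):
    """True if billing periods cover the 365 days before the audit with no
    gap longer than max_gap_days between consecutive periods."""
    window_start = audit_date - timedelta(days=365)
    periods = sorted((s, e) for s, e, _ in bills if e > window_start and s < audit_date)
    if not periods or periods[0][0] > window_start + timedelta(days=max_gap_days):
        return False
    covered_to = periods[0][1]
    for start, end in periods[1:]:
        if (start - covered_to).days > max_gap_days:
            return False
        covered_to = max(covered_to, end)
    return covered_to >= audit_date - timedelta(days=max_gap_days)

# Example: thirteen 30-day bills covering the year before a June 2011 audit
bills = [(date(2010, 6, 1) + timedelta(days=30 * i),
          date(2010, 6, 1) + timedelta(days=30 * (i + 1)), 600) for i in range(13)]
print(has_complete_pre_audit_year(bills, date(2011, 6, 1)))  # True
```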

We were able to obtain utility data for up to three years prior to the audit date for many households. Due to a relatively compressed project schedule, for most households who undertook retrofits, no more than six months of post-retrofit utility data were available at the time utility data was collected. This was an insufficient post-retrofit period of time to statistically assess savings.26 Such data could potentially be gathered in the future to enable assessment of the degree and persistence of savings achieved for the homes that received audits and completed upgrades.

In addition, SCL also provided anonymous electricity usage data for a set of more than 1,900 single-family homes in their service territory. As we did not have other house or household characteristics for these homes, we were unable to directly compare or match these homes to our sample homes. However, this data did allow us to characterize year-to-year variability in electricity usage within single-family Seattle homes over the period of interest, as described above.
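A minimal sketch of one way to characterize that year-to-year variability is shown below; the data layout and the summary statistics chosen are our assumptions, not a description of the analysis actually performed.

```python
import statistics

# Sketch of characterizing year-to-year variability in annual electricity use.
# annual_kwh maps an anonymous home ID to consecutive annual totals (layout assumed).
def year_over_year_changes(annual_kwh):
    """Percent change between consecutive years, pooled across homes."""
    changes = []
    for totals in annual_kwh.values():
        for prev, curr in zip(totals, totals[1:]):
            if prev > 0:
                changes.append(100.0 * (curr - prev) / prev)
    return changes

sample = {"home_a": [9800, 10400, 9500], "home_b": [12100, 11800, 12600]}
changes = year_over_year_changes(sample)
print(f"median change {statistics.median(changes):+.1f}%, "
      f"stdev {statistics.stdev(changes):.1f}%")
```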

Analyses of Homeowner and House Data
The various streams of homeowner and house data were then drawn together in different combinations to inform analyses towards particular research objectives. These analyses are described in detail in subsequent chapters; a brief mapping of how the data are used in these analyses is presented in Table 7.

26 This is due to the normal level of variability of energy bills, seasonality, the fact that any household’s retrofit activity often took place over several months, limited sample size especially for households with natural gas data, and the fact that we were not able to obtain natural gas data for a comparison group.


Table 7: Mapping of house and homeowner data streams to completed analyses

Analysis (where presented): data streams utilized in the analysis

Homeowner decision-making (Chapter 4): phone interviews and surveys; audit reports
Home energy modeling tool comparison (Chapter 6): technical measurements of the house (audit data); audit reports; Energy Use Behavior Surveys; utility billing and usage data
Completed upgrades (Chapter 4): phone surveys; audit reports
Analysis of quality of completed upgrades (Chapter 5): audit data; inspection of completed upgrades
Audit technical measurement input variability (Chapter 5): technical measurements of the house (audit data); re-measurement of technical house data after upgrades were completed

2.5 General Strengths and Limitations of the Research
The strengths and limitations of this research are detailed throughout the report and noted alongside conclusions where appropriate. The considerations with the greatest general relevance for the report’s findings are reviewed here:

Access to Various Points of the Process
A particular strength of this research was our ability to talk to homeowners at various points in the home energy audit process. We talked to homeowners just before their audit, soon after their audit, and further after their audit, after upgrades were performed. In some cases, we were able to talk with homeowners at two points in this process, to gather true longitudinal results. This enabled a richer view of how homeowners navigated their home energy audits than would otherwise be possible. Still, our data at the beginning and end of the process are limited. Earlier data would help us better explore how homeowners decide to get the audit and what differentiates them from non-participants, while later post-audit data would help us better understand whether audits trigger upgrades for years into the future, or whether any effects are short-lived. Long-term utility usage data would also help us to determine what energy savings were associated with completed upgrades.

Multiple Perspectives
Another strength of this research was our ability to bring together multiple perspectives on home energy audits and participating homeowners. In particular, we combined technical data on the house and on actual energy use with analysis of home energy audit modeling tools and tool-generated guidance; auditor perspectives on the business, on homeowners, and on the tools; and surveys and in-depth interviews with homeowners covering the various stages of the home energy audit and retrofit process as well as their energy use behaviors. Interviews with real estate professionals helped distinguish the home purchase transaction and its perspectives from those of existing homeowners not looking to buy or sell. By considering these various streams of data, we were able to gain a broad perspective on home energy audits, tools, and homeowners.


Program Context
The SCL Home Energy Audit Program was limited to one geographic area, and to the households who chose to participate. Because the research was designed to take advantage of a program that had already been conceived and planned, it was difficult to add experimental design elements, and because we were accessing utility customers, there were limitations to how much we could ask of participants.

Community Power Works
The Community Power Works program was launched in the SCL service area in April 2011. This program is related to but distinct from the SCL Home Energy Audit Program that was the subject of our study, but the Community Power Works territory included approximately 20% of audit recipients. Due to the City of Seattle’s concern over conflicting messages to potential participants in the Community Power Works program, all contacts who had completed SCL home energy audits but who resided in Community Power Works territory were removed from the project phone survey and interview call lists as of April 28, 2011. Therefore, the sample included in the various surveys and interviews does not include homes in the Community Power Works territory.

Self-Selection and Comparison to Non-Participants
As mentioned before, program participation was voluntary and only a small proportion (approximately 1%) of the pool of potential Seattle single-family homeowners participated. Much of the data collected was necessarily voluntary, and while we made every effort to maximize response rates to surveys, homeowners who completed surveys and interviews were not necessarily representative of all participants in the SCL Home Energy Audit Program. Furthermore, the 99% of Seattle households not participating in the SCL Home Energy Audit Program may be quite unlike the study participants. Survey respondents were demographically quite distinct from the Seattle population as a whole. The research was not designed to address the non-participant group. We were, however, able to obtain electricity billing data for a comparison group of households who had not undertaken the SCL home energy audit.

Processing of Survey and Interview Data for Analysis
Survey and interview responses are in many cases qualitative in nature, depending on the structure of the question asked and the responses received. In some cases, open-ended survey and interview responses were coded into categories to allow categorical analysis, and this coding is a subjective process.

Processing of Technical Data for Analysis
We used technical data on participant houses, sometimes in combination with household-reported occupancy, energy use behaviors, and historical weather conditions in Seattle, to model houses and households in two home energy modeling tools (Home Energy Scoring Tool and Home Energy Saver Pro) in addition to the EPS Auditor tool used for the SCL home energy audits. Conversion and interpretation of house measurements and household-reported energy use behaviors into technical model inputs was required for these analyses, bringing in a variety of assumptions as well as sources of error. Additionally, software emulators for the two tools were developed to allow model runs with Seattle weather data.

Similarly, processing was required to translate historical Seattle weather data into year-long periods in a format appropriate for the modeling tools, and was also required to generate full-year calculated utility usage from billing data, for both gas and electric usage data. These steps also require approximations and interpolation, and have the potential to generate error.
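As an illustration of the billing-data step, the sketch below prorates billing periods onto a calendar year by assuming uniform daily usage within each period; the record format and the proration rule are assumptions and do not necessarily reflect the project’s actual processing.

```python
from datetime import date

# Sketch of prorating billing periods onto a calendar year, assuming uniform
# daily usage within each period. Record format: (start, end, usage), end exclusive.
def calendar_year_usage(bills, year):
    """Sum the usage attributable to `year`, prorating straddling periods."""
    y_start, y_end = date(year, 1, 1), date(year + 1, 1, 1)
    total = 0.0
    for start, end, usage in bills:
        days = (end - start).days
        overlap = (min(end, y_end) - max(start, y_start)).days
        if days > 0 and overlap > 0:
            total += usage * overlap / days
    return total

bills = [(date(2010, 12, 15), date(2011, 1, 15), 620),   # straddles the year boundary
         (date(2011, 1, 15), date(2011, 2, 15), 580)]
print(round(calendar_year_usage(bills, 2011)))  # 2011 share of each bill, ~860
```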


3. Industry Actors’ Opinions and Perceptions—Auditors & Realtors
We interviewed two stakeholder groups—home energy performance auditors and real estate professionals. Interviews with auditors who conduct audits for the Seattle City Light (SCL) Home Energy Audit Program were designed to gather these respondents’ perspectives on the overall audit process, including their interactions with homeowners and the Energy Performance Score (EPS) asset rating. Because of their central role in the audit process and their interaction with a wide range of homeowners, auditor interviews provide insight into the motivations that lead homeowners to seek an audit and how homeowners react to the audit and its products. Through interviews with auditors, we sought to explore their influence, as energy-efficiency experts, on homeowners’ decision-making processes regarding recommended energy upgrades.

The second stakeholder group comprised real estate professionals (REPs). Interviews with REPs were designed to provide insight into these respondents’ perspectives on home energy labeling in general, and how energy issues and, more prospectively, energy labels factor into home-buying and home-selling processes. Discussion of the findings from interviews with real estate professionals follows the sections on auditors. Detailed descriptions of the methods for conducting the interviews can be found in Chapter 2. Interview guides and presentations on findings by stakeholder type can be found in Appendices D-G.

3.1 Home Energy Auditors from the Seattle City Light Program
In January 2011, we completed in-depth interviews with 12 auditors who conduct EPS audits through SCL’s Home Energy Audit Program. The 12 interviewed auditors represent 11 individual companies. The research team sought to contact the auditors who had completed the largest number of audits through the program. Together, the interviewed auditors had completed 464 audits, 74% of the program audits completed at the time the interviews took place. The research team also reviewed 12 audit reports to add further context to interview findings.

Summary of Findings
The following is a high-level summary of findings derived from in-depth interviews with auditors. Below these summary findings, we present a more detailed discussion of the topics we explored with this sample of active auditors.

Key findings from auditor interviews include:

 Auditors appear to be a committed group of professionals who seek to make a difference by helping homeowners improve the energy efficiency and comfort of their homes. They do this by customizing the audit report based on their expertise and their knowledge of the house and its current residents.

 Auditors view homeowners as their customers and seek to provide a product that meets the needs of each customer. Auditors take their own perception of the homeowner’s concerns, needs, and abilities into account when conducting the audit and presenting their findings. Auditors may alter the way they present findings based on the homeowner’s use of space and planned upgrades, the homeowner’s propensity to do the work themselves, and the homeowner’s apparent willingness and ability to take on large retrofits. In addition, auditors may prioritize recommendations based on their own perception of whether cost savings, lower environmental impact, or something else is most important.

 Auditors perceive customers as motivated to make improvements but constrained by the cost of upgrades and limited knowledge. Auditors see tax credits, incentives, and low-interest financing as tools that can help homeowners reduce costs, and in our interviews auditors noted the importance of presenting retrofit opportunities as investments that will yield an attractive return. In addition, auditors perceive that homeowners lack knowledge of both energy savings opportunities and the specific steps necessary to take advantage of those opportunities.

 While auditors view the asset rating and the EPS Scorecard as useful in providing comparisons to the average Seattle home, they feel homeowners are most interested in concrete recommendations to reduce their energy use. Auditors reported explaining the asset rating to customers, but stated that the specific upgrade recommendations are the most valuable part of the report for homeowners. Auditors stated that the types of comparisons an asset rating allows might be more useful if the ratings were more common, giving homeowners more context for their ratings, or in cases in which customers are shopping for a new home.

 Audits can be a sales tool for retrofit services. Six of the eleven companies represented in the interviews offer retrofit services as well as audits. Respondents from these companies said that they remain impartial when interacting with customers—by only wearing their “auditor hat” during audits. However, they reported that the knowledge they gain about an audit customer’s home, the relationship they build with the customer during the audit process, and the knowledge the customer gains about needed retrofits provide the contracting firms that employ the auditors with an advantage in selling retrofit services.

Detailed Findings
In the following sections of the report we present findings from our discussions with a sample of active auditors.

The Auditors
The interviewed auditors had from one to three years of experience conducting audits. They had completed between 8 and 86 audits through SCL’s Home Energy Audit Program, and 9 of the 12 respondents reported that these made up the majority of the audits they conducted in 2010. Including audits conducted outside of the SCL program, the interviewed auditors reported completing between 25 and 150 total audits in 2010.

Six of the 11 companies our auditor respondents represent do contracting work in addition to providing audits. Respondents from contracting companies stressed that they tried to be impartial when conducting audits and presenting homeowners with audit findings. In an illustrative quote, one auditor described his interaction with audit customers:

“We try to wear separate hats, [we are] just auditors when doing that work, and contractors when doing that part of it.”


Further elaborating on their efforts to remain impartial in their roles as auditors, respondents from two companies that also offer contracting services reported that they recommended customers seek multiple bids. SCL encourages auditors to take this approach.

However, auditors who work with contracting companies reported that the audit process provides them with an advantage in selling retrofit services. According to respondents, the audit raises customer awareness of a need for the type of services the auditor’s firm provides. In addition, the auditor’s interaction with a customer during the audit process begins to build a business relationship between the customer and the auditor’s firm. Ultimately, the auditor can position himself as an expert with unique information about the customer’s home, and the ability, through his firm’s contracting work, to provide solutions to the problems identified. The following quotation exemplifies the balance between the audit as a client-centered information tool and a contractor sales tool:

“I really make it clear they are in no way tied to us, which tends to engender confidence and trust. I am completely sincere in saying that, but I can’t remember any case where it didn’t turn out in my favor.”

Auditors’ Perceptions of Customers
Consistent with findings from homeowner interviews and surveys (Chapter 4), auditors reported that their SCL audit customers are already particularly motivated to make energy efficiency improvements. According to one auditor,

“Out of all the houses here in Seattle that are eligible to do the audit, we are skimming the cream off the top. The folks that are jumping into it now are a select group. They have already got that notion and motivation to do something.”

Auditors cited a desire to increase comfort, reduce environmental impact, and reduce energy costs as the primary factors motivating customers to seek an energy audit. While most auditors did not prioritize motivating factors, two specified that comfort is the primary motivator for customers. One auditor noted that on-bill comparisons between a customer’s energy use and that of their neighbors (provided by SCL) also motivate customers to pursue an audit.

Interview findings also suggest that the cost-sharing structure of SCL’s Home Energy Audit Program may deter participants who are not motivated to make improvements; this was a conscious decision in SCL program planning. Customers pay $95 toward a comprehensive audit, while the program provides an additional subsidy of $305 toward the cost of the audit. Because participants must bear part of the cost of the audit, auditor respondents stated that these SCL program participants may have been more motivated to make energy efficiency improvements. In contrast, respondents stated that participants in fully subsidized audit programs “are just wanting the free light bulbs and want some reinforcement for what they are thinking about their house.”

While auditors noted that the requirement that customers pay $95 toward the audit may deter those less motivated to make significant retrofits, they also emphasized the importance of the subsidy the program provides. According to one auditor, “I think most of the people, when we leave, understand the value of what we do. But I have a much harder time convincing people the value over the phone when I am saying it would cost them $400 out of pocket versus the $95.” Another auditor stated more bluntly, “would a homeowner pay $400 for an energy audit? Of course not.”

Auditors’ Views on Difficulties Homeowners Face in Undertaking Upgrades
Although they described customers as motivated to make energy efficiency improvements, most auditors cited two main difficulties that prevent customers from moving forward with retrofits: the cost of upgrades (10 mentions out of 12 auditors) and knowledge of how to proceed (7 mentions).27 Auditors described a variety of strategies they had used to help customers defray costs. Seven respondents cited tax credits and incentives as effective motivators, particularly large federal tax credits for energy efficiency upgrades. Some auditors reported including information about available incentives and financing programs of interest to particular customers in the “General Notes” field provided in the audit report. Auditors also cited low-interest financing (5 mentions) as a tool to overcome customers’ financial constraints, and emphasized the importance of presenting efficiency upgrades as investments. One auditor elaborated on this last suggestion, saying:

“People are motivated by what happens in their wallet, and once they can see clearly that something doesn’t cost, it pays, then most of the battle is won.”

In addition to upgrade costs, auditors stated that customers lack knowledge of the opportunities that exist in their homes for energy improvements and how to go about making those improvements. Auditor interviews suggest that the comprehensive EPS Energy Analysis Report, which is intended to provide a “road map” for customers to follow as they make improvements, helps to address this knowledge barrier. As we will discuss further below, auditors took steps to use the flexibility inherent in the program design to customize the EPS Energy Analysis Reports in ways that they thought better met their customers’ needs—and were encouraged to do so by SCL. In part, through this customization, auditors sought to provide their customers with the information they needed to make improvements. Auditors are requested by SCL to offer further explanation as to why particular retrofits should, or should not, be done and may prioritize their recommendations. Nonetheless, two auditors suggested that customers might benefit from additional resources designed to connect them with contractors.

Auditor-Homeowner Interactions
While the audit report is one channel by which customers receive audit results, auditor interview findings suggest that homeowners’ interaction with the auditor during the audit process also plays an important role in communicating audit findings and recommendations. The “show and tell” process that takes place as customers show auditors around their homes calls attention to any concerns or specific problems the customer is experiencing, which the auditor will likely seek to address through the audit process. In addition, the types of tests auditors conducted during the audit process made deficiencies in the home’s shell and systems visible and provided clear evidence of improvement opportunities.

27 We use the term “barrier” to talk about the path from receiving an upgrade recommendation to acting on that upgrade recommendation, without implying that the homeowner ideally should undertake any recommended upgrade. Our analysis is intended to address what homeowners do and don’t do and why, without assuming that they should be doing something other than what they have done.

Figure 5: Blower door test photo

According to one auditor: “The blower door test is probably the one thing that motivates homeowners the most—it is not just a report, but they ‘see’ what is going on.”

Nonetheless, the interviewed auditors varied somewhat in their approach to communication with customers during the audit process. Half (6 of 12) stated that they encouraged customers to follow along for the elements that most clearly demonstrated potential improvements: the blower door test and infrared photography. A minority of the interviewed auditors (4 of 12) reported that they actively encouraged customers to follow along with them for the entire audit process.

The in-home portion of the EPS audits can take three to four hours to complete, and may require the auditor to access cramped and dirty areas like attics and crawl spaces. Interview findings suggest that some customers lose interest in following along for the full process. According to one auditor, “It becomes so boring that they drop off and resume when the blower door happens.”

Auditors varied in their opinions of the benefits and drawbacks, for their own work, of having customers follow along during the audit process. Four respondents primarily cited benefits of customers following along, three primarily cited drawbacks, and the remaining auditors cited both. The benefits and drawbacks they cited underscore the importance of striking a balance between engaging the customer and staying focused on the audit itself.

According to the interviewed auditors, the benefits of customers following along primarily resulted from the opportunity to provide visual and verbal information in addition to the written audit report. Respondents noted that this information could ultimately improve the customer’s understanding of their audit report. In addition, three auditors said that engaged customers are more likely to take action. An auditor summed up the advantage of customer participation this way:

“I feel like I have a better relationship with my customer, giving them information that is not easily communicated in the report.”

The drawbacks auditors cited regarding homeowner participation focused primarily on the time that interaction with the homeowner could add to the audit process. Two-thirds of the interviewed auditors noted that customer participation can be time consuming. According to one respondent, “It’s the number one efficiency killer for an energy auditor…I will easily spend twice as much time in a house if I am having a conversation with someone while I am doing the inspection.” Other auditors noted that interaction with homeowners can distract them from their focus on the home and limit their ability to fully consider all of their findings before making recommendations. One auditor stated that, in combination, these factors have the potential to hurt his credibility with the customer if later findings force him to contradict something he had told the customer previously.

Audit Products
EPS audits generate two primary products for the homeowner: the EPS Scorecard (see Appendix A) and the detailed audit report (see Appendix B). The following sections examine auditors’ views of each of these products.

Auditor Perspective of EPS Scorecard
As noted in Chapter 1, the EPS Scorecard presents a standardized, asset-based rating of a home’s energy use, along with comparisons of the rated home to the Seattle average and to average household consumption levels that would help meet Seattle’s long-term environmental goals.

Auditors reported using the analogy of a car’s miles-per-gallon rating to explain the EPS rating and scorecard to homeowners. The interviewed auditors cited the EPS Scorecard’s potential to serve as a basis for comparison between homes as its primary benefit. Three auditors stated that the scorecard raises awareness of energy use in homes and calls attention to the potential to reduce energy use. One auditor elaborated that the scorecard allows homeowners not only to compare their home’s energy use to that of an average Seattle home, but also to gain insight on their own energy use behaviors by comparing their actual energy use to the modeled estimates.

While auditors stated that the EPS Scorecard facilitates comparison between homes, some questioned the utility of these comparisons to their customers. Five auditors reported that the scorecard would be more valuable if asset scores were more common, stating that increased recognition of the score and a wider range of rated homes for customers to compare would increase the score’s utility. In addition, three auditors noted that the type of comparison between homes that the EPS score allows would be most useful for homebuyers who are comparing multiple homes. One auditor illustrated this distinction, saying,

“You can evaluate potential homes based not only on whether they have granite countertops…but how much your utilities are going to cost.”

However, this auditor and another noted that audit customers are typically more interested in identifying upgrades that will decrease energy use and increase comfort than in comparing their homes to others. The quoted auditor continued, “I think that’s great, but potential homebuyers are not our customers.”

In addition to serving as a basis for comparison between homes, three auditors cited the visual presentation of the home’s energy use as a benefit the scorecard provides. According to these respondents, this visual presentation effectively supports the rest of the audit report.

Customer Understanding of the EPS Asset Rating
Interview findings suggest that auditors varied somewhat in the detail with which they explained the distinction between an asset rating and an operational assessment (please see Chapter 1 for a discussion of the differences between the two). However, almost all of the interviewed auditors informed their audit customers that the EPS rating is based on average energy use behaviors rather than a customer’s actual use. Respondents stated that failing to make this distinction could be misleading to customers. For example, auditors noted, customers whose homes appear efficient from an asset perspective might forgo efficiency improvements despite significant opportunities arising from their individual energy-using behaviors. In addition, auditors noted that it is important to explain to customers that, because of their individual energy use habits, the savings they actually achieve from an upgrade might differ from the modeled savings estimated in the audit report.

A majority of the interviewed auditors (7 of 11 responding) reported that their customers appeared to understand asset ratings when the auditors explained them. Respondents who stated that customers did not understand the asset rating primarily explained that most of their customers were not interested in the ratings, focusing instead on specific recommendations for improvement. Nonetheless, some auditors noted that a minority of customers, particularly those with a background in energy or environmental issues, took a stronger interest in the details of the rating. Two auditors reported that this closer examination could cause some customers to dismiss the EPS score, particularly if they disagree with a poor rating.

Auditors’ Views of the Audit Report
Overall, auditors used the EPS Energy Analysis Report both as an asset (model-based) tool and as an educational tool. As an asset tool, the audit report informs and educates customers about their homes and, as one auditor said, “serves as a catalyst for discussion.” Auditors also used the report as an educational tool for providing customer-specific recommendations based on their understanding of the home and its occupants. Interview findings suggest that auditors value both the asset-based and the personalized educational elements possible with the report. Respondents cited specific upgrade recommendations, savings estimates, and cost estimates as valuable elements of the audit report, but suggested that, at times, the asset-based elements of the report did not provide the specific detail customers seek.

Recommendations for Specific Retrofits
A majority of the interviewed auditors (7 of 12) cited specific recommendations as the most valuable aspect of the EPS audit report for homeowners. Auditors stated that the EPS report does a good job of providing a clear path to follow as homeowners pursue retrofits. According to one auditor, the report “does a fairly good job of creating a call to action,” more so than other types of audits with which this auditor had experience. Nonetheless, as we will discuss further below, auditors personalized their retrofit recommendations beyond the asset-based recommendations the EPS report generates. SCL specifically designed the Home Energy Audit Program to provide the opportunity for such personalization and expects auditors to use this functionality. An example of such customization is how auditors often provide prioritized lists of recommendations to customers or advice on addressing specific issues of concern to the customer.

Savings Estimates
While auditors were critical of the accuracy of the EPS report’s savings estimates, respondents also noted that even imprecise estimates can be useful in discussions with homeowners. Four of the interviewed auditors questioned the accuracy of the savings estimates included in the EPS report or their applicability given the variation in energy use resulting from occupant behavior. According to one auditor, “it’s completely inaccurate, you can make recommendations to change things and it doesn’t change the score, [the savings estimates are] useless,” though it is possible that this auditor was not using the EPS Auditor tool correctly. Another auditor said, “I think people want something, and I can give them something without those calculations.” This auditor noted that the report’s savings calculations are “imprecise,” an assertion that our review of audit reports supports. Figure 6 shows a savings estimate from an EPS audit report, which, in addition to estimating savings across a very wide range, seems to suggest that the customer could completely eliminate their heating energy use.

Typically, the EPS Energy Analysis Report provides a single number for the estimated fuel use after upgrades and for the approximate annual savings in dollars. However, SCL requested that the report instead provide ranges for estimated fuel costs after upgrades and for the approximate annual dollar savings. This range spanned approximately 30% above and below the single figure calculated by EPS Auditor, and sometimes led to a low-end estimate of zero for fuel costs and savings.
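A simple sketch of how such a presentation range might be computed from a single point estimate is shown below; the 30% spread follows the text, but the rounding to whole dollars and the function name are illustrative assumptions.

```python
# Sketch of presenting a +/-30% range around a single modeled estimate.
# The spread follows the report text; rounding to whole dollars is an assumption.
def presentation_range(point_estimate, spread=0.30):
    low = round(point_estimate * (1 - spread))
    high = round(point_estimate * (1 + spread))
    return low, high

low, high = presentation_range(300)   # hypothetical $300/yr savings estimate
print(f"approximate annual savings: ${low}-${high}")   # $210-$390
```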


Figure 6: Example of EPS savings estimates

Auditors suggested that breaking apart some of the energy use categories presented with the EPS Scorecard might allow customers to see a more direct connection between recommended upgrades and changes in the EPS score. For example, ‘heating’ encompasses savings from air sealing, insulation, and equipment upgrades. Auditors would like to see savings estimates from each of these potential improvements presented with the EPS score, in addition to their current listing under the “Summary of Recommended Upgrades” section of the audit report.

Despite these criticisms, three auditors noted that even rough savings estimates can facilitate discussions with customers on which measures offer the most attractive payback periods, helping to prioritize upgrades and potentially motivating customers to undertake retrofits.

Cost Estimates
Auditors generally considered the estimates of retrofit costs provided in the audit report more valuable to customers than the savings estimates, but expressed a desire for more personalized estimates than the EPS report provides. Auditors stated that customers are interested in cost estimates for the recommended upgrades, and noted generally that cost estimates are valuable to include in audit reports. However, auditors questioned the quality of the estimates the EPS report provides. Two respondents questioned the accuracy of the cost estimates, while two others noted that the estimates can list such a wide range of costs that they provide little value to customers. According to one auditor:

“[Cost estimates are] good stuff, it’s good to have that in there, but the rates are so wild…in other words, there is no information there.”

Figure 7, taken from our review of audit reports, illustrates the type of wide-ranging cost estimates to which these auditors referred.


Figure 7: Example of costs reported for recommended energy upgrades from the EPS Report

As noted, auditors nonetheless saw value in including cost estimates in audit reports, and despite their criticism of the savings calculations and cost estimates in the audit report, four respondents reported discussing payback calculations with customers. These auditors used payback periods in an effort to present the energy upgrades as investments that will pay back over time, and also used payback as a way to prioritize upgrades.
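Simple payback of this kind is just installed cost divided by estimated annual bill savings. The sketch below, using entirely hypothetical measure costs and savings, shows how such a calculation could be used to rank recommended upgrades.

```python
# Hypothetical measures: (name, installed cost in dollars, estimated annual savings in dollars).
measures = [
    ("Attic insulation", 1800, 220),
    ("Air sealing", 600, 120),
    ("Heat pump upgrade", 7500, 450),
]

# Simple payback = cost / annual savings; shorter paybacks are listed first.
for name, cost, savings in sorted(measures, key=lambda m: m[1] / m[2]):
    print(f"{name}: {cost / savings:.1f} years simple payback")
```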

Homeowner Follow-Up
Auditors’ experience with customer contact following the audit was consistent with their assessment that customers seek clear information on the steps they must take to complete retrofits and that the EPS audit report is effective in providing this information. None of the interviewed auditors reported that more than 60% of their customers had followed up with them for any reason after receiving the audit report, and most (7 of 12) reported that less than one-third followed up.

No single element of the report or EPS Scorecard stood out as a frequent subject of follow-up calls from audit customers. However, the interviewed auditors reported receiving follow-up calls on a variety of topics related to carrying out retrofits (highlighted in Table 8), including requests for contractor recommendations, advice on how homeowners could carry out retrofits themselves, and clarification when a contractor suggests a solution different from the one recommended in the audit report. Auditors reported that responding to customers’ follow-up questions typically takes little time; the majority of the auditors reported that responding to a request typically takes 10 minutes or less.

Table 8: Typical homeowner-initiated post-audit call-back topics

Topic: number of reports
Contractor recommendations: 5
Do-it-yourself advice: 5
Clarification of specific findings: 4
Requests for additional photos: 3
Contractor recommendations inconsistent with report: 2
Difficulty accessing report online: 2
Requests for information regarding rebates: 1
Auditor receives little or no follow-up from customers: 2


Three interviewed auditors suggested that proactively following up with customers, rather than waiting for the customer to contact them with a request, might increase the likelihood that customers will undertake retrofits. According to one of these respondents:

“A lot of people are just busy and they don’t do the work because it falls through the cracks. If you give them that little nudge, they will pick up the ball again.”

While in some cases, these auditors stated that they had made efforts to contact homeowners for follow-up, they noted that doing so can be time-consuming, and, in order to succeed in their auditing business, they must focus on paid work. In addition, one respondent noted that following up with customers in this way would require a level of organization and contact management beyond what he currently has.

Customization of EPS Energy Analysis Reports
As noted in Chapter 1, a purely asset-based audit and rating is not designed to take into account the specific behaviors of the home’s occupants. Instead, asset audits focus on the physical characteristics of the home and the equipment present and generate ratings based on standardized values for inputs that occupant behavior would impact. Interview findings suggest that auditors’ recommendations were largely consistent with this asset focus in that they primarily addressed improvements to the home itself rather than operational changes. However, interviews also suggest that auditors liked the opportunity provided by the SCL program and the EPS Auditor software to tailor the recommendations they offered customers, the priority they gave their recommendations, and the way they presented audit findings based on characteristics of their customers (the use of “General Notes” fields for this purpose is discussed below).

Auditors felt they could increase the value of audit findings to the individual customer beyond that of the uniform report that the software generates by presenting customized recommendations that draw on their own expertise and their understanding of the customer’s needs. SCL encouraged auditors to perform this type of customization. The following sections examine the factors auditors consider in customizing findings and how auditors present customized findings to their customers.

Factors Auditors Consider in Customizing Recommendations
Auditors sought to customize audit reports to address what they thought their customers wanted to know, over and above the standard in-depth report, and thereby more effectively meet their customers’ needs. However, interview findings suggest that auditors identified customer needs in a variety of ways. Four of the interviewed auditors said they collected information from customers in order to focus the audit and tailor recommendations to the customer’s needs. These auditors considered any specific concerns the customer had, the customer’s plans for the house, and the types of upgrades the customer was interested in undertaking.

Table 9 illustrates how auditors suggested that these considerations might impact the audit findings they present to customers.


Table 9: Factors considered and their impact on recommendations

Factor: impact on recommendations

Customer use of space and planned upgrades: areas recommended for insulation; need for mechanical ventilation
Customer propensity to do upgrade work themselves: focus on easier, less expensive projects; specification of tasks that require specialized skills
Customer willingness and ability to take on large retrofits: emphasis on low-cost, fast-payback measures; large retrofits presented as a “long-term strategic plan” or “road map”

In most cases, auditors reported that these types of considerations affected the way they presented recommendations to their customers or the upgrades they prioritized, but did not influence which upgrades the auditors recommended. For example, while auditors stated that they might not emphasize high-cost upgrades to customers who seemed unable or unwilling to pay for them, respondents stressed that they nonetheless included those upgrades in the audit report for the customer’s reference.

Auditors’ considerations of the way customers use space and their planned upgrades differ somewhat from considerations of customers’ propensity to do the work themselves or take on large retrofits in that they may influence the types of upgrades the auditor recommends. For example, two auditors reported considering how customers use their basements and their plans for the space in determining whether to treat the basement as conditioned space and ultimately where to focus air sealing and insulation efforts. One auditor also reported that he might be more likely to recommend mechanical ventilation to homeowners with large numbers of pets or gas appliances in their living space.

Delivery of Customized Findings
Auditors delivered customized recommendations to customers in two primary ways: through discussion with homeowners during the audit visit and in the “General Notes” field of the audit report. Auditors typically discussed their findings with customers during the audit visit. Respondents noted that these conversations provided them with an opportunity to clarify their findings with homeowners by demonstrating things like what a rim joist is. One auditor described these conversations, saying:

“I sit down with the homeowners after I do the audit and I say ‘look, here are my recommendations,’ we go through each one and have a conversation about them…and we will come up with a plan that makes sense for them.”

However, another auditor stated that he sought to exercise caution in discussing findings with customers before the report was complete because providing too much detail might encourage customers to move forward with retrofits without considering the full range of recommendations.

In addition to discussing findings with homeowners, auditors presented customized findings in the “General Notes” field of the audit report, per SCL program protocol. Table 10 lists the type of information auditors included in the “General Notes” fields in the 12 audit reports we reviewed.


Table 10: Auditor use of “General Notes” field in audit report

Consideration (number of reports, n=12): example content

Prioritized list of recommendations (7): “Here are the priority projects for your house:”; “I will list below for you how I would prioritize your energy improvement issues.”
Personalized messages (6): “Thank you for inviting us into your lovely home…”; “If you have any questions about anything in these documents, please don’t hesitate to contact us.”
Advice on potential safety issues encountered (5): need for proper ventilation of combustion appliances; potential asbestos hazards; lint in dryer exhaust; need for carbon monoxide detectors
Information that does not fit report structure (3): responses to specific homeowner concerns; information on moisture issues; advice about high bills
Education and general information (3): elaboration on need for certain measures; additional explanation of blower door readings; target levels of air exchange to achieve/maintain air quality
Nothing in General Notes (3)

A prioritized list of recommendations was the most common element auditors included in the “General Notes” field of the examined reports. Notably, five auditors cited these lists as one of the most valuable parts of the audit report to their customers.

Conclusions from Auditor Interviews
Although the EPS modeling software is designed to standardize the audit approach and provide “asset-based” recommendations, the SCL program version recognizes that all homes are unique and encourages auditors to supplement fundamental information with specific home details based on their knowledge and expertise. We found that auditors sought to customize findings to best meet customers’ needs as the auditors perceived them. Delivered this way, customized findings transform an asset-based approach into one that “[comes] up with a plan that makes sense” both to the auditor and to the occupants of the home. Several factors play into this transformation of asset-based information into more homeowner/occupant-focused retrofit recommendations, including but not limited to:

 Auditors consider themselves a committed group of professionals who want to make a difference by helping homeowners improve the energy efficiency and comfort of their homes.

 Auditors viewed the homeowners participating in the SCL program as generally motivated to make upgrades. They viewed the audit, including their interactions with the customer, as a positive influence that provides homeowners with the knowledge to complete energy upgrades that fit their needs.

 Auditors are not technicians producing a standard report that varies only by house. Instead, they viewed audit participants as their customers and sought to provide good customer service by delivering customized, customer-focused audits and recommendations.

These factors (many of them subjective), along with the potential sale of retrofit jobs by auditor/contractors, combine in various ways to co-influence the generation and presentation of asset-based results as customer-focused information in the EPS Energy Analysis Report.

3.2 Real Estate Professionals
One of the theorized benefits of home energy ratings and labels like the EPS is their potential to allow an apples-to-apples comparison of energy use between homes. In order to explore the potential benefits of energy ratings and labels for homebuyers, as well as potential drawbacks, we conducted interviews with real estate professionals in the Seattle area. These interviews sought to explore real estate professionals’ opinions on how home labels might change how energy-efficient features are marketed during home sales and what buyers and sellers focus on during the sales process. We sought real estate agents’ perspective on these and other issues related to the sale of both new and existing homes, though the EPS Home Energy Audit Program covered existing homes and homeowners—few of whom were looking to sell their home or buy a new home in the near future (see Chapter 4).

Background—Purpose of Home Labels
In general, labels, whether on cans of tomatoes, furnaces, or homes, seek to provide information, facilitate comparison, and ultimately influence behavior. The schematic presented in Figure 8 represents a classic “market transformation” view of how home energy labels might influence the real estate market: by allowing buyers to compare homes based on their energy use, labels would increase the demand for, and value of, efficient homes, ultimately motivating homeowners and builders to make energy improvements and increasing the overall efficiency of the housing stock.

Methods
Between May and July 2011, Research Into Action interviewed 12 real estate professionals (REPs) who worked in the Seattle area and had attended one of Earth Advantage Institute’s Sustainability Training for Real Estate Professionals (S.T.A.R.) courses. Though these trainings may attract a variety of different interests, our contact list, overall, was presumably more oriented to energy issues than the “average” REP. Still, we sought to interview a diverse range of contacts within this list. The majority of the agents who had attended the trainings came from large real estate companies, and as a result we interviewed multiple respondents from two companies: Windermere and Coldwell Banker Bain. Each respondent from those companies came from a different office. Since these respondents were all selected as having shown some interest in sustainable housing, they may have been more engaged with energy efficiency than Seattle real estate professionals as a whole.


Figure 8: Home labeling and real estate market demand for energy efficient houses

Key Findings
Key conclusions from our REP interviews include:

• REPs suggest that labels increase the desirability of efficient homes but have potential drawbacks for sellers. REPs stated that labels raise awareness of energy use and energy costs among buyers, allow REPs to use efficiency as a feature in their marketing, and encourage sellers to make efficiency improvements, ultimately increasing the efficiency of homes. However, REPs also noted that labels may disadvantage sellers who would likely bear the cost of labeling and may see the value of their homes fall if they receive unfavorable ratings.

• Efficiency is one factor among many that buyers consider in selecting a home and may be secondary to considerations like location and floor plan. Buyers can improve a home's efficiency after purchasing the home much more easily than they can alter factors like location and floor plan. In addition, REPs reported that most buyers are relatively indifferent to energy costs, particularly in Seattle where energy costs are low and home prices are high.

• REPs are more likely to market efficiency as an overall feature of new homes while focusing on individual efficient elements within existing homes. New homes are inherently more efficient than existing homes and are more likely to have energy labels REPs can feature in their marketing. In addition, builders have opportunities to educate buyers about efficiency in ways that existing home sellers do not. REPs typically only include efficiency in existing home marketing when the home has received a prominent upgrade and typically focus efficiency claims on the specific element upgraded rather than the home as a whole.

• REPs typically discuss efficiency with buyers but see little benefit for sellers in making efficiency improvements. REPs discuss efficiency generally with buyers, typically focusing on energy costs, and may go into greater detail and discuss specific upgrades with buyers that express interest in efficiency. While REPs typically discuss efficiency with sellers, they primarily recommend visible or cosmetic upgrades. REPs see the primary benefit of efficiency upgrades as an improved experience living in the home and stated that sellers are unlikely to recover the cost of energy upgrades in a higher sales price.

Detailed Findings

The following section presents a detailed summary of topics discussed with real estate professionals, including their perspectives on the role of energy efficiency and home labeling during the sale of new and existing homes.

Characteristics of REPs and Awareness of Energy Efficiency

The interviewed REPs had from two to more than 30 years of experience in the real estate field. Respondents reported that they primarily sell existing homes; only 2 of the 12 REPs interviewed reported that more than 50% of the homes they sell are new. The rest reported that 25% or fewer of the homes they sell are new. This predominance of existing home sales is consistent with overall sales; according to Dataquick, in April 2011, new homes made up only 14% of all home sales in the Seattle/Tacoma/Bellevue area (509 of 3,587 homes sold).

Interview findings suggest that a lack of awareness of energy efficiency among REPs limits their ability to discuss efficiency with buyers and use it to market homes. The interviewed REPs agreed (10 of 12) that, in general, agents have little awareness of energy efficiency in homes. Given this relatively low level of awareness of energy efficiency, respondents noted that in some cases, homebuyers may be more knowledgeable than agents. Respondents also reported that even agents aware of energy efficiency issues may not know how to use it as an effective home marketing tool.

This awareness gap among REPs contributes to a corresponding gap among homebuyers and sellers, who may not consider energy efficiency a home feature worth weighing during the sales process. According to one respondent,

“[Buyers] look to their real estate agents for guidance, and if real estate agents were to explain [efficiency] better and show the importance of it, I think they would get it and want it.”

Although the interviewed REPs were more knowledgeable about energy efficiency than most of their colleagues,28 respondents were divided regarding whether they market themselves as ‘green,’ with 5 of 12 reporting that they do. Reasons respondents cited for marketing themselves as ‘green’ or choosing not to do so center on attracting niche buyers. One REP reported seeking to attract niche buyers by marketing herself as green, while another toned down her green marketing, positioning herself as ‘an agent who also does green’ as opposed to ‘a green agent’ to avoid limiting herself to niche buyers.

The Home Labeling Landscape

In discussions of the current labeling landscape, the interviewed REPs cited four different types of labels, all of which are voluntary (Figure 9). Two-thirds of the interviewed REPs (8 of 12) stated that the availability of such a large number of labels in the marketplace may be confusing to consumers, and interview findings suggest that overcoming this confusion requires a relatively high level of understanding on the REP's part. While two respondents stated that knowledgeable REPs can prevent multiple labels from confusing homebuyers, two additional respondents expressed a desire for additional education of real estate agents. Two REPs also expressed a desire for a resource illustrating differences and areas of overlap between various labels, as well as which organizations offer each one.

28 In open-ended responses, five contacts suggested that they are more knowledgeable about energy efficiency than is typical for REPs; in addition, as noted above, all of the interviewed REPs had attended EAI's S.T.A.R. course.

Figure 9: Four current home label options

Figure Note: Built Green is a program through the Master Builders Association of King and Snohomish Counties. It includes requirements for energy efficiency, indoor air quality, natural resource conservation, and preservation of water quality. It applies to both new and existing homes. The rating system scores homes from one to five stars. It includes a combination of set requirements and a point system for additional features.

What REPs Want In a Home Label

The REPs we interviewed indicated that they prefer simple, recognizable labels. Respondents most often stated that labels should present information simply, offer a standardized rating system, and include estimates of energy costs. The opinions that respondents expressed of the individual labels available in the marketplace also indicated that REPs favor labels that are widely recognized or easy to explain. REPs expressed positive views of ENERGY STAR as familiar to homebuyers and of the EPS rating as easy to understand and discuss with clients; respondents expressed differing opinions of LEED, largely based on their assessment of its recognition among homebuyers.29

29 Views of LEED were mixed: three REPs viewed it positively, two of whom cited its familiarity; two viewed it negatively, one saying it is less familiar to buyers and one saying qualified homes may not have outstanding energy performance.

Benefits and Drawbacks of Home Labels Differ Across New and Existing Markets

REPs most frequently cited the potential for energy labels to differentiate homes from others on the market and to help illustrate the total cost of living in the home as potential benefits of energy labeling (Figure 10). However, twice as many REPs cited the potential to illustrate the total cost of living as a benefit that applies to new homes (6 contacts) as did so for existing homes (3 respondents).30 One respondent offered an explanation for this increased focus on cost comparisons in new homes, saying that energy costs may play a bigger role in buyers' decisions between new homes than between existing homes. According to this REP, buyers of existing homes are "more focused on the character of the home: the architecture, how pleasing it is that way."

Figure 10: Ways labeling differentiates new and existing homes

[Figure 10 is a bar chart showing the number of REPs citing each benefit, separately for new and existing homes. Categories: differentiate home from others on market; illustrate total cost of living in home; attract niche buyers; consistent with increasing interest in EE; help efficient existing homes compete with new; no benefits specified.]

Figure Note: Total costs of living in the home include mortgage payment, property taxes, and insurance as well as utility costs.

REPs also mentioned attracting niche buyers more often with regard to new homes than they did with regard to existing homes. One respondent elaborated that, while niche buyers seek an energy efficient home, they do not want to undertake significant efficiency retrofits themselves. Instead, this respondent stated, “They want someone else to already have done the research and work and said ‘this is energy efficient.’”

Drawbacks of Labeling Differ Across New and Existing Markets

The drawbacks realtors cited for energy labeling primarily focused on financial impacts, particularly those that might affect sellers of existing homes (Figure 11). REPs noted that the cost of labeling could increase the expense sellers must bear in the transaction or raise the cost of the home to buyers. Respondents also noted that buyers might perceive labeled homes to be more expensive. In addition to costs directly associated with obtaining the label, REPs noted that energy labeling decreases the appeal of homes that score poorly, lowering their value. One respondent compared a poor energy rating to a home with a bad roof, saying that in both cases buyers may want a price adjustment as compensation.

30 In illustrating the total cost of living to customers, respondents sought to include utility costs with the mortgage payment, property taxes, and homeowners' insurance in order to determine the total monthly expense of living in the home. By discussing total monthly expenses in this way, REPs encourage their clients to look beyond simply the sales price to determine whether they can afford a particular home.

Figure 11: Potential drawbacks to labeling new and existing homes

[Figure 11 is a bar chart showing the number of REPs citing each drawback, separately for new and existing homes. Categories: decrease value of inefficient homes; increase cost of labeled homes; difficulty of providing comparable energy ratings; no drawbacks specified.]

Interview findings suggest two reasons why REPs mentioned the potential to decrease the value of inefficient homes more prominently as a drawback to labeling existing homes than new homes. First, respondents noted that they considered new homes, even those built only to code, to be more efficient than existing homes. Second, although our interviewees did not say so explicitly, REPs may perceive that builders have more agency to design efficiency into homes than sellers who would have to pay for costly retrofits in order to increase their homes’ efficiency.

Two respondents who did not cite drawbacks to labeling for either new or existing homes noted that for voluntary ratings, realtors could simply suppress poor ratings in their marketing. One of these REPs explained, "Realtors who market houses market the strength of the house, and if it has a bad energy label score, realtors just won't mention it." However, another REP questioned whether disclosure of voluntary ratings would be mandatory, saying sellers may refrain from pursuing voluntary energy labels if they would be required to disclose a poor rating. All home energy labeling and disclosure in the U.S. is currently voluntary.

Realtor Comments on the Potential Impact of Mandatory Home Labels

Based on interview findings, our specialized group of REPs overall seemed to believe that mandatory home labeling would create pressure toward increased efficiency in U.S. homes. Nine respondents stated that labeling would increase the efficiency of homes. These REPs cited two related reasons, both consistent with the logic presented in Figure 8: four noted that labeling would raise awareness of energy use among buyers, while four others stated that labeling would encourage sellers to make energy efficiency improvements. The ninth respondent cited both reasons. Individual respondents also mentioned that a mandatory labeling system would standardize the ratings currently in the marketplace and that labeling could increase demand for retrofits and competition in the retrofit market, lowering costs as a result.

Although six REPs (including four who stated it would increase the efficiency of the housing stock) discussed drawbacks related to mandatory labeling, only one stated that mandatory labels would not increase the efficiency of U.S. homes. This respondent cited experience with other issues identified in the home inspection process to argue that, while buyers would use low ratings to negotiate a lower sales price, they would not necessarily make improvements to increase the efficiency of the property once they owned it.

The drawbacks REPs cited for mandatory labeling parallel those mentioned for voluntary labels, and primarily impact sellers of existing homes: labeling has the potential to decrease the value of homes receiving poor ratings, and obtaining the label adds costs to the process of buying and selling a home. Most of the respondents who raised these issues speculated that sellers would be responsible for these costs, although one suggested that a discount on the excise tax for sales of homes that achieve a certain rating may be an effective incentive.

Two REPs cited a drawback unique to mandatory labeling: the potential to disadvantage some groups of sellers more than others. According to these respondents, sellers forced to sell for financial reasons or out-of-state sellers of homes that had been rental properties may be unable to make efficiency improvements that would make their homes more valuable. According to one respondent:

“[Homeowners] may not have enough money to make the next payment, you may not have enough money to change out the windows…so many people owe more on their house than the house is worth that to put this into a transaction right now is kind of insanity.”

Efficiency in the Home Sales Process

While interview findings suggest that REPs believe labels might increase demand for efficient homes, energy use remains just one of many factors homebuyers and sellers consider. To better understand the extent to which labels might impact the efficiency of U.S. homes, it is important to consider the role energy efficiency plays in the home buying process more broadly. The following sections examine REPs' views on the current importance of energy efficiency to homebuyers, the role energy efficiency plays in home marketing, and how real estate agents discuss efficiency with their clients. These perspectives provide general benchmarks of current practices and some indicators of leverage points for increasing the use of energy efficiency as a feature considered during home sales.

Current Importance of Energy Efficiency to Home Buyers

Seven REPs stated that energy efficiency is not a primary consideration for homebuyers. Instead, according to these respondents, buyers are most interested in factors like a home's location, floor plan, number of bed- and bathrooms, and aesthetic features. REPs noted that, in part, these factors may take precedence over energy efficiency because, while buyers can upgrade a home's energy efficiency relatively easily, it is much more difficult for buyers to alter the location or overall floor plan of their home. Illustrating this point, one REP stated, "It could be the most energy efficient house, but if it's in the wrong neighborhood, that isn't going to help it."

In addition to giving precedence to aspects of the home they cannot change, a relative indifference to energy costs among buyers may also limit the importance of energy efficiency in their decision-making processes. Two-thirds of the interviewed REPs (8 of 12) stated that fewer than 30% of the buyers they work with ask about a home’s energy bills, and, in open-ended responses, four respondents specified that energy costs are a low priority for buyers. Respondents noted that, in relative terms, energy costs are low and home prices are high. According to one REP, “An extra $50 a month because the house is inefficient is not going to change their life.”

While interview findings suggest that most buyers see energy efficiency as a secondary concern, REPs also stated that a minority of buyers is very interested in a home’s energy costs and efficient features. One respondent estimated that 20% of buyers are very concerned and want to know “extreme detail” about “all aspects of running the property,” including energy bills. While this person stated that these very concerned buyers have always existed, another said that concern about energy costs had grown. According to this respondent, while buyers once purchased the largest houses they could afford, they are now turning away from homes with more space than they need because of concern over operating costs.

The Role of Energy Efficiency in Home Marketing

Interview findings suggest that REPs market a home as a collection of individual features rather than as a single, cohesive product. In their early interactions with buyers, REPs seek to understand the features most important to buyers, and adapt their discussion of available homes to focus on those features. For example, REPs reported focusing more strongly on energy efficiency when discussing homes with buyers who had expressed interest in efficiency. Respondents also noted that a home's features could be positive or negative. While REPs may emphasize the ease of altering negative features (making upgrades in the case of energy efficiency), features can also play a role in negotiations between buyers and sellers. One respondent emphasized the importance of negative features saying, "Buyers don't come looking for reasons to buy a home; they look for reasons to not buy a home."

The features REPs use to market homes may apply to the home as a whole, as with location, or they may consist of individual elements within the home, as in the case of hardwood floors or stainless steel appliances. Interview findings suggest that efficiency most often arises in this latter context, particularly in existing homes. According to one respondent, "Occasionally I see [efficiency] in the marketing remarks to a realtor, but usually only if the home has energy efficiency features people will recognize. I don't see general comments like 'this is an energy-efficient home.'"

REPs suggested that efficiency typically arises as a feature of the home itself only when buyers are comparing homes that are very similar in other ways, which is more often the case with new homes than existing homes. Elaborating on this point, one respondent said, "When you are driving all over looking at a home here, a home there, they are not apples to apples…In a new community the homes are built at the same time, with the same materials. The floor plan might be different, but they are a lot easier to compare." Energy labels also influence whether REPs market efficiency as a feature of the home itself or focus on individual efficient elements. According to one REP, "You can't market [homes] as green unless they have an environmentally certified label. You can market features, but [you] can't call it green."

Consistent with the diversity of existing homes and the relatively small proportion with energy labels, when energy efficiency arises in marketing of existing homes, it typically focuses on individual efficient elements rather than the efficiency of the home itself. Nine REPs said efficiency only plays a role in existing home marketing when the home has had some sort of upgrade that makes efficiency a marketable feature. One elaborated that REPs might be reluctant to bring up energy efficiency unless they were certain a home's features would make it stand out, to avoid raising the issue only to lose the sale to a more efficient home. Respondents cited high efficiency appliances, equipment, or windows as features that might arise in existing home marketing, but noted that discussion of these features may go beyond energy; efficient products are also typically new.

New homes are more similar to each other and more likely to have energy labels than existing homes. As a result, energy efficiency plays a more prominent role in new home marketing and is more likely to arise as a feature of the home itself. Six respondents stated that efficiency typically plays a role in new home marketing and four additional respondents stated that efficiency plays a role when new homes have energy labels.

REPs cited two additional factors that increase the prominence of efficiency in new home marketing. First, four respondents noted that new homes are inherently more efficient than existing homes, making energy efficiency a distinguishing feature. According to one REP, "When you look at what today's code is, you know that buying a new house, even if it's not built green or to any labeling, it's still more efficient than 95% of the homes over 10 years old." Second, the presence of the builder adds an element to new home marketing that is absent in sales of existing homes. Respondents noted that builders' sales office displays and model homes provide an opportunity to educate buyers about energy-efficient features and market the home's efficiency in a way that is not available to sellers of existing homes.

How REPs Discuss Efficiency with Clients

Sellers

While REPs typically discuss efficiency upgrades with sellers, interview findings suggest they rarely encourage the types of deep upgrades that would have the greatest impact on the home's energy use. Eight of twelve respondents said efficiency typically comes up in their discussions of how sellers could upgrade their homes to make them more appealing to buyers. However, these discussions primarily focus on visible measures or needed repairs. The six respondents who mentioned specific upgrades they encourage sellers to make reported recommending upgrades to windows, appliances, and lighting. Two REPs explicitly acknowledged that they focus on cosmetic upgrades in discussions with sellers; as one put it, "I know this is shallow, but they [sellers] concentrate on cosmetics. First come the repairs, then comes the look."


Explaining this focus on cosmetic upgrades, REPs indicated that they see little benefit to sellers from energy efficiency upgrades. Three respondents stated that the primary benefit of energy upgrades comes from an improved experience living in the home, for example, increased comfort and lower energy bills. These respondents and three others stated that sellers are unlikely to recover the cost of upgrades through a higher sales price. As one REP put it, "To simply buy a furnace to sell your house is lunacy unless it absolutely needs replacement."

Buyers

While most REPs hold general discussions of energy efficiency with the majority of the buyers they work with, respondents discuss specific upgrades less often. Nine REPs said they discuss efficiency as part of their initial discussions about the buyers' interests and desires or when initially showing houses, and all but one reported that these discussions occur in the context of monthly energy costs.31 This focus on energy costs in discussions of efficiency with buyers contrasts with the relative indifference toward energy costs that REPs reported among buyers. Nine respondents also reported promoting specific upgrades, but only five did so with all buyers. The remaining four promote specific upgrades only when buyers express interest in energy efficiency.

REPs typically discuss specific upgrades with buyers after the buyer has selected a home (in price negotiations or the inspection process) or when the buyer is choosing between a small number of homes. Three respondents (two who encourage upgrades to all buyers and one who does not) reported discussing specific efficiency improvements in the context of other upgrades the house needs. Three other respondents reported mentioning utility incentive programs or offering advice on financing efficiency improvements by wrapping them into the mortgage.

Conclusions from Real Estate Professional Interviews

Energy labels have the potential to alter the role energy efficiency plays in the home buying process, although the effects of labeling vary somewhat between new and existing homes. In existing homes, labeling has the potential to make energy efficiency a feature of the home itself, allowing REPs to market the entire home as efficient and buyers to compare homes based on their efficiency. These types of comparisons are far more difficult when the absence of labels requires REPs to market individual efficient elements within the home. As a result, labeling has the potential to make energy efficiency a consideration in existing home purchases in a way that it is not currently.

Energy efficiency already plays a larger role in new home marketing than in marketing of existing homes, and REPs are more likely to market efficiency as a feature of new homes themselves. Labeling contributes to this greater prominence of efficiency in new home marketing. REPs and builders see efficient homes as more desirable to buyers; as a result, they use labels to call attention to efficiency and differentiate new homes from each other and from existing homes.

31 Of the remaining three, two do not bring up efficiency and the other discusses efficiency in the context of other improvements. The contact who does not discuss efficiency in the context of energy costs discusses benefits specific to individual measures.


By making energy efficiency a consideration in the existing home market and differentiating homes in the new home market, labeling has the potential to increase the desirability of an efficient home as compared to an inefficient home that is similar in all other ways. However, few homes are truly similar in everything but efficiency, and efficiency comparisons must compete with considerations of factors like location, floor plan, and price that currently take precedence in buyers’ decision-making.

While labeling would make efficiency a more prominent consideration, it does not address two of the factors that currently make efficiency a secondary concern to buyers. First, efficiency remains easier to improve than other considerations, and, second, homebuyers remain relatively indifferent to energy costs, particularly in Seattle where energy costs are low and home prices are high. Given the prominence of these other considerations, the extent to which energy labeling would truly increase demand for efficient homes remains unclear.


4. Homeowner Decision-Making Related to Energy Upgrades, Audit Reports, and Labels

This chapter describes results from the surveys and in-depth interviews of homeowners who completed an audit through the Seattle City Light (SCL) Home Energy Audit Program. It focuses on high-level analysis rather than detailed accounting. The appendices provide more detail, including frequency tables for most of the survey questions (Appendices L-M), as well as the interview guides (Appendices H and J) and responses (Appendices I and K). Sampling details and approach are described in Chapter 2.

4.1 Methods

Many of the homeowners who participated in SCL's Home Energy Audit Program were invited to voluntarily complete one or more phone surveys or interviews.32 Homeowners were recruited to participate either in phone surveys or in interviews, and not both. Surveys contained both open-ended and closed-ended questions. They were designed to be fielded to as many participating homeowners as possible and to provide detailed information (e.g., on which recommendations were completed) suitable for statistical description. Interviews were more interactive, designed to collect in-depth information from a smaller subset of households.

Researchers contacted homeowners at one or two of three possible points to assess homeowner motivations, experiences, and actions as they progressed through the home energy audit process. Participants could be interviewed at two points in the process: after signing up for the discount but prior to the audit (Pre-Audit Interview, n=33), and soon after the audit (Post-Audit Interview, n=30). A subset of interview respondents (13) completed both a Pre-Audit Interview and a Post-Audit Interview.

Participants could be surveyed by phone at two points in the process: soon after the audit (Post-Audit Survey, n=134); and longer after the audit (referred to as the Retrofit Survey because the questions focused on upgrade completion and decision-making, n=159). A subset (58) of the survey respondents completed both the Post-Audit and the Retrofit Surveys. For more in-depth information on methods and sample sizes, please refer to Chapter 2.

Table 11: Summary of number of completed homeowner interviews and surveys

              Pre-Audit   Post-Audit   Retrofit
Interviews    33          30           -
Surveys       -           134          159

32 A total of 1355 audits were completed in the study period. Many of these were not accessible for phone contact due to several factors, including the absence of phone contact information, and exclusion of households eligible for the Community Power Works program. Finally, some audits were completed late in the study period, such that not enough time had elapsed after the audit to reasonably ask the household about their experience with the audit report and subsequent actions. In total, nearly 600 households met the screening criteria and had phone contact information. Of these, 286 completed at least one phone survey or interview.


4.2 Characteristics of Respondents

Of the approximately 171,750 SCL customers in single-family homes, 1,355 undertook an audit through the SCL Home Energy Audit Program between June 2010 and October 2011.33 This amounts to a program-specific participation rate of 0.8% of eligible customers for that time period. Participation in utility-sponsored home energy audit programs is on average just 3.3%, according to a recent review (Palmer et al. 2011). To avoid competing with potential participation in the parallel Community Power Works program, the SCL Home Energy Audit Program was not widely marketed (see Chapter 1 for more information on the SCL Home Energy Audit Program). The low participation rate underscores that participating households might often have been—in the words of one of the auditors interviewed—"the cream of the crop" in terms of their interest in home energy efficiency.
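As a rough arithmetic check of these figures (the 50% single-family share is not stated directly; it is implied by the totals reported here and in footnote 33, and is an inference rather than a quoted census figure):

\[
343{,}500 \text{ households} \times 0.50 = 171{,}750 \text{ single-family customers},
\qquad
\frac{1{,}355 \text{ audits}}{171{,}750 \text{ customers}} \approx 0.79\% \approx 0.8\%.
\]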

Demographic Characteristics—Survey Findings

We asked survey respondents a minimal set of demographic questions, for which results are summarized in Chapter 2. Overall, respondents were wealthier, more educated, and older than is typical for Seattle.34 Compared to Seattle as a whole, surveyed individuals as a group had nearly three times the rate of professional, master's, and doctoral degrees (57% of participants versus 21% of Seattle) and a much lower proportion who had not attained a bachelor's degree (9% of participants versus 46% of Seattle)—and the contrast with the U.S. as a whole is even stronger, as 72% of U.S. adults 25 or older have not attained a bachelor's degree.35 Survey respondents were thus over three times as likely to have attained at least a bachelor's degree as U.S. adults overall (91% vs. 28%).

This general pattern of participants being relatively affluent and educated has been found in earlier reviews of audit participants in other audit programs as well (Sanquist et al. 2010). Despite this overall "elite" tendency, there was clearly demographic heterogeneity in the participating population. For example, while 27% had children under 18 at home, 19% had at least one household member over 65. Even with the Community Power Works area participants excluded, 12% of participants reported annual income less than $50,000, and in at least one case the homeowner was not the occupant. In short, a variety of demographic circumstances were represented.

33 The number of SCL customers in single-family homes was calculated based on SCL’s estimate of 343,500 total households in SCL territory combined with the census estimate of the percentage of these that were single-family homes (see Tachibana 2010). According to data collected by Earth Advantage Institute, several hundred SCL customers undertook audits under programs other than the SCL Home Energy Audit Program, so the total audit participation rate for utility customers is higher. Though the program was intended for single-family homes only, a small percentage of participants may have been in home types other than single-family. 34 Some SCL Home Energy Audit Program participants residing in the Community Power Works program area, generally a lower-income area, were exempted from survey eligibility at the request of the Community Power Works program. Approximately 20% of Home Energy Audit participants were excluded for this reason. Originally, the Community Power Works program covered only the south of the city, so the Home Energy Audit participants interviewed and surveyed were necessarily north of this area. At present, the Community Power Works program has been expanded to cover all of Seattle. 35 Source is the American Community Survey, U.S. Bureau of the Census, for 2009 (http://www.census.gov/compendia/statab/2012/tables/12s0233.pdf).


Moving In, Moving Out, Staying Put—Survey Findings

We asked survey respondents about how long they had been in their homes and how much longer they planned to stay, to help understand the context of their decision-making on energy upgrades, including the potential relevance of the home's energy performance score to its value in the real estate market. As to the general real estate context, paralleling the U.S. overall, the 2009-2010 Seattle area real estate market had receded substantially from a few years earlier, with fewer sales, lower median prices, and thousands of repossessions (Pryne 2011).

Just 9% of surveyed respondents said that they had been living in the audited house for two years or less. Most (61%) said that they had been living in the house for at least ten years; 29% of these said that they had lived in their current house for at least 20 years. Accordingly, participants' interest in energy improvement may often concern situations that had been ongoing for many years, or relate to general home improvements such as house additions. Responses to survey questions about what had changed in the household since the audit or renovation suggest that some of the renovation activity may have been partly spurred by major life changes, such as having babies or retiring.

Asked how much longer they were planning on staying in their house, only 6% said that they were planning on staying for less than two more years. The most common answer was that they were planning on living in the current house "for the rest of my life" (22%), with an additional 21% saying that they planned to stay for at least twenty years. For these long-stay households, the direct sales value of energy efficiency may have little salience. Rather, these plans underscore that many participating households were primarily seeking to improve their own living conditions, or were acting for other personal reasons, rather than looking to increase the market value of their home per se. A second implication of the long intended stay in the audited home is that, from the conventional perspective of judging cost-effective energy efficiency investments through expected future energy savings, payback periods of ten years or more would often be technically legitimate, ignoring uncertainty and risk. A quarter, however, said that they were planning to stay less than ten years.
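To make the payback logic concrete, a simple hypothetical (the dollar figures below are illustrative only and are not drawn from program data): simple payback is the upgrade cost divided by the expected annual bill savings, and under the conventional framing an upgrade is cost-effective when its payback is shorter than the household's expected stay.

\[
\text{simple payback} = \frac{\text{upgrade cost}}{\text{annual savings}}
= \frac{\$3{,}000}{\$250/\text{yr}} = 12 \text{ years},
\]

which falls within the planning horizon of a household expecting to stay twenty years or more, though, as noted above, this framing ignores uncertainty, risk, and the time value of money.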

4.3 Pre-Audit Attitudes and Motivations—Interview Findings

Prior to their audit, we asked homeowners several interview questions about their general attitudes towards energy issues and why they sought the SCL home energy audit.

Pre-Audit Attitudes

Pre-Audit Interview respondents indicated that they were concerned with their energy use, carbon emissions, and larger environmental impact. Seventy percent of pre-audit respondents (23 of 33) stated that everyone in their household tried hard to conserve energy. In addition, slightly fewer than half of the Pre-Audit Interview respondents (15 of 33, 45%) rated their households as "very concerned" about reducing their carbon footprint, while 13 additional respondents (39%) rated their households as "somewhat concerned" about their carbon emissions.

Consistent with this concern about energy use and carbon emissions, a large majority of respondents (25 of 33, 76%) said that they had made previous efforts to improve the energy efficiency of their homes.


Nonetheless, interview findings suggest that pre-audit respondents believe further opportunity remains to improve their homes’ energy efficiency. Asked whether they thought their energy bills were high, average, or low, a majority of respondents (61%, 21 respondents) reported that their current energy bills are high; only one respondent rated his/her current energy bills as low. In addition, 79% of respondents (26 of 33) listed projects that they were interested in doing to improve the comfort or energy efficiency of their homes, and the majority of those respondents (15) reported specific plans to complete those projects. In all but one case, respondents anticipated completing the projects within the next two years.

Motivation for Seeking a Home Energy Audit

Among Pre-Audit Interview respondents, lowering energy costs emerged as the most prominent motivation cited for seeking an energy audit (Table 12). Pre-Audit Interview respondents also reported seeking an audit to inform planned energy improvements or larger renovations that might include efficiency elements, a finding that is consistent with auditors' reports that homeowners seek a clear path to energy improvements (see Chapter 3). In addition, some pre-audit respondents sought to understand why their energy use remained high despite past efforts to increase their home's efficiency.

Pre-audit respondents also reported a desire to increase the comfort of their homes as a motivation for pursuing the audit. However, while some auditors identified comfort as a primary factor leading homeowners to seek an audit, interview findings suggest that fewer than half of respondents perceived the thermal comfort of their homes as the most pressing issue in need of attention. Rather, a majority of contacts (20 of 33, 60%) reported that they are able to maintain comfortable temperatures throughout the year. In unprompted responses, none of the pre-audit respondents cited a desire for a home energy rating as a motivation for seeking an audit.

Table 12: Reasons cited by Pre-Audit Interviewees for seeking an audit (multiple responses allowed)

Motivation                                                        Count (n=33)   Percent
Reduce energy costs                                                21             64%
Gain general knowledge about home energy use                       12             36%
Save energy                                                        10             30%
Improve comfort                                                      8             24%
Planning energy improvements or larger renovations                   8             24%
Did not see anticipated savings from past energy improvements        6             18%
Recently purchased home                                              4             12%
General environmental/climate change benefits                        3              9%
On-bill energy use comparison to neighbors                           2              6%
Homeowners' association promoted audit                               2              6%
Other                                                                2              6%


Most respondents (24 of 33) cited multiple factors motivating their decision to receive an audit. Nonetheless, there was relatively little overlap between the two motivations contacts cited most often—reducing energy costs and gaining general knowledge about the home’s energy use. Homeowners citing each motivation differed somewhat in their perception of their homes’ energy use relative to similar homes. Respondents seeking general information about their home were somewhat more likely to perceive their home’s energy use as on par with similar homes, while those seeking to reduce costs most often perceived their energy use as greater than similar homes.

Interview findings suggest that the offer of an EPS score for their home had a relatively small role in most respondents’ decision to receive an audit. It is unclear how much of this overall lack of direct interest in the score had to do with the way the program was marketed, versus a more fundamental disinterest. Less than half of Pre-Audit Interview respondents (14 of 33, 42%) were aware that they would receive an EPS score. However, of those fourteen homeowners who were aware that they would receive an EPS score, nearly half (6 of 14, 43%) reported that the EPS score motivated them to obtain the audit. Elaborating on their responses, two of the six respondents who cited the EPS score as a motivation reported seeking a general rating of their homes’ efficiency or a comparison to other homes. However, two other homeowners reported they were interested in the EPS because they sought information on specific improvements, suggesting that they may not distinguish between the EPS score and the larger EPS Energy Analysis Report. One of these respondents explicitly stated that he was not interested in a comparison with other homes, saying, “As far as the score is concerned, I don’t care at all how we compare. I don’t know if the motivation behind this is to throw a guilt complex onto people.”

Respondents who did not say they were motivated to sign up for the audit because of the EPS score itself emphasized that they sought concrete recommendations of ways to improve their homes rather than a basis to compare their home with others. Illustrative comments include:

“I think there are things we can do differently to make our house more comfortable and save some money. It’s not really about how our house compares to others.”

“We don’t want to judge ours against other houses. We just care whether it is where we need it to be in terms of energy.”

“Although the score is interesting, we know we will do work on the house. When we get into it with the contractor, we know what we want to do.”

While they did not cite the EPS score as part of their motivation to pursue an audit, two respondents reported that comparisons of their energy use to their neighbors', which they had received on their energy bills, were among the factors motivating them to pursue an audit (Table 12). The experience of one interviewed auditor supports this finding; this auditor stated, "We get quite a few people doing audits because they have gotten information that compares them to their neighbors and they have come up high."36 Other researchers have found that homeowners often claim not to be influenced by comparisons of their energy use to that of other households, even while evidence suggests that they actually are influenced (e.g., Alcott 2010). It is possible that EPS scores could have a similar influence, even if receiving the score was not a motivation for undertaking an audit.

36 During the study period, SCL sent out OPOWER reports to some customers, comparing their electricity use to that of their neighbors.

4.4 The Audit Experience—Survey Findings

After their audit was completed, we asked homeowners several questions about their reactions to the audit and the auditor, the visit to their house, and what elements of the audit they found to be most helpful. These findings are drawn from the Post-Audit Survey, delivered to 134 homeowners who had received an SCL home energy audit, and the Retrofit Survey, delivered to 159 homeowners somewhat later in the research period.

Homeowner Reactions to the Audit

The SCL home energy audit requires an assessment of the whole house, including attic, walls, windows, foundations, ducts, and heating and cooling system(s), as well as measurements of air leakage and combustion safety checks. In addition, auditors performing audits for the SCL Home Energy Audit Program were asked to make several additional measurements for both SCL and the DOE, significantly adding to the audit time. As a result, the in-home portion of the SCL home energy audit typically takes three to four hours to complete.37 Somebody was required to be at home during the audit to provide the auditor access to the home and guide the auditor on how to access areas like attics and crawl spaces. The financial cost notwithstanding, the long in-home portion of the audit may itself have been perceived by some homeowners as a substantial investment.

Though the audits were conducted along standardized protocols, auditors added their own experience, expertise, and personality to the audits they undertook, adapting to the situation and to the homeowners they encountered. Homeowners often followed along with the auditor: nearly 60% said that they were with the auditor most (31%) or all (28%) of the time that the auditor was in the home, and most of the rest (34%) said they followed the auditor at least some of the time. As noted in Chapter 3, from the perspective of some auditors, homeowner involvement could slow down the audit process. Even though being present and engaged during the audit represented an investment of time, homeowner comments suggest that many Post-Audit Survey respondents enjoyed the process and found the ongoing conversations with the auditor useful and interesting. The amicability, experience, and communication skills of the auditor may play a major part in what homeowners get out of the audit and what they do subsequently, as suggested by some of the open-ended responses offered by homeowners.

Most households surveyed seemed pleased with, and often very positive about, the auditor's visit. Overall, survey respondents said that the auditor communicated clearly (75% "strongly agree") and did a complete review of their home (70% "strongly agree"). Though respondents were somewhat less enthusiastic on this question, most also said they were confident in the auditor's work and recommendations (64% "strongly agree" and 23% "agree"). Few survey respondents expressed dissatisfaction with the audit: only three (out of 134 asked, or 2%) did not agree that the auditor had done a complete review of the home, and only seven (5%) did not agree that they had confidence in their auditor's work and recommendations.

37 It typically takes auditors about an hour to collect the measurements, other than the blower door test results, that are needed for the EPS score calculation (Earth Advantage Institute and Conservation Services Group, 2009).

Homeowners also praised their auditor’s professionalism and dedication, for example:

“He was a pro, like Ghostbusters. He blinded me with science.”

“They were absolutely wonderful, very professional, and had a lot of information and suggestions for me.”

“He found something that I had somebody come to look at in our furnace system. There was a duct that wasn’t connected and was causing hot air to spill into the attic. He spotted that and we had it fixed.”

Apart from comments about the auditor, some respondents clearly stated that they placed high value on the scientific nature of the audit and the calculations, for example, speaking to the value of the audit in general:

“It gave scientific results of the house performance that are measurable rather than opinions.”

Echoing comments made elsewhere in the survey about the enthusiasm and dedication of the auditor (and inversely, the very few cases where homeowners felt that the auditor was in a hurry or otherwise disengaged), many respondents also mentioned that the process of talking through home energy issues and options with the auditor was particularly valuable and motivating. This finding suggests that effective home energy audits are best seen as more than technical observation, calculation, and the transfer of results, even if the diagnostic testing and evident scientific nature of data collection and modeling provide a critical foundation. Rather, when the standardized objective presentation of recommendations, savings, and costs is not motivating enough, what may make the difference is an auditor who is willing and able to translate a technical assessment into the intricacies of how the home is, will, and can be used. But this does not mean that retrofits can necessarily be made appealing to all customers, no matter what the technical rationale. A number of respondents commented positively on the availability of options, particularly lower-cost and easier-to-do options, which gave them a way to improve living conditions or reduce energy use and greenhouse gas emissions without major investment or effort.

Asked months later, in the Retrofit Survey, whether anything about the audit process was especially helpful in convincing other household members to complete energy upgrades, the surveyed homeowners who answered "Yes" gave two kinds of (open-ended) responses that stood out: the personalized discussions with the auditor, and the diagnostic testing. Respondents particularly mentioned the blower door test, but also the infrared (IR) imagery (see Table 13).38 Both the blower door test and the IR photos seemed to bring energy issues to life, making otherwise invisible patterns of heat loss and drafts more visible. For example:

“The best part … was the draftiness in the house and what to do about it. Also, the air leaks around the dishwasher, electrical outlets, and doors.”

“The infrared test on our walls shows us how much more heat was leaking through the non-insulated part of the walls.”

Few surveyed homeowners mentioned the report or the label per se, though other responses (e.g., quantification, options) are certainly tied to the report. At present, in the U.S., energy efficiency rating schemes for residences are primarily for new dwellings, and designed to help convey relative efficiency in the real estate market (Pérez-Lombard et al. 2009)—there is little established parallel effort in the existing home real estate market.

Table 13: What part of the audit process was particularly helpful in convincing household members to consider completing energy upgrades? (A subset of Post-Audit Survey respondents; n=52)

Aspect                               Percentage of Respondents
Blower door                          31%
Discussion with auditor, process     21%
Quantification                       13%
Options                              13%
IR photos                            10%
Report or Label                       8%
Surprises (e.g., about insulation)    6%

Though most experiences with the audit process seemed clearly positive, even homeowners who said that they were largely or completely satisfied sometimes voiced reservations about what they received. In short, what the audit can and did deliver sometimes fell short of participant expectations or ideals. Some homeowner comments went to the heart of questions about the balance between what homeowners might want and what even well-performed audits provide. For example:

“Every auditor seemed to have some business, that they were going try and sell me something. There's no way to get around that, but that was how it was.”

“There basically was nothing I could do other than get a fireplace insert. I wish there were more things I could do.”

38 A blower door test was conducted in every audited house, with few exceptions for technical reasons. Not all auditors used IR photography.


“I would like specific recommendations and what might be a more efficient model or brand or criteria. I want to learn about how much that the windows are losing. No percentages are given for anything.”

4.5 Homeowner Reactions to the EPS Scorecard—Interview and Survey Findings

After the in-home portion of the audit was completed, auditors reviewed the data collected and created a customized report that included both a home energy rating (EPS Scorecard) and recommendations for home energy upgrades (EPS Energy Analysis Report). Once the audit was finalized, homeowners received an email indicating how to download their EPS Scorecard and Energy Analysis Report online. Figure 12 shows an example of the front of the EPS Scorecard. The EPS Scorecard provides both an Energy Score and a Carbon Score. The Energy Score is expressed in kWh-equivalents of site energy per year under "asset" assumptions, and the Carbon Score is the corresponding source carbon translation in metric tons per year. The Scorecard indicates the asset-based energy consumption estimate for the given house, both as is and with all of the recommended retrofits in place. For comparison, the chart also marks the Seattle average, as well as the Seattle 2020 and Seattle 2050 targets for energy consumption, corresponding to 20% and approximately 80% reductions, respectively. For the Carbon Score, the benchmarks are the Seattle average and the Seattle 2024 and Seattle 2050 targets.39 The second page of the Scorecard report provides additional explanation (see Appendix A). Note that these ratings provide an absolute estimate of site energy consumption and source carbon emissions for a home under asset-rating "standard occupant behavior" assumptions, as opposed to HERS and ENERGY STAR ratings, which are rendered relative to particular definitions of efficiency.40
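As a hypothetical illustration of how the Scorecard benchmarks and scores relate to one another (the numbers below are invented for illustration, not taken from an actual Scorecard, and the emissions factor is a placeholder rather than the regional market factor the program used):

\[
\text{2020 target} \approx (1-0.20)\times\text{Seattle average},
\qquad
\text{2050 target} \approx (1-0.80)\times\text{Seattle average}.
\]

If the Seattle average were 25,000 kWh-equivalent per year, the 2020 and 2050 energy benchmarks would fall at roughly 20,000 and 5,000 kWh-equivalent per year; applying a placeholder emissions factor of 0.4 kg CO2 per kWh to a 25,000 kWh-equivalent estimate would yield a Carbon Score of about 10 metric tons per year.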

According to auditor interviews (see Chapter 3), the scorecard section of the audit report often took some effort to explain, but could also serve as a good focal point for the rest of the audit, especially to explain the “asset” aspect of the assessments. We asked households what they thought of the scores they received.

Reactions to the EPS Scorecard—Post-Audit Survey Findings

Two-thirds of survey respondents said that they recalled their EPS scores fairly well. Nearly half (48%) of those who recalled their score said that it was worse than average. Many said the scores confirmed their suspicions, while others expressed surprise, and sometimes annoyance and frustration. Not everyone interpreted the scores as asset-based. A few who clearly understood the asset basis questioned the usefulness of an asset score for their own purposes ("Why treat me as an average?") or the usefulness of the comparison to an average home ("I was more interested in comparing our big old leaky house to other big old leaky houses, rather than small new houses"), or questioned the methodology (e.g., as to carbon emissions associated with grid emissions factors; see just below).

Figure 12: Sample EPS Scorecard. A second page with additional explanations of definitions and assumptions is provided in Appendix A.

39 For most of its electricity, SCL uses hydropower, which has low greenhouse gas emissions compared to fossil-fuel based power generation. The utility does purchase electricity from the market, so, as the Scorecard report explains, regional market emissions factors were used to calculate the Carbon Score shown on the report.

40 For example, the HERS rating quantifies the efficiency of a home using a standard new home as a benchmark.

The Post-Audit Survey also asked homeowners how they viewed the relative importance of the Carbon Score versus the Energy Score. Table 14 summarizes the results. Overall there was greater interest in the Energy Score, though half did not distinguish either the Carbon Score or the Energy Score as more important. Among homeowners who offered explanations for why one score was more important than the other, understanding of the complexities in interpreting the scores seemed quite high. Those who said the Energy Score was more important typically said that it had more direct relevance to their costs, their comfort, or what they could control. They also sometimes noted that the Energy Score was a surrogate for carbon emissions. Consistent with this finding, four of the interviewed auditors (see Chapter 3) noted that the predominance of hydropower in Seattle may lead particularly informed customers to question the methodology for calculating their Carbon Score. Several of those preferring the Energy Score mentioned concerns about the assumptions or interpretation of the Carbon Score, concerns that underscore the need to go beyond the simple diagnosis that "homeowners need education." For example:

“The carbon score is based on where the energy comes from but there is no guarantee that that is what my carbon footprint really is.”


Another said:

“I get tired of this carbon stuff. It is a term I do not really understand. It has never been put in terms that one can truly measure to me.”

Table 14: Responses on the relative importance of Energy Score versus Carbon Score (Post-Audit Survey; n=129)

Which score is more important?     Percentage of Respondents
Both are equally important         53%
The Energy Score                   37%
Don't know                          6%
Neither are important               3%
The Carbon Score                    2%

Toward understanding the potential role of Energy Scores in home sales, we also asked survey respondents about the use of the Energy Score in selling their present home or buying a new one. Most surveyed households were familiar with the scores (85%), and most of those said that they would like to see such a score when buying (95%) and that they would be willing to reveal the score of the home to potential buyers (82%). Respondents who said that their scores were below average were slightly less likely to say that they would show their house's score, but almost three-quarters still said that they would. It should be remembered that these are responses to a hypothetical question, as few respondents seemed poised to sell their homes; for example, it seems unlikely that somebody who wanted to sell a home would volunteer information that would make it harder to sell or lower its price.

Reactions to the EPS Scorecard—Post-Audit Interview Findings

In Post-Audit Interviews, few respondents (2 of 13) expressed concerns that the assumptions made in the EPS score calculations limited the score's relevance to their homes. One of these respondents reported that the auditor made an error in estimating the age and energy use of his water heater and refrigerator. The other homeowner stated that the EPS was less applicable to his home because the score assumes the home uses only electric energy, while his home uses gas. The tool converts natural gas and fuel oil use into kWh-equivalents, but some homeowners may misunderstand or object to this conversion.
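For readers unfamiliar with the conversion, the sketch below shows one common way natural gas and fuel oil use can be expressed in kWh-equivalents using standard site-energy content factors (about 29.3 kWh per therm and roughly 40.6 kWh per gallon of No. 2 fuel oil). The exact factors applied by the EPS tool are not documented in this report, so these values are illustrative.

    # Generic site-energy conversions for expressing other fuels in kWh-equivalents.
    # Illustrative only; the exact factors used by the EPS tool are not documented here.
    BTU_PER_KWH = 3412
    KWH_PER_THERM = 100_000 / BTU_PER_KWH        # ~29.3 kWh per therm of natural gas
    KWH_PER_GALLON_OIL = 138_500 / BTU_PER_KWH   # ~40.6 kWh per gallon of No. 2 fuel oil

    def total_kwh_equivalent(electricity_kwh=0.0, gas_therms=0.0, oil_gallons=0.0):
        """Combine annual electricity, natural gas, and fuel oil use into one kWh-equivalent figure."""
        return (electricity_kwh
                + gas_therms * KWH_PER_THERM
                + oil_gallons * KWH_PER_GALLON_OIL)

    # Example: 6,000 kWh of electricity plus 600 therms of gas is shown as roughly
    # 23,600 kWh-equivalent per year.
    print(round(total_kwh_equivalent(electricity_kwh=6000, gas_therms=600)))  # 23585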

Though the majority of respondents (9 of 13) reported that their EPS Energy Score would influence the upgrades or home repairs they might undertake, comments elaborating on their responses suggest that homeowners tended not to differentiate between the EPS Scorecard (with its Energy and Carbon Scores) and the larger audit report, as was also seen in the Post-Audit Survey responses. Explaining how their Energy Scores would influence their upgrade decisions, three respondents simply referred to specific upgrades they would undertake. One additional respondent specifically cited an element of the audit report—prioritization of upgrades—rather than the Scorecard. Only two homeowners made comments directly tying their upgrade plans to the Energy and Carbon Scores, in one case stating that they hoped to reduce the score and in the other stating that the score was an effective baseline. A third respondent later mentioned a desire to re-assess his Energy Score after completing upgrades in order to track progress.

4.6 Homeowner Reactions to the Energy Analysis Report

In addition to the EPS Scorecard, homeowners who completed an audit received an Energy Analysis Report, typically 15-18 pages long. Appendix B provides an example. The report included: asset-based estimates of energy use and fuel cost by major end use group, for the current house and after completion of all of the "standardized" recommended upgrades; a description of the performance of ten home energy elements (e.g., ducts, water heating); and a list of "standardized" recommended upgrades along with typical cost ranges and annual savings ranges. Auditors also often provided a separate, customized list of upgrade recommendations which they could make more specific to the homeowner's needs and use to call out any safety issues, prioritize upgrade recommendations, and/or to bundle upgrade recommendations into feasible phases. SCL encouraged these "auditor customized" recommendations.

Homeowner Reactions to the Energy Analysis Report—Post-Audit Survey Findings

In response to a closed-ended question, most survey respondents (70%) said that they thought that the Energy Analysis Report had "lots" of useful information. Some mentioned that they got more from the auditor visit, and one dissenter mentioned that he found the report too standardized for the character of his hundred-year-old home.41

Two-thirds of survey respondents said that they understood all of the recommendations they were given. Most of the rest said they understood most of the recommendations and just 5% said that they understood only some of them. Even so, when it came to actually trying to make changes, many households said that they wanted more specific information, as became clearer in the Retrofit Survey responses. So, while most respondents—not surprising for this group—said that they could readily understand what they read, getting sufficient understanding to feel ready to decide what to do was sometimes more challenging.

One homeowner said:

“The fundamental flaw … is that the recommendations are not tied to solid estimates to get the work done. The range is too wide and the information is not adequate for a contractor to do an estimate. Virtually every contractor I talked to would not do it based on the audit, but wanted to do their own walk through.”

Toward understanding not only potential improvements to audits and audit outputs such as the Energy Analysis Report, but also homeowner expectations about the audit more generally, we also asked homeowners what more, or what different, they might want from the overall audit process. There were many different types of suggestions:

41 As noted earlier in the report, many of the homes audited were quite old. Judging also from some auditor comments, respecting the age, nature, and character of the home was certainly important for some customers.


• Wanting more or better recommendations. Some respondents said they got only one recommendation, and that it was not always actionable. Overall, as deduced from various types of comments that surveyed homeowners made, homeowners seemed to be drawn more by interesting, readily achievable, and lower-cost suggestions than by a desire to increase energy efficiency per se. In short, they were seeking to solve problems that they perceived (or were made to perceive), and sometimes perhaps were simply satisfied to check something off the to-do list. Some expressed disappointment that the recommended upgrades would require such large investments, while others said that they appreciated receiving a wide range of options in terms of cost and complexity.

• Wanting more precise information about the costs and benefits of completing upgrades, and how to prioritize them.42 Costs and benefits are not always financial. A few homeowners also mentioned downsides. One homeowner, for example, said that she wished she had known that the heat pump that she had installed was going to be noisier than the equipment it replaced and that the energy-efficient clothes washer took an hour to complete a load, rather than twenty minutes for the previous machine.

• Wanting more specific and practical information on how to complete recommendations. For example, respondents mentioned wanting to know more about exactly what was needed for insulation, or wanted links to online resources, the name of a reputable local contractor, or ideas about where to get some of the materials (especially unusual ones like chimney balloons) that the auditor or audit report had suggested. The request for contractor recommendations also touches on issues of trust: a few respondents mentioned that they were wary of sales pushes or that they wanted more government information to back up the auditor recommendations. Several mentioned their efforts to find the most reputable contractors (e.g., through internet research or phone calls) and various frustrations with this search.

• Wanting more information about appliances, non-heating end uses, and the cost of running equipment. One homeowner said, "The audit didn't come back with recommendations that were useful. It didn't identify all of the electrical sources we could cut back on other than light bulbs, and our heat is oil." The SCL home energy audits were asset-based audits that intentionally focused on technical energy efficiency. They were not designed to focus on or even advise audit customers on energy use behaviors, though some auditors did discuss behavioral changes with clients. As presented later in this chapter, some respondents reported behavioral changes in energy use, but several said that they had hoped to get more thorough information on what to change about energy-use practices, in addition to information on technical energy efficiency.

• Wanting copies of the infrared photos or more detailed information about where the house leaked. The audit report allows photos to be inserted. Some auditors provided photos or other details in the audit report, but not all of them did. Such comments also underscore the importance of blower doors and IR photography in demonstrating energy inefficiencies, in motivating action, and (as discussed earlier in this chapter) in getting buy-in from household members to make upgrades.

42 Typically, the EPS Energy Analysis Report provides a single number for the estimated fuel use after upgrades and for the approximate annual savings in dollars. However, SCL requested that the EPS Energy Analysis Report provide ranges instead of a single number for estimated fuel costs after upgrades and for the approximate annual savings in dollars after upgrades. This range was approximately 30% more or less than the single figure calculated by EPS Auditor and sometimes led to a low-end estimate of zero for fuel costs and savings.

• Wanting more information about retrofit subsidies or other financial incentives. Basic incentive information was included in the reports, but the SCL Home Energy Audit Program was not designed to provide homeowners with assistance in finding specific incentive and financing options. Some expressed disappointment that more subsidies were not available, or at least, that they had not learned what these were from the auditor's visit. However, relatively few subsidies were available during the program period, so this concern may have less to do with the completeness of the information provided than with homeowner expectations that applicable financial incentives would have been available, and sometimes, the implication that they should have been available (i.e., that private energy efficiency investments should be subsidized).

• Wanting more follow-up in general. Overlapping the specific suggestions above, many mentioned wanting more "follow-up," sometimes stated quite generally (e.g., "Making sure there is some follow-up and support for people. We are dedicated to it and I see how the regular homeowner might not have the time" and "a follow-up from the auditor about a year later"). A few said that they did not get answers to specific issues that they had raised, or that they had not learned new things from the audit and its outputs. For example, recommendations to add insulation and to weatherize doors and windows, while legitimate, may add little to nothing beyond what a customer had already assumed could be done to increase the efficiency of their home, and can even make the efficacy of completing these recommendations less clear. A recent German study found that while home energy audits can encourage some homeowners to do particular upgrades, they can also dissuade homeowners from doing upgrades that they had already planned, for reasons such as low or uncertain projected payoff or finding that behavioral changes could provide adequate savings (Frondel and Vance 2011). The surveys were not designed to track this pre-audit vs. post-audit shift,43 but as noted elsewhere, some households said that they already planned to do upgrades, and a few pointed to surprises as to what they were not told to do (e.g., replace windows or add insulation).

The overall message of these suggestions is that homeowners want more detail about the retrofits and the need for them, less uncertainty in the upgrade costs and savings, more help in actually undertaking the retrofits (including help in finding contractors and financial incentives), and more information about operational aspects of the home as opposed to a purely asset-based assessment. As SCL continues to refine the program, it has been encouraging auditors to address some of these concerns, particularly the desire for more guidance in completing retrofits, by customizing the audit report and recommendations.

One of the most challenging concerns in the list above is reducing the uncertainty in savings and cost estimates. Savings and cost estimates could be presented as narrow ranges or even a point estimate, with the necessary caveats about uncertainty. But the flip side of providing narrow ranges on savings or cost estimates is the possibility of being misleading. That would likely come at a cost to both providers (the utility and auditors, who usually work hard at appearing to be trustworthy) and homeowners. Optimistic ranges for savings can lead the homeowner to poorer decisions, and optimistic ranges for costs are likely to be discovered once contractors are called in. The question from a technical standpoint is: can savings estimates be more accurate? If so, how, and under what conditions? Moving forward, SCL no longer requires that the EPS Energy Analysis Report provide ranges instead of a single figure for estimated fuel costs after upgrades and for the approximate annual savings in dollars after upgrades.

43 See, however, the interview results for a small subset of participating households interviewed both before and after the audit, presented below.
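As a concrete illustration of the range presentation described in footnote 42, the sketch below applies a symmetric plus-or-minus 30% band to a point estimate; the rounding increment and zero floor shown are assumptions for illustration rather than rules documented for EPS Auditor.

    # Illustrative sketch of presenting a savings point estimate as a range.
    # The +/-30% band, $10 rounding increment, and zero floor are assumptions for
    # illustration; they are not taken from the EPS Auditor implementation.
    def savings_range(point_estimate, band=0.30, round_to=10):
        def snap(dollars):
            return int(round(dollars / round_to) * round_to)
        low = max(0, point_estimate * (1 - band))
        high = point_estimate * (1 + band)
        return snap(low), snap(high)

    print(savings_range(200))  # (140, 260): a $200/yr estimate displayed as $140-$260
    print(savings_range(5))    # (0, 10): very small estimates can show a $0 lower bound

The wider the band, the less useful the range is for decision-making; the narrower the band, the more likely it is to be contradicted by contractor bids, which is the trade-off discussed above.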

Homeowner Reactions to the Energy Analysis Report—Post-Audit Interview Findings

Consistent with Pre-Audit Interview findings that many homeowners seeking audits were also motivated to pursue energy efficiency home improvements, the majority of the interviewed auditors cited specific recommendations to improve the efficiency of the home as the most valuable aspect of the audit for homeowners. Post-Audit Interviews with audit participants support the auditors' assertion. Of the 11 homeowners interviewed who listed specific components of the audit report that they valued, six either cited specific findings the report included or stated generally that detailed recommendations were the most valuable element of the audit report. Three additional respondents cited prioritized lists of recommended upgrades as most valuable—and as presented below, the auditor customized recommendations (including prioritized lists) seemed to increase the propensity of homeowners to follow through in completing upgrades.

4.7 Intentions, Attitudes, and Knowledge-Seeking Through the Audit Process

Post-Audit Survey results, coupled with a comparison of Pre- and Post-Audit Interview findings for thirteen respondents, together provide insight into the impact of the audit on participants' knowledge and attitudes about their homes' energy use and the types of upgrades they anticipate undertaking.

Attitudes Following the Audit—Post-Audit Survey Findings

Table 15 categorizes the reasons that homeowners gave, after having undertaken the audit (Post-Audit Survey), for why they had done the audit, recoded from open-ended responses.44

44 Responses could be recoded into more than one category.


Table 15: Reasons given for having undertaken audit (Post-Audit Survey; n=134)

Reason                        Example Response (Paraphrase)                                                                    Percentage of Respondents
Energy or Monetary Savings    I wanted information on things I could do to lower our energy use.                              26%
Efficiency                    I wanted to find out what needed to be done to improve my energy efficiency.                    23%
Curiosity                     We learned that there was a rebate available and it seemed like a good idea.                    23%
Improvement projects          We had made some improvements already to our home and we were not sure where to go next.        16%
Comfort concerns              I always felt that I had cold spots and I wanted to check those out.                            13%
High bills                    Our bills are ridiculous.                                                                        11%
Environment                   My wife and I have been working for several years to reduce our energy footprint.               10%

One interpretation that emerges subtly from this categorization is that these homeowners were often oriented to solving “problems” they experienced—or problems they could solve (as will become clearer in the discussion of what retrofits were completed), or were going to face, say, in upcoming home improvements. This orientation contrasts with more abstract motivations such as environment or even energy efficiency per se (though 23% mentioned energy efficiency as a motivation). Notable findings include:

• Saving energy or money was cited as a motivator more often than the more abstract concept of energy efficiency. Saving energy or money was the most common motivation cited by homeowners we surveyed, while others said they wanted to address high bills. In using the word "savings," the wording of survey respondents often differs from "energy efficiency" per se, which is how home energy audits are often promoted.45 Many homeowners did mention energy efficiency—in fact it is the second most common reason cited—but saving money and saving energy were more common.46 "Saving" and "efficiency" are not just different ways of saying the same thing. Energy efficiency naturally corresponds to an asset perspective, but it is also abstract, removed from day-to-day home operations and direct material benefits such as saving energy, saving money, or comfort.

45 For example, a U.S. DOE site for consumers explains home energy audits as "the first step to assess how much energy your home consumes and to evaluate what measures you can take to make your home more energy efficient" (http://www.energysavers.gov/your_home/energy_audits/index.cfm/mytopic=11160).
46 Since the responses were open-ended, it was often not possible to distinguish between "saving money" and "saving energy," so these responses were not distinguished.


• Curiosity was also a common motivator. The $95 audit seemed inexpensive enough that homeowners would pay for it to better understand how their house and energy equipment worked, whether they were new to the house, had been working on home improvements, had careers related to homes, energy, or environment, etc. Additionally, some households mentioned the CFLs that the auditors had installed—as stated on the SCL Home Energy Audit Program website, these are installed at no cost to the homeowner in the program.

• Many homeowners mentioned that they were already active in remodeling. Nearly three-fourths of homeowners surveyed listed energy upgrades that they had completed prior to undertaking the audit. Replacing windows was often among these, but also insulation, new furnaces, or even more unusual activity such as installing solar tubes or solar hot water. We did not seek out precise comparison data but our impression is that these homeowners may have been quite active in remodeling in general, as many of the Retrofit Survey respondents pointed out other changes that they had been making to their homes, in addition to completing energy upgrades.

• Comfort problems were explicitly named by 12%. Improving comfort was not very commonly stated in the Pre-Audit Interviews (24% of 33 respondents) or the Post-Audit Survey (see Table 15, above) as a major initial draw, though the interviewed auditors considered it an important motivation. Comfort was, however, the most noticeable benefit that homeowners cited after making recommended upgrades, as presented later in this chapter.

• Big problems, big bills. Some households said that they felt that their home energy bills were way too high, and may have not known where else to turn to figure out what might be wrong or what could be changed. In reading survey responses, it is difficult to reliably distinguish "big problems" in energy use from wanting to marginally reduce energy use, but from a technology perspective these may be quite different—one looking for operating faults or major overhauls, the other more focused on increases in efficiency or smaller changes.

• Environment per se was a limited draw. Though 45% of respondents to the Pre-Audit Interview rated their households as "very concerned" about reducing their carbon footprint, only 9% of homeowners responding to the Post-Audit Survey explicitly mentioned the environment, sustainability, or carbon footprints after the audit. Of course, some of the other answers ("saving energy") may be viewed as having an implicit environmental component, from the standpoint of the homeowner.

Audits as a Topic of Conversation and Further Research—Post-Audit Survey Findings

Even when other household members were not closely involved, energy audit results were a surprisingly popular topic of conversation. As shown in Table 16, nearly everyone (96%) surveyed in the Retrofit Survey said that they had discussed the Energy Score or upgrade recommendations with at least one other person, most commonly other household members, friends, contractors, or in a follow-up conversation with the auditor. One-third said that they had discussions with neighbors. Most (three out of four) said that they did not learn anything new or different from these discussions with other people. Those who did learn something new mentioned getting more information on how to do upgrades themselves, about retrofit costs, or about other issues in the home, such as moisture problems. Nearly half of homeowners indicated that these discussions encouraged them to do recommended upgrades (42%), while 40% said that the discussions neither encouraged nor discouraged them. Only 14% said that these discussions discouraged them from doing the recommended upgrades.

Table 16: After your energy audit, have you discussed your Energy Score or upgrade recommendations with any of the following (n=159)

Response options                         Percent stating
Other household members                  82%
Friends                                  65%
Contractors                              53%
Your energy auditor                      47%
Other family who don't live with you     44%
Work colleagues                          38%
Neighbors                                33%
Somebody else                             4%
Nobody                                    4%

Asked whether they did further follow-up research, about half (45%) said that they did. Perhaps surprisingly, only about a quarter of households said that they did follow-up research on the recommended upgrades using the internet, while others talked with contractors, went to stores, did their own calculations, or consulted magazines. Almost half (46%) of those who said that they sought further information said that they learned something new from this research, often about the details of the technology or upgrade process, or about how much the upgrade would cost. Half said that the research encouraged them, 11% said it discouraged them, 17% said that the research both encouraged and discouraged them, and 22% said the research neither encouraged nor discouraged them. A few reported identifying other possible upgrades in their research that the auditor did not mention and some respondents said that they had concluded that some of the recommendations would be difficult to implement or would not save as much energy as the audit had indicated. Issues homeowners reported identifying in their research include:

• Older people may not see energy upgrades as reasonable long-term financial investments. One homeowner said: "I learned about how effective other appliances may be in comparison to a hot water heater. If we were in our twenties and had a lifetime to recoup the cost, a new water heater might have been worth it."

• Some discovered surprising downsides. When investigating whether to pursue retrofits, some said that they found out about difficulties or expenses that were not apparent at first. One said, "I learned about the process of having to remove all insulation, rather than just supplementing it."

• Homeowners may be attracted to unconventional lower-cost options for increasing comfort. Some households seemed to want to know about, or to otherwise gravitate to, upgrades that were novel. One said, "There are a couple new products, foiled-backed stuff, that you can use to ease cooling in the summer. I also looked into attic ventilation, though the contractor did not recommend it." Novelty or cachet (such as solar), beauty, something to talk about with the neighbors, or other non-energy aspects may make some upgrades more appealing than others—and perhaps a surer bet from the standpoint of home improvement.

Comparing Intentions and Knowledge at Different Points in the Audit Process

Change in Intentions—Pre-Audit and Post-Audit Interview Findings

A comparison of the home improvement projects homeowners reported interest in undertaking in Pre-Audit versus Post-Audit Interviews indicates that the audits effectively called participants' attention to specific opportunities to upgrade their homes' efficiency and helped them prioritize those opportunities. In Pre-Audit Interviews, three contacts stated there were no projects they were interested in undertaking based on their current knowledge. In follow-up interviews, all of the contacts who had received their audit reports stated that they already had, "definitely will," or were "very likely" to complete at least one upgrade.47

While respondents most often expressed interest in installation of insulation, air sealing, and weather stripping in Pre-Audit Interviews, the number of contacts who reported they had completed or were likely to pursue these measures doubled in Post-Audit Interviews (Table 17). Auditor interview findings suggest that audits may draw homeowners’ attention to air sealing in particular (see Chapter 3). Five of the twelve interviewed auditors noted that homeowners typically have little awareness of the need for air sealing before the audit, although they may be interested in insulation. According to one of these contacts, “insulation doesn’t stop air infiltration; that surprises people, they don’t think of it that way.”

The number of contacts expressing interest in upgrading their heating and cooling system also increased in follow-up interviews, although two respondents who expressed interest in Pre-Audit Interviews did not mention heating system upgrades as a measure they were considering post-audit.

Table 17: Completed or intended upgrades cited more often after the audit (subjects completing both Pre-Audit and Post-Audit Interviews; n=12)

Project Type                                          Pre-Audit Total    Post-Audit Total(1)    Pre-Audit Only    Post-Audit Only
Install insulation, air sealing, weather stripping          6                   12                    0                  6
Install efficient heating/cooling system                    3                    5                    2                  4
Air duct sealing                                            0                    2                    0                  2
Other—Low cost                                              2                    2                    2                  2

(1) Number of contacts reporting they have completed each upgrade, are "very likely" to complete the upgrade, or "definitely will" complete the upgrade.

47 One contact reported receiving only his EPS Scorecard, not a comprehensive audit report. This contact is excluded from this analysis.


In addition to increasing the prominence of some measures in participants' planned upgrades, Post-Audit Interview findings suggest that the energy audit information decreased the priority homeowners gave to other planned efficiency improvements. For example, while four respondents expressed interest in installing new windows in Pre-Audit Interviews, none planned to replace windows in follow-up interviews. This reduced interest in window upgrades may reflect the influence of the auditors or the audit report. The majority of the interviewed auditors described window upgrades as a common mistake homeowners make in seeking to improve the efficiency of their homes. Auditor respondents reported encouraging homeowners to prioritize more cost-effective upgrades like insulation and air sealing over window replacement. Fewer homeowners also anticipated installing an efficient hot water system or upgrading appliances following the audit than had expressed interest in these measures in Pre-Audit Interviews.

Table 18: Completed or intended upgrades cited less often after the audit (subjects completing both Pre-Audit and Post-Audit Interviews; n=12)

Project Type                               Pre-Audit Total    Post-Audit Total(1)    Pre-Audit Only    Post-Audit Only
Install new windows                               4                   0                    4                  0
Install efficient hot water system                3                   2                    2                  1
Replace aging/inefficient appliances              1                   0                    1                  0

(1) Number of contacts reporting they have completed each upgrade, are "very likely" to complete the upgrade, or "definitely will" complete the upgrade.

Differences in Intentions—Post-Audit Survey Findings

Consistent with interview findings, Post-Audit Survey results also suggest that homeowners changed their intentions as to what upgrades they planned to complete as a result of the audit. Sixty percent of Post-Audit Survey respondents had plans for specific energy upgrades in mind before having the audit, and many of those (57%) reported changing their plans as a result of the audit. In addition, 64% of the respondents who did not have specific upgrade plans prior to the audit reported that they did have plans following the audit.

The surveyed audit participants primarily reported changing their plans because the audit informed them of energy savings opportunities of which they were not previously aware (33%) and because the audit helped them prioritize retrofit opportunities (28%). Other reasons respondents cited for changing plans include information the audit provided about the condition of their house (12%), that the problems the audit identified were different from what the homeowner had expected (10%), and information the audit provided about the cost effectiveness of various upgrades (9%).

Knowledge Gained from Audit—Pre-Audit and Post-Audit Interview Findings

More than half of the homeowners who completed both Pre- and Post-Audit Interviews (7 of 12) indicated that the audit had caused them to change how they think or feel about energy use in their homes. Respondents mainly described these changes in terms of surprise over unexpected results indicating that the sources of problems or opportunities for solutions were different than they thought (5 respondents). One homeowner described the audit process as discouraging, saying "[It gave me a] better sense of how weak things actually are; quantifying the degree of badness in a way it was nebulous before." However, another homeowner gained a more positive attitude toward their home following the audit: "Yeah, it oddly made me feel better. I was really concerned that the walls were so drafty. But after reading the report, it became clear that we had some simple fixes."

Post-Audit Interview findings indicate that the audits were effective in providing participants with new information about their homes, and particularly in pointing out specific issues to address. More than 80% of homeowners interviewed (10 of 12) indicated that they learned something new about their home from the audit, and the two remaining homeowners indicated that the results confirmed what they already knew. The interview respondents who reported learning from the audit primarily (in 9 of 10 cases) discussed the knowledge the audit provided in terms of specific action items. Typical comments include:

“I never realized until I looked at the thermal image that there is so much air leaking between the basement and the first floor.”

“The biggest and simplest fix will be to caulk around the upstairs window frames. [It was] a surprise to see how much they leak.”

Knowledge Gained from Audit—Post-Audit and Retrofit Surveys

From three months to one year after the audit was completed, we fielded a Retrofit Survey in which we asked a series of specific questions about how much homeowners felt that they had learned from the audit and the report.48 Homeowner responses to this series of questions reflected their longer-term experiences of working with and thinking about the audit results, and for those who actually pursued or completed upgrades, to what degree the audit facilitated that process. As summarized in Table 19, over three-quarters of respondents said that they felt that they had learned "a lot" about the current condition and energy performance of the home. When it came to questions of actually making changes—what to do, how, and expected costs and benefits—respondents still tended to be positive, but were most likely to say that they learned only "some" about these issues.

Sixty percent said that they learned "a lot" from the audit, and 40% thus indicated that they had learned less than "a lot." This provides an interesting comparison to homeowner responses as to their overall satisfaction with the audit (Post-Audit Survey), where most said that they felt "very satisfied" that the auditor had done a complete review of their home (70%) and that the auditor had clearly communicated findings (75%). As noted earlier, some respondents said that they wanted more specific information about the upgrades and more explicit instructions on how to accomplish them.

Nearly two-thirds said that they had learned less than "a lot" about the amount that it would cost to improve their home's energy performance. A quarter said that they had learned "not much" or "nothing" about these costs. While the Energy Analysis Report gives estimated costs for recommended upgrades, respondents noted reservations about these estimates: some noted that the cost ranges were wide, difficult to interpret because they were expressed per unit area rather than as a total cost for the home, or substantially lower than the contractor estimates they received.

48 Some responses to this survey have already been integrated above, where appropriate.

Table 19: How much Retrofit Survey respondents reported learning about specific aspects of their home’s energy efficiency and how to improve it (n=159) 49

How much did you learn about …:                                                                                   A lot    Some    Not Much    Nothing
The current condition and energy performance of your home?                                                         77%      19%       4%          0%
The energy upgrade options that are best for your home?                                                            60%      32%       6%          2%
The amount it would cost to improve your home's energy performance?                                                36%      38%      14%         10%
The amount of money you could save on energy bills by improving your home's energy performance?                    33%      52%       9%          6%
The resources that are available to help you complete upgrades, such as contractors, loans, or other experts?      18%      47%      22%         12%

Half indicated that they had learned “some” about the amount of money that they would save on energy bills from improving their home’s energy performance. Only one-third said that they had learned “a lot” and 15% said that they had learned “not much” or “nothing.” Savings estimates in the audit reports examined were sometimes very wide, in some cases even encompassing $0 savings.

Responses indicated that respondents felt they had learned the least about resources that are available to help complete upgrades. One-fifth said that they had learned a lot, while a third said that they had learned “not much” or “nothing.” Elsewhere in the survey, respondents mentioned wanting more information about financial incentives for energy efficiency upgrades and references to recommended contractors.

4.8 Retrofit Recommendations

While some retrofit recommendations may have been communicated verbally, our research was only able to capture upgrade recommendations communicated in the energy audit report, for which the records were made available to the research team. As described in Chapter 1.3, upgrade recommendations were provided in two portions of the Energy Audit Report: in the "Summary of Recommended Upgrades" section, "standardized" recommendations were provided that were selected by the auditor from a list in the EPS tool based on the audit protocol but with some auditor discretion, and which included model-generated cost and savings estimates; and in the "General Notes" section of the report, "auditor customized" recommendations were sometimes provided by the auditor, and were encouraged by SCL.

49 "Don't know" responses are not shown in the table. For this reason and because of rounding, the percentages in the table do not necessarily add up to 100%.

In practice, households received anywhere from zero to 12 “standardized” recommendations, most commonly five.50 A few (2%, 22 audits) received no recommendations on this page. At least some homeowners who said that they did not receive any recommendations commented negatively on this fact—the message “your house is in good shape” without further interesting recommendations may not be a satisfying finding for some homeowners who invested time and money in the audit and are actively looking to improve. High-quality behavioral recommendations (as well as “deep retrofit” technical recommendations, which are already part of the audits) may be particularly useful in cases where the most feasible envelope and system upgrades have already been completed. For each category for which a recommendation was made, the report provided a page or more of details. These detail pages included a summary of current conditions, more description of the recommended upgrade(s), and sometimes, deep retrofit options. In some cases, low-cost or no-cost options for energy conservation were also described.

The “auditor customized” recommendations tended to be broad in scope, often including recommendations to address moisture, air quality, pest, and other non-energy issues. The auditor customized recommendations were sometimes quite detailed and personalized, with explanations of not only what to do, but why. For example, one auditor added:

“This Classic [neighborhood-specific] home has wonderful architectural details and leaded glass top lights in the front windows. Overall priorities:

1 The roof is in danger of rotting out from moisture condensing on it in the winter. Attic ventilation needs to be increased in combination with controlling stack effect air movement from the basement. Damp room and crawl area are main moisture source. This can be combined with air sealing at attic hatch and wall top plates in attic (temporarily move insulation) see details in report.

2 General air sealing—weather strip windows and doors, seal around frames and baseboards.

3 Insulate basement rim joist area and perimeter walls. Install tight clean plastic vapor barrier on dirt crawl area. Install small efficient exhaust fan in moist room in basement with 24 hr operation in winter at low (20-40 cfm) speed. Air seal all pipe and wire penetrations and chase around ducts and cat door in basement ceiling.

4 Increase insulation on SPA if possible. It is a cedar spa, which are difficult to insulate. Insulated top would help, better maybe to keep cold water in it and use tankless heater for when in use?

5 Install low flow shower heads and insulate all exposed hot water pipes.

6 After moisture problem solved—insulate attic to R49

7 Blow in wall insulation in areas still un-insulated.

8 Replace largest window sash with double pane low e replacement sash and side tack kits. Stained glass may have a simple storm window panel over it."

50 Of the participant households, 22 received no recommendations on this page (1.6% of audited households). Some households received multiple recommendations in the "Appliance" category, and these are counted separately here (e.g., "upgrade refrigerator and dishwasher" counts as two recommendations).

While not all reports contained this level of detail in auditor comments, the example shows the complexity of the assessment, recommendations, and guidance that some homeowners received. The remaining analysis in this section pertains only to the "standardized" recommendations; the "auditor customized" recommendations are assessed in more detail in a subsequent section of this chapter.

Table 20 shows the percentage of audit recipients during the study period (not just survey respondents) who received a “standardized” recommendation by recommendation type, for each of 16 categories. The most common recommendation was air sealing (a category that sometimes includes adding mechanical ventilation), advised to nearly 90% of audit recipients. The next most common recommendation was attic insulation, which was recommended to seven out of every ten audit recipients. Over half of audited households also received recommendations to insulate walls or install a high efficiency or tankless water heater. Most also received a recommendation to replace at least one appliance, whether dishwasher, refrigerator, or washing machine. One out of every four received a recommendation to replace windows, though often not all windows. “Replacing all light bulbs” and “replacing the cooling system” were rarely recommended (4% and 6% respectively).


Table 20: Percentage of audited households receiving “standardized” recommendations, by category of recommendation (n=1,355) 51

Recommendation Type     % of Audit Recipients Receiving Recommendation     Examples of Specific Recommendations
Air sealing                     89%     Seal air leaks, install heat recovery ventilation, install mechanical ventilation
Cooling System                   6%     Install a heat pump; upgrade to Energy Star model
Heating System                  40%     Install more efficient heating system (Energy Star heat pump, condensing gas furnace, ductless mini-split heat pump)
Seal Ducts                      45%     Seal all seams and joists on ductwork or plenum, replace panned floor joists with ductwork
Insulate Ducts                  29%     Wrap ducts with fiberglass insulation
Insulate Walls                  61%     Dense pack wall cavity, fully insulate attic knee walls, add rigid foam when residing
Insulate Attic                  71%     Insulate attic to R-38 or R-49, dense pack vaulted ceiling space, add foam to deck exterior or roof/deck cavity
Insulate Floor                  39%     Fully insulate floor joist cavity, insulate basement/crawlspace to R-15
Dishwasher                      30%     Replace with high-efficiency
Refrigerator                    41%     Replace with Energy Star model
Washing Machine                 36%     Replace with Energy Star model
Lights                           4%     Replace all light bulbs with CFLs
Water Heater                    54%     Install high efficiency or tankless water heater
Windows                         37%     Install storm windows, high-efficiency windows
Solar Panel                     16%
Solar Water Heater              18%

51 This table summarizes “standardized” recommendations for the 1,355 SCL audits completed during the study period. Survey participants are a subset of this group.


4.9 Decision-Making on Energy Upgrades and Other Changes

This section summarizes the retrofits that homeowners reported having already completed at the time of the surveys, which other energy upgrade recommendations they planned to do in the future, which upgrades they did not plan to do, and the reasons for these decisions. This section also reports on changes in bills, comfort, and other home conditions that homeowners perceived had resulted from the retrofits completed and presents data on upgrade expenditures, financing, and the use of contractors versus do-it-yourself for the completed upgrades. Findings in this chapter are from Post-Audit Interviews, Post-Audit Surveys, and Retrofit Surveys, and are labeled accordingly.

Prioritization Factors—Post-Audit Interview Findings

Post-Audit Interview findings suggest that the cost of upgrades and the ease of completing them had an impact on the priority contacts gave to audit recommendations. Respondents, after completing upgrades or when asked about upgrades they were thinking about completing, primarily cited the cost or cost-effectiveness of individual upgrades as a consideration influencing their decision to pursue certain upgrades rather than others (Table 21). Homeowners also reported considering the ease of completing upgrades and whether the project would require them to hire a contractor. Describing these considerations, respondents made comments that suggest they viewed the process of selecting a contractor as time-consuming. For example, one homeowner stated that, while she could install other types of insulation herself, "insulating walls, will have to get a bid, and [we] will do that later." Similarly, another respondent stated that the upgrades she was most likely to complete were "easy, do-it-yourself, low-cost fixes," while lower-priority options "require money, time to find contractors, etc."

Comments related to consideration of the remaining expected useful life of appliances or equipment suggest that the interviewed homeowners may be more likely to replace equipment they see as older and likely to fail relatively soon. One respondent stated that replacing her water heater was “a top priority” because “the auditor said it would die within a year,” while another stated that she “can’t see replacing” a washing machine that is a year and a half old. Non-energy concerns that respondents cited include the impact of air sealing and air circulation on moisture within the home and the homeowner’s allergies.

Table 21: Considerations in priority of upgrades (Post-Audit Interviews; n=12)

Consideration                                      Count
Cost or cost effectiveness                           10
Easy/can do themselves                                4
Expected useful life of appliance or equipment        2
Non-energy concerns                                   2
Provide greatest energy savings                       1
Plan to make all recommended upgrades                 1


While homeowners indicated that they may prioritize some recommended upgrades over others, they also noted a few recommendations that they were unlikely to complete. These included upgrades that auditors had advised the homeowner would not be cost-effective (window upgrades) or that drew on costly emerging technologies (LED lighting).

Upgrade Recommendations Completed—Survey Findings

We asked survey respondents which of the recommendations they received they had already completed, collecting responses as open-ended text. A few cautions preface the description of retrofit activity. First, the surveys cover only a limited period of time after the audit.52 The audit may motivate future retrofits, though all we could (and did) do was ask participants about such future plans. Second, home improvement is complicated, and there was sometimes ambiguity in the survey results about precisely what recommendations were completed (e.g., how many windows were replaced) and how they related to other sorts of home improvement activity.53 Sometimes respondents reported having completed a recommendation that did not show up in the written audit report that they received – a situation that we discuss briefly below.

As a gross indicator of retrofit activity, 57% of survey respondents reported completing at least one recommendation, while 49% reported completing at least one non-sealing recommendation. So overall, homeowners who undertook audits seemed to be fairly responsive to the recommendations they received, in that a moderate majority of respondents surveyed said they did at least something.

52 Homeowners were asked about retrofits completed in both the Post-Audit Survey and the Retrofit Survey. While the Post-Audit Survey sometimes took place relatively soon after the audit, we took the opportunity to ask some questions about retrofit activity at that point, in case the household could not be contacted again. The Retrofit Survey asked more detailed questions about retrofits, and was fielded to respondents to the Post-Audit Survey, as well as other households. For reporting on recommendation activity, depending on the situation, we sometimes define the denominator as all surveyed households, sometimes as the Retrofit Survey respondents plus a small number of Post-Audit Survey respondents who reported retrofits in the Post-Audit Survey but were not resurveyed in the Retrofit Survey, and sometimes only as Retrofit Survey respondents (e.g., when it concerns only questions asked in the Retrofit Survey). This is why counts sometimes seem at odds.
53 For example, a respondent whose only reported energy upgrade was "eliminating air infiltration in the basement" said that they spent $90,000 on this upgrade. Further investigation suggests that they converted their basement into a living space, likely involving a variety of improvement activities beyond air sealing.


Table 22 gives the number of recommendations that survey respondents reported having completed, at two different levels: all recommendations (left) and recommendations other than air sealing (rightmost column).54 Many of the recommendations reported as undertaken, however, were quite small. Data on expenditures, below, may be a better indicator of degree of change—though savings certainly do not necessarily parallel expenditures.

54 These are “recommendations” as reported by the respondent, not necessarily present in the written record, as discussed in the previous paragraph.


Table 22: Number of recommendations that survey respondents reported having completed (n=235) 55

Number of Recommendations Completed         % of Survey Respondents     % of Survey Respondents, for Recommendations Other than Sealing
0                                                     43%                                 51%
At least one                                          57%                                 49%
1                                                     17%                                 24%
2                                                     14%                                 13%
3                                                     11%                                  7%
4                                                      6%                                  3%
5                                                      5%                                  1%
6 or more                                              3%                                  2%
Total Reported Completed Recommendations              362                                 236

Table 23 (below) summarizes the retrofits completed by category of recommendation. The first numerical column reports the percentage of surveyed occupants who received a written recommendation for a given upgrade, according to our records. The next column reports the percentage of all respondents receiving each type of written recommendation who reported completing it (i.e., conditioned on presence in the audit recommendations). The rightmost column gives the percentage of respondents who reported completing a given upgrade—whether or not our records show that it had been recommended. Clearly, at the time of the survey, most recommendations remained uncompleted. Though many reported completing some air sealing (34% of those for whom it was recommended), just 6% of the recommendations to seal ducts were completed. The recommendations that respondents were most likely to complete were floor insulation (19%), ceiling/attic insulation (16%), upgrading the heating system (12%), and though not often recommended, upgrading windows (12%) and installing a high-efficiency water heater (15%).
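Because the three columns of Table 23 use different denominators, a small worked example may help. The counts below are approximate reconstructions from the general air sealing row (n=211), not figures reported separately in the survey data.

    # Worked example of the three Table 23 columns, using approximate counts
    # back-calculated from the general air sealing row (n=211). These counts are
    # illustrative reconstructions, not separately reported survey figures.
    n_surveyed = 211
    n_received = 194              # ~92% of surveyed households received the recommendation
    n_completed_of_received = 66  # ~34% of those receiving it reported completing it
    n_completed_any = 74          # ~35% of all surveyed reported some air sealing, including
                                  # a few with no written recommendation on record

    pct_received = n_received / n_surveyed                                # column 1
    pct_completed_given_received = n_completed_of_received / n_received  # column 2
    pct_completed_overall = n_completed_any / n_surveyed                 # column 3

    print(f"{pct_received:.0%}, {pct_completed_given_received:.0%}, {pct_completed_overall:.0%}")
    # -> 92%, 34%, 35%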

Recall that this research was not intended to estimate program impact. So while the activity reported seems modest, it should also be balanced with the question of what degree of activity should be expected over this often rather short period of time. We interpret this data with respect to what looked most compelling and achievable from homeowners’ perspectives, and why. A few observations:

• Air sealing was the most common upgrade recommendation completed. Air sealing was recommended in 92% of the surveyed homes, and about one third (34%; see Table 23) of all households who received this recommendation reported doing at least some air sealing. About a quarter of these reported doing only air sealing, and most of those respondents spent between $5 and $50 on this task. Among those who did some type of air sealing, these upgrades were typically performed by someone in the household (47%), though many used a contractor for all (32%) or some (21%) of this work. Generally, upgrades that respondents said they did themselves involved simple air sealing, such as caulking and weather-stripping.

55 Includes both Post-Audit Survey and Retrofit Survey respondents.

Table 23: Energy upgrade recommendations received and reported completed (Survey respondents only; n=211)

Category                                                      % Received Recommendation     % Completed (of those receiving)     % Reported Completing (of all surveyed)
General air sealing                                                      92%                            34%                                   35%
Seal ducts                                                               51%                             6%                                    4%
Insulate - Ducts                                                         33%                             6%                                    1%
Insulate - Floors                                                        44%                            19%                                   11%
Insulate - Ceiling/Attic                                                 74%                            16%                                   12%
Insulate - Walls                                                         69%                             7%                                    8%
Upgrade - Solar panels                                                   14%                             0%                                    0%
  Install photovoltaic panels                                            13%                             0%                                    0%
  Perform maintenance on your solar array                                 1%                             0%                                    0%
Upgrade - Water heater                                                   41%                             6%                                    3%
  Install a heat-pump water heater                                       14%                             0%                                    0%
  Install a high efficiency electric or gas water heater                  8%                            15%                                    1%
  Install a tankless water heater                                        18%                             7%                                    1%
Upgrade - Solar water heater                                             19%                             0%                                    0%
Upgrade - Windows                                                        30%                             6%                                    4%
  Install storm windows                                                  14%                             0%                                    0%
  Upgrade windows                                                        16%                            12%                                    4%
Upgrade - Heating system                                                 54%                            12%                                    6%
Upgrade - Appliances                                                     61%                             2%                                    2%
  Dishwasher                                                             30%                             0%                                    0%
  Refrigerator                                                           39%                             0%                                    0%
  Washing machine                                                        37%                             3%                                    1%
Upgrade - Cooling system                                                  7%                             0%                                    0%
  Install heat pump                                                       6%                             0%                                    0%
  Upgrade to Energy Star air conditioner                                  1%                             0%                                    0%

• Insulation was the next most common recommendation, though only 1 in 6 who received the recommendation had completed it by the time of the Retrofit Survey. Nearly eight out of ten surveyed households received a recommendation to add or replace insulation in their floors, ceiling, attic, or walls. A lower percentage of households undertook wall insulation (7% of households receiving the recommendation) compared to floor or ceiling/attic insulation (19% and 16% completion, respectively). Open-ended responses indicate that the expense and the disruption, as well as unforeseen technical complications, contributed to the fairly low completion rate. As presented below, insulation was often on the "to do later" list.

• Few respondents reported making the recommended appliance replacements. Though 60% of surveyed households received a recommendation to replace at least one appliance, many said that they preferred to wait until the appliance needed to be replaced. So the audit experience may encourage some energy efficiency investments that do not come to fruition until years after the audit. On the other hand, many households replaced appliances with energy efficient ones as a change they made outside of official recommendations. Such replacement may often be seen as a relatively easy, moderate-cost action with benefits other than energy savings.

As noted above, in our surveys, respondents often reported having completed an upgrade that was not clearly recommended in the written record. Most of these upgrades were quite minor, but sometimes they involved a major change such as replacing the heating system or installing insulation. There are a variety of plausible explanations. Some recommendations may have been made by the auditor only orally. Others might be explained by a homeowner's nuanced review of the audit report, such as replacing a heating system for performance or modest energy savings, even if the predicted savings were too low for an official recommendation in the audit report. Finally, and most interesting, these homeowners did not come into the audit process "blind," nor were the audits necessarily their only source of upgrade guidance. As discussed elsewhere, participating homeowners often already had upgrade plans, and consultation with contractors and others after the audit certainly influenced upgrade decision-making as well. So when asked what audit recommendations they had completed, survey respondents may not have always made clear distinctions as to the source of the recommendation.

Upgrade Decision-Making—Retrofit Survey Findings

When asked in the Retrofit Survey why they undertook the retrofits they did, the most common response was to save energy and resources (44%), as shown in Table 24. About a quarter of respondents also cited improving comfort (28%), being more efficient (25%), or reducing energy bills (23%). Some also said that they did the retrofits that they did because they were easy to do. Relatively few mentioned increasing or preserving the home's value as a reason for undertaking the retrofits they did.

What Further Retrofits are Planned?—Retrofit Survey Findings

Sixty percent (95 of 159) of Retrofit Survey respondents said they were planning to complete one or more of the recommended retrofits in the future, whether or not they had already completed some recommendations. Two-thirds of these homeowners mentioned insulation, which was frequently recommended but infrequently completed at the time of the Retrofit Survey, as one of the upgrade recommendations they were planning for the future. Respondents who had not completed any recommendations were somewhat more likely to say they planned to complete recommendations in the future (67%) than were respondents who had already completed at least one upgrade (52%). The most common reason given for deciding to put off doing the "still planned" retrofits was the expense (Table 25). Lack of time and other concerns—such as waiting for better weather, practical considerations of bundling with other home improvements, etc.—were also common reasons.


Table 24: Reasons homeowners gave for choosing the retrofits they completed (Retrofit Survey; n=109) 56

Reason Provided                                      % Stating
Save energy and resources                              44%
Improve home's comfort                                 28%
To be more efficient                                   25%
Reduce utility bills                                   23%
Easy to do                                             17%
Replace older equipment                                10%
Increase or preserve home value                         9%
Provide fresh air or improve indoor air quality         5%
Improve safety                                          4%

Table 25: Reasons homeowners gave for delaying planned retrofits (Retrofit Survey; n=95)

Reasons for waiting to do planned retrofits          % Stating
Too expensive                                        47%
No time/Household schedule                           27%
Need a contractor or somebody else to do it           7%
Not sure they’re worth it                             4%
Disruption                                            2%
Waiting for current equipment to break                2%
Other                                                31%
Don’t know                                            3%
Refused                                               3%

Recommended Upgrades That Households Decided Against—Retrofit Survey

In the Retrofit Survey, nearly four in ten respondents (38%) said that they had definitely decided against doing at least one recommended upgrade.57 The “won’t do” recommendations cited were varied, but generally were high-cost retrofits such as insulation, window replacement, or replacing the heating system or water heater. Table 26 summarizes the reasons people gave for deciding against particular retrofits. The most common reason for deciding against pursuing these retrofits was cost: 62% said that the retrofit was too expensive. Some respondents also noted that they were not convinced that the retrofit was worth the benefits—i.e., they would not reduce costs or increase comfort enough to warrant the cost and effort, or savings were too uncertain. Some said that they would not make the changes because of what they learned after consultation with a contractor. For example, one

56 Open-ended responses recoded by the interviewer. 57 In some cases, there may have been ambiguity about what these recommendations were. In particular, some homeowners received customized recommendation lists in addition to the more standardized recommendations in the Energy Analysis Report, and the reports also often included “Deep Retrofit” suggestions. For reasons of privacy, interviewers could not directly access the specific upgrade recommendations.

respondent stated, “we will not insulate the roof because it’s impossible,” and another reported, “we had a furnace guy come out, and he said we did not need a new furnace.” These direct challenges to the original recommendations were not commonly offered, but it was clear that contractors did not always have the same assessment as the auditor.

Table 26: Reasons given for deciding against particular recommendations (Retrofit Survey; n=61)

Reasons for deciding against particular recommendations    % Stating
Too expensive                                               62%
Not sure if they’re worth it                                41%
Won’t reduce energy costs                                   10%
Waiting for current equipment to break                       5%
Disruption                                                   2%
Other                                                       36%
Don’t know                                                   3%

Who Did the Work?—Retrofit Survey

Almost two-thirds of the households who said they had done some of the recommended upgrades used a contractor for at least some of the work, while just over one-half said that they had done at least some of the work themselves (see Table 27). The work homeowners reported completing themselves was most often local air sealing such as caulking and weather-stripping.

Table 27: Who performed the upgrades that were completed? (Retrofit Survey; n=82)

Who did the upgrades that have been completed?               % among those who had completed retrofits
Contractor                                                   47%
Someone in the household                                     37%
Some by a contractor, some by somebody in the household      16%

What Other Home Improvement Activities Went On?—Retrofit Survey

Many homeowners had been doing other sorts of home improvement over the period between the audit and the final Retrofit Survey. Asked whether they had undertaken retrofits or made changes to their home other than those recommended in the audit report since their audit, four out of every ten homeowners surveyed in the Retrofit Survey said that they had.58 Some were very minor and some substantial. Though not necessarily undertaken to save energy, these changes often had energy implications. Many had replaced one or more appliances, several mentioned remodeling kitchens or

58 A few mentioned behavioral changes when they discussed what recommendations they had completed (e.g., turning down the water heater tank temperature—possibly recommended by the auditor), but these were not prevalent and are not included in the current accounting.

bathrooms, and a few had made changes in energy sources, such as adding solar electric panels or switching over to gas heat. Others talked about improvements such as seismic retrofits or landscaping.

Most of these changes had been planned before the audit, respondents said, even as the audit influenced the remodeling decision. For example, some respondents mentioned that, as a consequence of the audit, they had looked for energy-efficient models when they replaced appliances. Others said that the audit had raised their awareness of how the house worked and the resources used within it. This could translate, for example, to installing water-efficient toilets, fixing water leaks, or electrical improvements. In some cases, such as adding wall insulation in combination with painting or adding siding, energy efficiency upgrades and other changes bundled naturally.

4.10 Costs and Financing

Effect of Financing and Incentives on Decision-Making—Pre- and Post-Audit Interviews

Because of the complexities of talking about financing, we used the interviews (both Pre- and Post-Audit) to drill down further on homeowner views on financing and incentives—generally before they had undertaken retrofits.

Incentives

Most respondents rated rebates and tax credits as being influential on the upgrade decisions they might make, but their comments revealed that this influence was often contingent on other factors, such as overall cost or non-energy benefits.

A majority of all Pre-Audit Interview respondents (19 of 33) were aware of an incentive or tax credit available for energy efficiency upgrades. While homeowners cited both rebates and tax credits as likely to influence their decision-making, interview findings suggest that rebates may be somewhat more likely than tax credits to affect upgrade decisions. Two-thirds of respondents (22 of 33) stated that a rebate would be “very likely” to affect which upgrades they pursue, while slightly less than half (16 of 33) said that a tax credit would be “very likely” to affect this decision.

Table 28: Rating of likelihood of incentive affecting upgrade decision (Pre-Audit Interview; n=33)

Incentive Type    Very likely    Somewhat likely    Not at all likely    Not Answered
Rebate            22             7                  1                    3
Tax Credit        16             11                 5                    1

Homeowners also noted that the influence of the rebate or incentive would depend on other factors. Contacts reported that they would consider overall cost, non-energy benefits, and which upgrades would result in the most energy savings in addition to the availability of incentives. Representative comments include:


“If there is more than one thing, we will look at the thing that causes the biggest energy loss, regardless of whether it's subsidized or not; beyond that the subsidies will make a difference if all others are equal.”

“Comes down to bang for the buck. If the rebate brings something that was out of line [not cost effective] into line, that makes a big difference.”

“I think [incentives] will tie into it. But we are going to be in this house for 20 years, so more of the decision process will be about creating the most comfortable and efficient house we can.”

Six of the 22 respondents who said the rebate would be “very likely” to affect their decision did not explicitly contextualize the influence of the rebate within a broader decision-making framework, instead making comments such as, “Rebates are an important variable. Free money is good.”

Financing

Pre- and Post-Audit Interview findings suggest that these audit participants do not see financing as an attractive option for completing efficiency improvements. Fewer than a third of all Pre-Audit Interview respondents (9 of 33) indicated that they would be willing to finance part or all of the projects they chose to complete following the audit. Nearly all of the remaining respondents (22) indicated that they would pay out of pocket. The information homeowners gained from the audit did not seem to affect these views. Of the twelve respondents with whom we completed both Pre- and Post-Audit Interviews, none had changed their mind about financing after receiving the audit: one homeowner indicated he or she would finance part of the upgrade, and the rest indicated they would pay out of pocket.

Post-Audit Interview respondents’ assessments of whether it would be “worth taking out a loan” to complete recommended upgrades are consistent with the lack of interest Pre-Audit Interview respondents showed in financing. Of the 23 homeowners who completed Post-Audit Interviews (including homeowners not interviewed Pre-Audit), a plurality (12) stated that it would not be worth taking out a loan to complete the upgrades, and the rest were divided between those reporting that it might be worth it or would be worth it (six and five respondents, respectively).

Table 29: Post-Audit Interview responses to whether it would be “worth” taking out a loan to complete upgrades (n=23)

Response                                               Count
No                                                     12
   General debt avoidance                               7
   Not worth it for these particular improvements       4
   Not specified                                        1
Maybe                                                   6
Yes                                                     5


Homeowners’ reasoning behind these responses revealed a general hesitance to borrow money, and sometimes the assessment that the value of energy efficiency upgrades did not warrant the costs of getting a loan. Of the 12 who stated that a loan would not be worth it, seven stated that they avoid taking on debt of any kind. Representative comments include:

“Not worth it. Avoid debt like the plague, would rather not take on debt of any kind.”

“No. If I had to take out a loan, I wouldn't do it, because I don't like to pay interest.”

On the other hand, four stated that they did not think a loan was worth it for efficiency improvements specifically:

“Not worth it. Can't see justifying paying the interest on something like this.”

“I don't think there is anything that is that big a deal that would require a loan. It's not worth it for us.”

“No. The furnace, if it had other problems, I would spend the cash or get a small loan, but generally no.”

Five people said it would be worth it to take out a loan. These homeowners generally provided little justification, but focused on not having the cash on-hand to complete the upgrades. One respondent mentioned non-energy benefits of the upgrades, and one mentioned the payback: “Worth it … It may not increase overall value of the house, but I will get the loan now while it’s cheap and rates are good. Eventually, the upgrades will pay for themselves.”

Of the six people who said “maybe,” two mentioned payback period as an important factor, and one mentioned that limited-time incentives might induce them to take out a loan. Two other respondents also mentioned the urgency or impact of the improvement as a possible reason for a loan.

How and How Much Homeowners Paid—Retrofit Survey

In the Retrofit Survey, we asked homeowners who said that they had completed some of the recommended upgrades about how much these upgrades cost, in total, and how the costs were paid for.

Most respondents (83%) said they used cash or savings, and 14% said they paid for the work with credit. Few respondents (6%) reported taking out a loan to pay for the energy upgrades they had completed.59 Those who did take out a loan said that they had spent at least $10,000, some much more, while some paid cash even for jobs over $10,000.

59 Some respondents used more than one mode of financing.


Table 30 shows total estimated expenditures for all upgrades performed, for the households who did at least some upgrades and who were willing to give us an estimate of their expenses. While some said that they spent as much as $90,000,60 half spent $1,500 or less.

Table 30: Total estimated expenditures for all energy upgrades performed for the households which completed some upgrades (Retrofit Survey; n=66)

Percentile    Expenditure
25%           $200
50%           $1,500
75%           $3,875

4.11 Reactions to Upgrades—Retrofit Survey

In the Retrofit Survey, we asked homeowners a number of questions about the results of their upgrades and the extent to which they were satisfied with them. Findings on the technical quality of the upgrades are discussed in the next chapter.

Half of the homeowners who completed upgrades said that they were “very satisfied” with the results. An additional 25% of households said that they were “somewhat satisfied.” A quarter of households who undertook retrofits were less than satisfied, though less than 10% said that they were “somewhat” or “very dissatisfied.” Among those who said they were neither satisfied nor dissatisfied (15%), respondents sometimes commented on the high cost of the work they had done (e.g., the insulation), the uncertain savings, and other downsides of the changes they made (e.g., lost storage space, or other unspecified things working less well).

Homeowner Perceptions of Energy Savings

Survey respondents most often (46%) reported that they noticed little reduction in their energy bills as a result of the recommended upgrades that they had completed. A relatively small proportion (10%) reported that their bills were reduced “a lot”. These results are summarized in Table 31. Normal variability in energy use, in occupancy patterns, in energy-using equipment, etc., may easily obscure changes in energy use due to minor or modest levels of retrofits. To give some insight, we analyzed the variability of the electricity and natural gas billing data we collected for these homeowners for the time period just prior to the audit—i.e., not related to any changes instigated by the audit itself. For the audited households for which we had electricity data, 40% showed changes of 10% or more in total electricity use from year to year; 38% showed year-on-year changes in natural gas use of more than 10%.61 One in every five survey respondents reported notable changes in occupancy patterns since

60 Probably an over-estimate that includes other upgrades, as noted elsewhere; the second highest estimate of total cost of upgrades already completed was $55,000. 61 Using the middle of the month preceding the month of the audit as a breakpoint, we calculated annual electricity expenditures for up to the three years prior, and compared year-on-year differences between the two pairs of successive years (n=699). The percentages given above are for two years prior to the audit compared to one year prior to the audit. During that period there was a 14% increase in SCL electricity rates. The calculation procedure was similar for natural gas, though we had these two years of consecutive natural gas billing data for only 47 households.

completing their retrofits, such as retirement, being laid off, new babies, illnesses, or increased travel. A few survey respondents expressed disappointment in the energy savings they experienced, but overall, many households still said that they expected long-term savings, even if they could not see them. Increased comfort, as discussed below, seemed to be the major palpable benefit.
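To illustrate the kind of year-on-year variability check described in footnote 61, the sketch below annualizes a home’s billing records around a pre-audit breakpoint and computes the change between the two preceding years. This is a simplified illustration only; the column names and the use of kWh (rather than billed dollars) are assumptions, not the study’s actual code or data.

```python
# Minimal sketch of the pre-audit variability check (illustrative, not the study's code).
import pandas as pd

def year_on_year_change(bills: pd.DataFrame, cutoff: pd.Timestamp):
    """Fractional change between the two 12-month periods ending at `cutoff`.

    `bills` is assumed to have a 'read_date' column and a 'kwh' column.
    Returns None if either year has no billing data.
    """
    one_year = pd.DateOffset(years=1)
    two_years = pd.DateOffset(years=2)
    recent = bills[(bills["read_date"] > cutoff - one_year)
                   & (bills["read_date"] <= cutoff)]["kwh"].sum()
    earlier = bills[(bills["read_date"] > cutoff - two_years)
                    & (bills["read_date"] <= cutoff - one_year)]["kwh"].sum()
    if recent == 0 or earlier == 0:
        return None  # incomplete billing history
    return (recent - earlier) / earlier

# Applied across homes, the share with abs(change) >= 0.10 gives the kind of
# "10% or more year-to-year" figure cited in the text.
```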

Table 31 also shows the median reported total expenditure on recommended upgrades for each level of perceived savings. The level of perceived savings generally increases with the median expenditures on retrofits. There is a great deal of variation in retrofit cost, however, among participants who perceived the same level of savings: for example, 25% of those who reported achieving “not much” savings said that they spent over $2,000 on recommended upgrades. Some of the highest-spending households reported that it was “too early to tell” whether their upgrades had saved energy.

Table 31: Homeowner perceptions of savings from recommended upgrades they undertook, and median total reported expenditures by reported level of savings (Retrofit Survey; n=109)

How much do you think the upgrades you did      Percentage of Respondents        Corresponding Median
reduced your energy bills each month?           Among Those Who Did Retrofits    Expenditure
Not much                                        46%                              $450
Some                                            30%                              $2,000
A lot                                           10%                              $3,250
Too early to tell                               11%                              $9,000
Don’t know                                       3%                              n/a

There were a few survey respondents who expressed clear disappointment with the retrofits that they did or pointed out unexpected difficulties. For example:

“I learned that I’m not saving as much with my new furnace as the auditor said I would. Also, very few people had good ideas on how to insulate a 1940s brick house that didn’t cost $10,000 and didn’t destroy the wallpaper.”

The comment points to potential downsides or difficulties of retrofits that energy efficiency recommendation processes may often overlook, focused as they are on financial costs and on financial and other general benefits. Particularly for older houses, common in the Seattle area, there may also be concerns about preserving the historic character of the home, as implied by several respondents.

Other Changes Resulting from Upgrades

More than cost savings, though, most respondents who had completed some recommended upgrades said that they felt more comfortable, sometimes in the entire house and sometimes in particular rooms.


These households often mentioned that temperatures were warmer and more consistent, that the heater cycled less frequently, and that there were fewer drafts. Even those who had done only local sealing and weather-stripping often said that comfort had improved. A smattering of co-benefits were also mentioned: less noise from the street, better air quality and less dust, better protection against rodent infestation, and less indoor moisture.

We also asked about downsides: many mentioned cost and some mentioned the time that was required to complete the retrofit. A couple mentioned more functional downsides of the upgrades they completed, such as it taking longer to get warm water (for two who had installed tankless water heaters) or that the attic was more difficult to access because of the increased insulation.

4.12 Auditor Impacts on Outcomes—Retrofit Survey Findings

The auditors providing SCL home energy audits are trained to follow a standard protocol, use a standard audit data input form, and generate recommendations and the audit report using the same tool. Yet auditors may differ substantially in how they approach the audit, interact with the customer, customize the report, and perhaps even in what advice they give. To determine how much difference the auditor makes in what homeowners actually do, we considered how auditors, operating within the same programmatic context and utilizing the same energy modeling and reporting tool, affect homeowner actions.

Besides the “standardized” recommendations provided as part of almost every Energy Analysis Report, many auditors also provided detailed notes and additional “auditor-customized” recommendations in the “General Notes” section of the report.62 In fact, this was a strongly encouraged component of the program. Some auditors prioritized these recommendations; others did not. Of the homeowners for whom Retrofit Survey data were collected, 75% received auditor-customized recommendations, and two-thirds of these (49% of all Retrofit Survey homeowners) also had them prioritized. Earlier work found that a prioritized list of upgrades helps homeowners actively approach efficiency upgrades for their home (Earth Advantage Institute and CSG 2009). The desirability of such lists was echoed by quite a few of the homeowners we surveyed and (as noted above) in the interviews as well. The standardized recommendations summary list was organized categorically and not prioritized.

We tested whether the presence of auditor-customized recommendations, above and beyond the standardized recommendations, increased the propensity of homeowners to do upgrades, and whether the auditor’s prioritization of these recommendations was associated with an additional increase. Homeowners who received customized recommendations were more likely to report doing at least one upgrade (35%) than those who did not (15%), and those who received customized recommendations were also more likely to do more than one upgrade. That is, though only a quarter of these households did not receive customized recommendations, those that did not were considerably less likely to act. There was little evidence of an

62 Additional details are provided in Chapter 1.3.

additional effect for prioritized lists of upgrade recommendations: 32% of surveyed homeowners who received prioritized customized recommendations reported completing at least one upgrade, while almost as many (29%) who received non-prioritized customized recommendations also completed at least one upgrade.
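The report compares these completion rates descriptively. As a rough indication of how such a difference could be checked for statistical significance, the sketch below runs a simple two-proportion comparison; the group sizes are assumptions for illustration (roughly 75% and 25% of the 159 Retrofit Survey respondents), and the report does not state that any such test was performed.

```python
# Hypothetical two-proportion comparison of upgrade completion with and without
# auditor-customized recommendations. Group sizes are illustrative assumptions.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled z statistic for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

n_customized, n_standard_only = 119, 40          # assumed split of 159 respondents
z = two_proportion_z(round(0.35 * n_customized), n_customized,
                     round(0.15 * n_standard_only), n_standard_only)
print(f"z = {z:.2f}")  # values beyond about +/-1.96 would be unusual by chance alone
```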

In short, the extra attention paid by the auditor in composing a note to the homeowner that laid out upgrade recommendations seemed positively associated with the homeowner’s decision to act. This may be a matter of the clarity that a prioritized list provides—as a checklist where homeowners can clearly see where to start as well as a roadmap of where to go, as opposed to or in addition to recommendations organized by category. Or it may be a matter of the enthusiasm of the auditor and the thoroughness of his or her assessment. Or both. But overall, the results speak against interpreting home energy audits as simply a technical exercise, even if the technical aspects of the delivery—the audit tool, the blower door test, etc.—and the scientific backing are essential. This working conclusion is underscored by homeowner comments about the auditors as well as what auditors said about themselves: the auditor’s demonstration of his or her knowledge and dedication likely builds homeowners’ personal trust, potentially increasing the propensity to upgrade or at least to think differently about energy.

4.13 Energy Use Behavior Changes—Post-Audit and Retrofit Survey

We asked homeowners what behavioral changes they made: in the Post-Audit Survey, we asked respondents about changes made in response to the audit itself, while in the Retrofit Survey, we asked only those who had completed upgrades about changes in response to upgrades they had completed.


Table 32 summarizes responses for both questions.63

As to behavior changes reported in response to the audit (Post-Audit Survey), 25% of those surveyed reported making some changes—almost as many as said they undertook some air sealing measures. While the audits were designed to focus on efficiency of the building shell and systems, the Energy Analysis Report also provided no-cost and low-cost strategies to save energy, such as closing the fireplace damper when the fireplace is not in use, putting bathroom ventilation fans on a timer, turning down the heat, or using natural ventilation for cooling. Additionally, and perhaps more important here, auditors sometimes gave suggestions that differed from the standard list concerning what members of the household could change about the way that they used energy—for example, to use portable heaters for zone heating rather than the central heater. Program sponsors did not encourage this type of advice.

63 From the standpoint of data quality, collecting reliable information about energy use behavior change is difficult, since, for example, shifts in habitual behavior can be subtle, changes may be under-reported, and there is sometimes conflation between behavior change and technological change (e.g., beginning to replace lamps with CFLs rather than standard incandescent lights). See Woods (2003) for further discussion on the quality of survey data on behavioral conservation.


Table 32: Reported changes in energy use practices after the audit or after upgrades

Changes in Energy Use                        Post-Audit Survey Respondents:    Retrofit Survey Respondents:
                                             Changes after audit (n=134)       Changes after upgrades (n=69)
Turned off lights more often                 2%                                17%
Turned off appliances/electronics            1%                                 7%
Unplugged items                              -                                  3%
Turned down heat                             14%                                9%
Turned off heat                              1%                                 2%
Washed clothes in cold water                 1%                                 1%
Changed AC settings                          2%                                 1%
Used different heating equipment             -                                  2%
Other                                        8%                                19%
Any change                                   25%                               41%
Number of respondents reporting changes      33                                28

The most common behavioral change reported in reaction to the audit was turning down the heat. Perhaps this was a change recommended in discussions with the auditor, or perhaps it was rather a reaction to having done the audit in general, which could have (as some comments implied, see below) made some of the no-cost or low-cost conservation tips seem more compelling. Some households reported closing interior doors to avoid losing heat to unused, unheated areas. Some comments underscore that the auditor’s visit changed homeowners’ ideas about which energy-saving actions were worthwhile and which were not. For example, one respondent said that they had stopped unplugging things, commenting that they “found that they were only reducing comfort” by this action.

This isolated comment may be a clue to a more widespread mismatch between a top-down and a householder-centered view: run-of-the-mill energy tip lists often include actions that may have imperceptible impact on an individual home’s energy use. Advice about how to save energy by changing behavior could take better account of effort versus payoff, mimicking the cost-effectiveness logic used for energy efficiency upgrades, rather than the more typical “everything counts” model of most energy-saving tips. This finding echoes some survey respondent comments reported above, namely, that some of the respondents would have liked more information related to usage choices, such as how much it costs to run a particular appliance. So one way home energy audits might better support these homeowners is to provide more sophisticated information about how practices such as specific appliance usage patterns affect energy use.
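The appliance-cost question respondents raised can be answered with simple arithmetic, as in the sketch below. The wattage, hours of use, and electricity rate are illustrative assumptions, not SCL tariff values.

```python
# Back-of-envelope appliance operating cost (illustrative assumptions only).
def annual_operating_cost(watts, hours_per_day, dollars_per_kwh=0.10):
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * dollars_per_kwh

# Example: an older refrigerator averaging ~150 W around the clock
print(f"${annual_operating_cost(150, 24):.0f} per year")  # ~$131 at $0.10/kWh
```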

In the Retrofit Survey, homeowners who had completed some recommendations were asked whether they had changed how they used energy in response to the changes in their home. About four in every ten (41%) said that they had. The most common response, curiously, was turning off lights more often—a highly generic energy conservation action. Several said that they had become more attentive in general, and a few mentioned turning the heat down or off. One of the things that researchers were looking for in this

question was evidence of “takeback,” i.e., that people who had made upgrades to save energy or increase comfort would take back the energy savings by increasing their use of energy services, such as using more heat once their heating system was more efficient. Only one respondent mentioned something that could be directly interpreted as takeback: that since they had installed CFLs in the basement, they tended to keep those lights on more than previously. This very low prevalence does not prove that there was virtually no takeback, but rather suggests that, if there was, respondents were unlikely to directly recall or offer it when asked in a survey.

As to why respondents made the behavioral changes they reported, in addition to citing the auditor’s suggestions or the “No cost/low cost” strategies section of the audit report, many mentioned that the audit experience itself changed their awareness:

“If I am going to pay to repair aspects about my home, then it means that I am more committed and aware.”

“The whole audit process caused me to think about my energy use and expand my environmental awareness.”

“Now we have more information on the upside of these behavioral changes, and viable alternatives.”

These comments suggest that home energy audits can have deep effects on how some occupants think about energy use in their home, even in a group that overall is relatively attuned to energy efficiency. They may be more attentive to energy efficiency in subsequent changes that they make to the home for any reason, or in things that they buy, or in how they use energy in the home and their expectations about the balance between behavior and device energy efficiency.

4.14 Summary of Findings

Our analysis focused on where the homeowner perspective seems to differ from that usually assumed in residential energy efficiency program design,64 as well as on confirming what parts of program theory and audit process already fit well with most homeowners’ expectations. The underlying question was not so much ‘how do we get homeowners to do more retrofits?’ as how to understand the rationale, practices, and experiences of homeowner behavior and decision-making.

Audit participants seemed to believe in the potential benefits of energy efficiency, particularly in reducing their energy costs, and they were interested in taking steps to increase the efficiency of their homes. These motivated participants were generally satisfied with the type of comprehensive audits SCL offers, and most participants seemed to value the detailed information about specific upgrades that these audits provided. Audit participants used this detailed information to determine which upgrades they would pursue and to prioritize their energy improvements.

64 See, for example, the literature review by Sanquist et al. (2010), which points to some of the standard assumptions.


Nonetheless, while the audits seemed effective in providing participants with specific information about energy improvements, numerous issues (upfront costs, low energy prices, etc.) limited the degree to which audit participants acted to undertake certain improvements.

While they valued the specific recommendations that the SCL home energy audit provided, the homeowners who participated in this home energy audit program might not be the most receptive audience for the types of comparison the EPS or similar scores allow. The majority of homeowners we surveyed and interviewed were not planning on selling their home in the near term and were seemingly more interested in making home improvements, wanting specific upgrade advice toward improving their own living conditions rather than toward directly affecting the market value of their home. Interviews indicate that people who are motivated to make changes (or have already decided to do so) but just want to know what to do were less likely to be motivated by the EPS score itself than those seeking to compare their home to others. SCL anticipated as much, and it is for this reason that they designed a program that tested homeowners’ reception of an asset rating alongside an audit report with customized recommendations.

Key Findings

Who Participated in the Seattle Home Energy Audit Program, and Why?

• Homeowners participated in the program for a variety of reasons, but saving energy and money, and improving comfort, were bigger primary rationales than environmental concerns. The relatively low price of the audit ($95) seemed to have also attracted homeowners who were curious about how their home worked.

• The program was lightly marketed, and less than one percent of eligible households actually undertook the audit during the period we studied. Those who did may have been unusually interested in energy efficiency and possibly particularly active in home improvement activity.

• Demographic data collected indicate that overall, participants were more educated and wealthier than is typical of Seattle single-family households. However, there was still appreciable variability in demographic circumstances, including lower-income households and participants without college degrees, as well as in life stage, household size, etc.

• Half of the households surveyed said that they had been living in their homes for more than ten years. Also, few homeowners (7%) said that they were planning on selling their home within the next three years.

What was the Overall Reaction to the Audits?

• A high majority of the homeowners we talked to said that they found the audit thorough (93%) and that the auditor communicated well (92%), and many homeowners seemed quite impressed with the audit itself. When it came to considering what upgrades to actually make and how, respondents were somewhat less enthusiastic, pointing to difficulties in finding contractors, in knowing how much savings they could really expect, in understanding the details, and, overall, in undertaking upgrades that were expensive with uncertain benefits.


• The individual auditor plays an important role in shaping the homeowner’s reaction to the audit.

• Homeowners pointed to blower door tests and IR photography as particularly effective in making the audit convincing, both in illustrating the value of making changes and in indicating exactly which locations needed work. In particular, these techniques seemed to motivate high levels of weatherization and local air sealing, and to lead some homeowners to think of energy use in their homes in a different, more sophisticated, way.

• A few homeowners objected to an asset-based rating approach because it used static assumptions for energy consumption—“why treat me like an average?”

Scores and Labels

• Receiving an energy or carbon rating for their home appeared to motivate or draw only a modest fraction of all participants to undertake their audit. However, less than half of the interview respondents said that they knew they would receive such a rating, and the rating may not have played a strong role in the marketing of these audits, though scoring is mentioned on the SCL Home Energy Audit Program offer webpage.65

• Despite the fact that receiving the score may not have been a major draw, many homeowners seemed interested in the EPS Energy Score, and to a lesser extent, the Carbon Score.

• Most participants surveyed (95%) said that they would like to see an energy rating when buying a home, and almost as many (82%) said that they would be willing to share theirs—often even when the score was lower than average. While this enthusiasm is promising, the responses should be interpreted as hypothetical, since there is no context established (e.g., mandatory versus voluntary disclosure) and few of the homeowners surveyed said that they expected to put their home on the market soon.

Reports and Recommendations

• Homeowners made several suggestions for the report as it was specifically constructed per the SCL Home Energy Program design,66 including the recommendations for upgrades:
  o More recommendations, especially achievable, low-cost suggestions.
  o More precise information about costs and benefits associated with upgrades.
  o More information and practical support on how to complete recommended upgrades.
  o Greater level of detail in describing a home’s issues, such as inclusion of infrared photos67 and other testing results.

65 http://www.seattle.gov/light/conserve/hea/ 66 These suggestions ultimately reflect upon the program implementation and on the efforts of the auditors, highlighting the desire of many of the homeowners surveyed for even more specific and detailed guidance on ways to go about improving their home. SCL worked with the auditors to increase the level of detail and customization of the report over the course of the program.


  o Information on available upgrade subsidies or other financing incentives. However, at the time of the audits, there were not many incentives available.68

Retrofit Activity: Recommended Upgrades and Other Home Improvement Activity

• Air sealing was the most common recommended upgrade (89% of all audited households received this recommendation) and, among those surveyed, it was the upgrade most frequently completed. Air sealing upgrades were frequently completed by somebody in the household, often consisting of low-cost measures such as caulking and weather stripping as opposed to a contractor-completed whole-house air sealing effort.

• Many homeowners (40%) said that they planned to do insulation work in the future, but were waiting because they found it too expensive or for other reasons, some temporary (such as weather) and some possibly harder to overcome. These stated plans may or may not come to fruition.

• Homeowners often decided against more expensive recommended upgrades such as insulation, double-pane windows, or replacing the heating system or water heater. In addition to cost, homeowners sometimes indicated that they questioned the usefulness of these more expensive retrofits.

• Of those who estimated their expenditures on completed recommendations, the median expenditure was $1,500. Some spent much more, but only 25% reported spending more than $3,875.

• Taking out a loan to complete upgrades was unpopular. Few interviewed homeowners anticipated that they would be interested in doing so, and in the end, most (84%) surveyed respondents who told us about how they paid for the upgrades that they undertook said that they used cash or savings.

Reaction to Retrofits

• While most households (84%) said that they were satisfied with the results of the upgrades, only half said that they noticed more than minimal energy savings. Rather, the primary short-term benefit seemed to be comfort, along with the expectation of long-term cost and energy savings.

• Households who spent higher amounts were more likely to say that they had noticed reductions in their energy bill, though many higher spenders said that it was too early to tell about bill reductions.

Behavioral and Other Changes in the Household

• Though the audits were not designed to focus on behavior, 25% of respondents said their household changed how they used energy because of the energy audit. An even higher percentage (41%) of those who undertook upgrades said that they changed how they used energy in response to the changes made to their home, though the reported behavior changes were not necessarily directly related to those upgrades (e.g., 17% said that they turned off lights more often).

67 A limited number of infrared photos were typically included in reports. A few of those who did not receive photos mentioned wanting these, while a few others wanted more photos than the limited number they received. 68 According to SCL staff, besides temporary tax credits, SCL had no home weatherization rebates available for this population for insulation, windows, etc., and PSE rebates for gas-heated homes were capped at low levels. So while rebates are naturally desired, the fact was that there was not much to offer here.


• Beyond whatever recommended energy efficiency upgrades were completed, many households made substantial changes to their homes (e.g., a kitchen remodel) or had experienced occupancy changes (e.g., retirement, a new baby), which could make as much difference in energy consumption as, or more than, many energy upgrades. These changes underscore some of the statistical difficulties of estimating post-upgrade changes in energy consumption with a relatively small sample, and why longer periods and a comparison sample are needed to help evaluate actual energy savings.


5. Post-Retrofit Assessment and Retrofit Quality Verification

For a subset (n=50) of the homes that received a Seattle City Light (SCL) home energy audit and whose homeowners reported having completed some upgrade recommendations, we were able to conduct an independent assessment of the homes after the upgrades. This “Post-Retrofit Assessment” was conducted by an independent auditor, different from the auditor who completed the original audit, who re-measured house technical characteristics, evaluated the quality of completed upgrades, completed new blower-door and combustion safety tests (as appropriate), and recalculated the Energy Performance Score (EPS) for the home based on the new measurements. This chapter reports on the findings of these Post-Retrofit Assessments.

5.1 Motivation

Verifying energy savings from retrofits is difficult, statistically and practically speaking. In computing savings from energy audit programs, it is common to claim the technically expected savings from upgrades that were undertaken rather than to statistically measure savings. But actual energy savings depend on, among other important things, the quality of the upgrades completed. The energy efficiency field has done relatively little concerted work on quality verification for existing home retrofits. One exception is a recent DOE Inspector General report investigating one state’s weatherization assistance program that found substantial quality problems in that state’s implementation (Department of Energy 2010). That report has brought attention to the importance of quality assurance in promoting energy efficiency upgrades. A brief by Shapiro (2011) reports that, of a small sample of audits he reviewed in detail, 60% suffered from poor building description and 30% suffered from inadequate basic review.69

Another concern that has received recent attention from the environmental health industry is that tightening building envelopes to improve energy efficiency may lead to unacceptably low air quality judged relative to standard guidelines (Manuel 2011; Stephens et al. 2011). The diagnostic testing of these homes after upgrades provides a means for investigating this issue.

The Post-Retrofit Assessments also provided an opportunity to quantify the variability of particular technical measurements and to contribute to an understanding of the potential effects of measurement and input uncertainty on energy audit results. Some inputs are subjective, measurement and estimation processes vary, even the most careful professionals may make input errors, etc. Translating a house into technical modeling inputs is not—and cannot be expected to be—an exact science. Work on simulation modeling has shown that even experienced modelers may arrive at quite different conclusions when modeling the same building with the same software (Guyon 1997). Additionally, some aspects of the re-assessed houses did change, preventing us from isolating measurement variability from these other changes.

Finally, the calculation of a new EPS after upgrades provided us an opportunity to look at the practicality and implications of generating such a post-upgrade score. In theory, the possibility of receiving an

69 The statistics cover an equal number of residential and commercial audits (15 each).

improved score after completing upgrades could help reward or motivate some homeowners to undertake audit recommendations. For example, a DOE Home Energy Score website suggests that consumers who obtain a Home Energy Score for their home can ask their contractor to include a post-upgrade assessment as part of an improvement package.70 While receiving a home energy rating did not seem to be a major draw for many of the households surveyed, homeowners did sometimes mention that they would have liked a follow-up to help better judge if the work was effective (rather than, say, for the market value of the higher efficiency score). If these new scores were based on actual inspection, updating would verify the quality of the retrofits completed, and the new score would symbolize the value of the energy efficiency upgrades as a tangible home improvement. Here, as is often the case (e.g., Cataldo 1998; Shapiro 2011), the upgrades completed were usually modest. Measurement uncertainty, in combination with other incidental changes to the home, could thus swamp the “energy efficiency” value of the energy upgrades completed. What if the score calculated to reflect improvements is not appreciably better than, or is even worse than, the original score?

Below, we first summarize the measurement procedure and overall findings on the quality of upgrades, then present some of the results on measurement uncertainty, and finally present rather exploratory results on differences between original and post-retrofit scores as observed for the re-assessed households.

5.2 Measurement Procedure

Homeowners who took part in the Retrofit phone Survey and who reported completing energy upgrades were invited to receive a follow-up visit by a home energy auditor to determine the effectiveness and quality of the completed work. The re-inspections were done by one company (the Home Performance Collaborative), primarily by one auditor, and in all cases by a different auditor than the one who did the original audit. Fifty homeowners, selected from survey respondents who reported completing the most extensive upgrades and who agreed to participate, received the Post-Retrofit Assessment. As part of the follow-up visit, the re-assessment auditor described the upgrades and the equipment and appliances installed, and evaluated the quality of the completed upgrades. The auditor also input new data on some of the major technical components of the house, such as square footage and window area. Blower door measurements were completed as part of the Post-Retrofit Assessment, regardless of what retrofits were completed, and combustion safety testing was completed where appropriate. To prevent undue influence of the original measurements on the re-measurement, the original data for the infiltration measurement, the floor area estimate, and several other inputs were masked: the re-assessment auditor could not see the original values that had been input for these components, reducing the potential for unintentional bias.

70 http://www1.eere.energy.gov/buildings/homeenergyscore/homeowners.html


5.3 Assessment of the ‘Quality’ of Completed Upgrades

Completed upgrades were coded into the thirteen categories below, and quality issues within each upgrade category were noted. Table 33 below provides a summary of the results across the thirteen upgrade categories.

Table 33: Summary of upgrade quality assessment, by upgrade type (n=50 houses)

Upgrade Type                              Number of homes that          Number with       Percent with
                                          completed recommendations     quality issues    quality issues
Upgrades to appliances                    5                             0                 0%
Upgrades to cooling system                3                             0                 0%
Upgrades to heating system                11                            0                 0%
Upgrades to windows                       14                            0                 0%
Insulating floors                         19                            2                 11%
Sealing ducts                             9                             0                 0%
General air sealing                       36                            4                 11%
Attic insulation and ceiling upgrades     18                            3                 17%
Insulating ducts                          10                            1                 10%
Insulating walls                          19                            3                 16%
Upgrades to water heater                  3                             1                 33%
Upgrading to solar water heater           0                             N/A               N/A
Adding or maintaining solar panels        0                             N/A               N/A

General air sealing was the most commonly completed upgrade among the re-assessed homes (n=36), followed by insulating floors (n=19), insulating walls (n=19), attic and ceiling insulation (n=18), and window upgrades (n=14). Overall, 142 upgrades were completed across these 50 homes.71 Of these, quality issues were noted for 10% of upgrades (n=14).

In two of these cases, health and safety concerns were identified. In one case, a home that had completed air sealing and insulation of the basement was found to have inadequate ventilation; the auditor noted that the “ACHn was dropped to .22 which is less than 70% of the Building Airflow Standard for this home,” indicating that the home now needed a mechanical ventilation system to ensure adequate fresh air. In the other case, a new gas water heater failed combustion safety testing. This water heater replaced an old water heater that had also failed combustion safety testing during the original audit, so the replacement did not resolve that issue.
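The ACHn figure in the auditor’s note is commonly estimated from the blower-door result by dividing ACH50 (air changes per hour at the 50 Pa test pressure) by an “n-factor” that reflects climate, building height, and shielding, and then comparing the result against a minimum ventilation guideline. The sketch below is a minimal illustration of that arithmetic, assuming an n-factor of 17, a 0.35 ACH guideline, and hypothetical house values; it is not the program’s procedure or this home’s data.

```python
# Illustrative ventilation check based on a blower-door result (assumed values).
def achn_from_blower_door(cfm50, volume_ft3, n_factor=17.0):
    """Estimate natural air changes per hour from a CFM50 blower-door reading."""
    ach50 = cfm50 * 60.0 / volume_ft3   # air changes per hour at 50 Pa test pressure
    return ach50 / n_factor             # rough conversion to natural conditions

achn = achn_from_blower_door(cfm50=800, volume_ft3=13000)   # hypothetical house
guideline_achn = 0.35                                       # commonly cited minimum
if achn < 0.7 * guideline_achn:                             # a 70%-of-standard trigger
    print(f"ACHn = {achn:.2f}: mechanical ventilation likely needed")
```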

In the other 12 cases, the auditor noted shoddy work or missed opportunities—for example, areas missed when insulation was added, unsealed or un-dammed attic hatches, missed air sealing

71 In this analysis, multiple upgrades within the same category—for example, multiple air sealing actions or replacing two appliances—are counted as one upgrade. Therefore, results may be different compared to other sections of the report.

opportunities, or insulation applied with the vapor barrier incorrectly oriented. In some cases the incorrect work was attributed to the homeowner, but in other cases the work was done by a contractor.

Overall, the quality issues found were modest. Most represented detail work that was not thoroughly or correctly completed, of a level that may only marginally degrade the efficacy of the upgrade in terms of the performance of the home or the durability or aesthetics of the upgrade. The case where air sealing and insulation lowered the air infiltration into the house below levels considered safe represents a recognized risk though with no current consensus as to its degree of seriousness (e.g. Manuel 2011, Stephens et al. 2011). Our results confirm that rendering existing homes “over-tight” with respect to current standards seems possible, though with the small sample analyzed here, and the lack of controlled situations, it is difficult to know whether this is a frequent issue in this program.72

We conclude that for the homes reviewed, poor upgrade quality is likely not a major issue, although combustion safety problems and inadequate ventilation, while rarely identified, present some risk for homeowners completing upgrades.

5.4 Measurement Differences

We compared, between the original audit and the Post-Retrofit Assessment, blower door measurements of infiltration, which are expected to change for many of the homes that completed upgrades, and floor area measurements, which usually are not expected to change.

Figure 13 provides a simple comparison of the CFM50 measurement before the retrofit—at the time of the original audit—to the measurement after retrofits. A high majority (86%) of test houses showed a lower post-retrofit infiltration measurement than the infiltration measurement for the original audit, with a median decline of 17%. Most of these houses (71%) reported having done some air sealing.73 In fact, even houses that did not report air sealing usually showed some decline in measured infiltration. The decline is statistically significant for both groups of households—those who reported having done air sealing and those who did not—and, though perhaps due to limited sample size, the percentage decline is not statistically significantly different between these two groups of houses. Since the original infiltration value was masked from the post-assessment audit team, this shift to lower infiltration values cannot reasonably be attributed to unconscious bias on the part of the post-retrofit assessment auditor. There are a number of other possible explanations for infiltration measurements to have shifted lower even where no sealing was reported: that other retrofit measures led to reduced infiltration, that some air sealing was unreported, that the differing accuracy of manometers could create a consistent pattern

72 Note that for this particular home the original and post-retrofit audit results differed substantially in terms of floor area, volume, number of stories. For this house, the volume difference in particular raises the possibility that the drop below the threshold was the result of these measurement differences (which affect the calculated ACHn) as much as a real decrease in air infiltration rates. This raises the question of how reliably these measurements can establish whether a safety risk is present in a home.

73 We counted a household as having completed air sealing if the respondent reported doing so in the Retrofit Survey; the review by the post-retrofit assessment auditor revealed some additional cases, for a total of 36 households having completed air sealing out of the 50 post-retrofit assessment households.

such as the one seen here,74 and that, particularly for the few cases where deep reductions in infiltration were measured, there may have been differences in blower door test setup (e.g., a basement door open for the original test but closed for the follow-up test). Further investigation of blower-door accuracy, precision, and bias across methods seems warranted, using a more controlled experimental design.

We also compared floor area estimates between the original and post-retrofit assessments. These values usually matched closely between the original audit and the post-retrofit audit (r=0.9), though in some cases there were large differences (e.g., 14% showed a difference in measured floor area of more than 30%). One factor that likely contributes to these differences is whether basement areas are considered to be part of the conditioned, finished space that is heated or cooled, or whether these areas are considered to be unconditioned and therefore not actively heated or cooled. This determination may sometimes require a judgment call on the part of the auditor, and may also be different between home energy audit modeling tools.

Figure 13: Changes in infiltration between values recorded for original audit versus Post-Retrofit Assessment (n=50)
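The comparisons in this section lend themselves to simple paired summaries. The sketch below shows one way to compute the per-house percentage change in CFM50, a paired test of whether infiltration declined, and the correlation between the two floor-area measurements; the numeric values are placeholders rather than the study’s data, and the report does not specify which statistical test it used.

```python
# Illustrative paired comparison of original vs. post-retrofit measurements
# (placeholder values; not the study's data).
import numpy as np
from scipy import stats

cfm50_before = np.array([2400, 1800, 3100, 2700, 2050, 1500])
cfm50_after  = np.array([2000, 1550, 2600, 2750, 1700, 1400])

pct_change = (cfm50_after - cfm50_before) / cfm50_before
print(f"median change in CFM50: {np.median(pct_change):.0%}")

# One reasonable paired test of whether infiltration declined overall
stat, p_value = stats.wilcoxon(cfm50_before, cfm50_after)

# Agreement between original and re-measured floor areas
area_before = np.array([1850, 2200, 1500, 2600, 1200, 1750])
area_after  = np.array([1900, 2150, 1480, 2100, 1260, 1800])
r, _ = stats.pearsonr(area_before, area_after)
print(f"floor-area correlation r = {r:.2f}")
```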

5.5 Updating Scores to Reflect Completed Retrofits

Home energy audits that provide an energy efficiency or energy consumption asset rating typically also offer an estimate of what the rating would be if all recommended retrofits were completed. This provides

74 As noted above, most of the post-retrofit assessments were performed by a single auditor.

a “before/after” picture that helps homeowners judge the relative difference that upgrades, as a whole, would make in their home’s estimated energy performance. The EPS Audit and the Home Energy Scoring Tool both offer these comparisons.

As mentioned above, the “before/after” depiction raises the question: if a homeowner were to complete all recommended upgrades, would a new, independent post-upgrade assessment of the house yield a performance rating close to the one predicted? Issues that confound this prediction include:

• The quality and nature of completed upgrades may not match what was foreseen. Though results from the post-upgrade assessments completed for select Seattle homeowners did not show major quality issues, others have found this to be a significant issue, at least for some home retrofit programs (U.S. Department of Energy 2010).

• Measurement variability and data entry error. Reassessment of the technical characteristics of the house may result in substantial differences in recorded values, such as of air leakage, floor area and volume, insulation levels, and other inputs to energy models. Some of these may be due to actual changes in the house but others may be due to measurement and recording factors such as different techniques, subjective aspects of assessment, and data entry errors.

• Tools evolve. A given evaluation tool may be different at the Post-Retrofit Assessment than it was during the original audit. As a case in point, over the course of this project’s research period, there were substantial changes in all three of the tools we used, directed at improving model performance and applicability.

• Houses do not remain unchanged save for energy efficiency upgrades. A household that performs energy efficiency upgrades may often also make other changes to the house that affect their rating (for better or worse) but which were not considered in the predicted score—finishing a basement, adding an air conditioner, or completing energy upgrades that were not recommended.

Also, in practice, and especially relevant for the comparison below, homeowners may rarely complete all of the recommended upgrades. Smaller changes in scores resulting from a partial set of completed upgrades would be more easily obscured by these sources of variability in scores. For the homes evaluated here, we do not know how many homeowners would have wanted an updated score, or would have expected it to change much based on the upgrades they completed.

To roughly assess the consequences of the above issues, we compared new EPS scores, based on the Post-Retrofit Assessment measurements and computed in the EPS Auditor tool, with the original EPS. Most of the 50 households had done some substantial upgrading, but only a few had completed all or nearly all of the recommended upgrades. Whatever the degree of energy efficiency upgrades completed, in theory, the new score should be lower than the first if the same modeling procedures were used and there was no measurement error.75 The comparison of original vs. Post-Retrofit Assessment scores showed, instead, that scores improved for only half (26 of the 50). While the predicted score was quite close to the measured post-retrofit score for the one case where the homeowner had completed very deep retrofitting, overall this comparison underscores the fact that the factors listed above may have a substantial effect on scores. This calls attention to a potentially very visible concern in rating precision: a homeowner who undertook upgrades would likely be disappointed to have his or her house receive a worse rating after upgrading than it did before. We did not attempt to determine the relative contributions of these factors to the changes in scores we observed, beyond deducing that more than just retrofits were at play. While methods can be adopted to minimize the impact of measurement variability (and conceivably, it would be possible to avoid re-measurement altogether), the consequences and extent of uncertainty—a combination of a number of factors affecting both the original and post-retrofit assessment—need to be considered in debating whether and how "post-retrofit" ratings are determined, and in considering the credibility of home energy ratings in general.

5.6 Conclusions

Based on analysis of Post-Retrofit Assessments of 50 households, the quality of retrofits completed seemed quite high, with little evidence that quality considerations would diminish savings that would be expected on a theoretical basis. While our investigation of measurement and input uncertainty was limited, our results suggest that further exploration of measurement and model input differences may be warranted—especially if home energy audit outputs such as scores are intended to be replicable from auditor to auditor, or comparable from house to house. We also looked at changes in calculated Energy Score between the original audit and the post-retrofit assessment for the re-assessed houses. The comparison showed improved scores for only half of these houses. Even considering that the degree of upgrades in these houses was sometimes only modest, this result suggests that uncertainties of the measurement and modeling processes may readily obscure the efficiency gains from typical levels of retrofits.

75 In one case, the household made major changes to their house in addition to energy upgrades. Also, changes to the auditor protocol resulted in some basements being counted as conditioned space in the Post-Retrofit Assessment score, whereas they were counted as unconditioned space in the original audit. This contributed to an increase in estimated energy consumption and score in these cases.


6. Home Energy Audit and Assessment Modeling Tool Comparison

6.1 Introduction and Background

Goals and Scope

The preceding chapters in this report have focused on various non-quantitative aspects of home energy audits, including homeowner experience with the Seattle City Light (SCL) Home Energy Audit Program, and auditor and real estate professional views. This chapter provides a complementary analysis of residential audit and assessment76 modeling tools that play an important part in generating information and guidance for the auditor and homeowner: scores, upgrade recommendations, and cost and savings estimates. Simply put, we look at elements of the quality and comparability of the tool-generated estimates and upgrade recommendations that are provided to homeowners who undertake audits. To the extent that the information provided by the audit tool is inaccurate or inapplicable, relative to what is claimed or assumed about that information, the value of what is being provided to the homeowner or potential investor is diminished. We compared three residential modeling tools—EPS Auditor, the Home Energy Scoring Tool, and Home Energy Saver Pro (HESPro)—to each other and to utility-reported usage for a set of homes that received the SCL Home Energy Audit. Only the EPS Auditor tool and results were used in the original audits—the remaining results were generated separately from the SCL program and participants.

Our primary goal, driven by the homeowner focus of this research, is to begin to assess the information and guidance, particularly the energy estimates and upgrade recommendations, generated for homeowners by home energy modeling tools.

A variety of factors complicate this task. In particular, there is little in the way of absolute references for determining whether the various model outputs are "correct," "optimal," or "accurate":

• Asset-based77 scores are highly dependent on the assumptions necessary to model the house under "standard operating conditions," and are therefore not directly comparable from one tool to another unless the assumptions are matched or otherwise accounted for.

• Model estimates of the actual energy use of a home have the most intuitive point of reference: actual utility usage of the home. In practice, however, weather and occupant behaviors need to be accounted for in the modeling, and utility data must be available and parsed to match the yearly (or other) period of the model estimates. Utility billing data provides only aggregate usage by fuel type, but not by end use. It therefore does not readily allow direct comparison to model end use estimates—though analysis can be performed to estimate heating end use, etc., from totals. When it comes to recommendations and savings estimates, end use consumption is a more relevant reference than total usage.

• Upgrade recommendations and savings estimates have no ready point of reference, while the cost of upgrades depends on the house details, market conditions, and the contractor chosen. While detailed building simulation can be used to generate reference "optimal" upgrade recommendations based on technical-economic criteria for a home and "average" upgrade cost information,78 this simulation is still subject to auditor interpretation, measurement variability, data entry error, and model assumptions. And, these upgrades may not coincide with the interests of a particular household—or what is technically feasible in that home.

76 The term "home energy audit" can be interpreted narrowly as specifying a particular set of activities such as diagnostic testing, or broadly, encompassing a spectrum of activities. For simplicity, we use the "home energy audit" term in the general sense—including both the more comprehensive EPS audits and the "Home Energy Score" assessments or surveys, which may or may not include diagnostic testing and customized reports.
77 Asset evaluations involve "modeling the energy use of a house under standardized occupant assumptions (so results can be compared fairly to another)", while operational evaluations consider "the specific occupant behavior" when modeling the energy use of a house (Appendix A from Polly et al. 2011).

Because of this lack of readily available reference points, we compared outputs of these models to the reference points we had available—outputs of the other tools, and utility-reported energy usage for these homes. These comparisons were designed to address several questions about the outputs of these modeling tools:

1. How well do the different asset models agree with each other on estimated energy use for this group of homes?
2. How well do the different asset models agree on upgrade recommendations made to homeowners?
3. How well do these asset-based estimates of energy use match actual bills? This is a useful question even recognizing that asset-based assessments are not designed to match actual energy bills for any particular set of occupants.
4. How do asset model "standard occupant behavior" assumptions compare to self-reported energy use behaviors for Seattle survey respondents?
5. How well do model estimates match actual bills after being "corrected" for weather and self-reported energy use behavior?

This analysis was not intended to determine whether one tool is “better” than the others. Our comparison required assumptions and approximations, including those made in screening and converting house technical data between tools, mapping self-reported energy use behaviors to model inputs, processing weather and utility consumption data, and design and development of tools to emulate and add capabilities to the Home Energy Scoring Tool and to Home Energy Saver Pro. Additionally, these tools were subject to ongoing development and improvement and this study only represented the tools at a particular point in time.

The homes included in this study were all originally audited using the EPS Auditor tool and protocol. The Home Energy Scoring Tool and HESPro modeling and analyses were performed after the fact, separately from the program. The project was initiated prior to the release of the Home Energy Scoring Tool, and during the data collection phase the initial version of the Home Energy Scoring Tool was released. Because of its importance to the U.S. Department of Energy's (DOE) evolving Home Energy Score Program, we incorporated it into our analyses, even as it and the other tools evolved during the course of our analysis.

78 For example, the upgrade cost information provided in the National Residential Efficiency Measures Database (2012).

Background

Previous Work on Assessing Home Energy Analysis Models

Several recent efforts have compared home energy model estimates of usage to observed utility usage for homes. Polly et al. (2011) and SENTECH (2010) each provide reviews of recent efforts, including Energy Trust of Oregon's Energy Performance Score Pilot comparison (Earth Advantage Institute and Conservation Services Group 2009). These studies consistently found that residential energy models typically overestimate energy use, with the implication that the savings estimated for upgrades would also be overestimated. Additionally, model estimates tend to vary widely around actual usage, with the Energy Performance Score 2008 Pilot reporting model estimates (for EPS Auditor and Home Energy Saver "Full") within +/-25% of weather-normalized utility bills for only 50-60% of 190 modeled homes, depending upon the particular model. The Energy Performance Score 2008 Pilot report proposed establishing a minimum accuracy for tools used to generate Energy Performance Scores, measured against weather-normalized utility usage (Earth Advantage Institute and Conservation Services Group 2009). This report proposed that, at a minimum, model estimated total energy use should be within +/-25% for 70% of homes, and within +/-50% for 90% of homes.

More recent work for the Energy Trust of Oregon (Robison 2012) compared the energy use estimates from several asset scoring tools to a detailed building simulation baseline, as well as to observed utility usage. This study found a lack of consistency between different asset tools, and noted that differences in model assumptions are a source of inconsistencies. Robison also found, in the comparison of model estimates to observed utility usage, that the asset models tended to over-predict usage of low energy using homes, and under-predict usage of high energy using homes. For asset models, one would expect model estimates to not match observed utility usage, as many of the determinants of actual utility usage are intentionally excluded from consideration.
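As an illustration of how a criterion of this kind can be applied (this sketch is ours, not drawn from any of the cited studies), the code below checks a set of model estimates against the proposed thresholds; the estimates and observed values are invented.

# Sketch: checking model estimates against the minimum-accuracy thresholds
# proposed in the Energy Performance Score 2008 Pilot report: within +/-25%
# of weather-normalized usage for 70% of homes, and within +/-50% for 90%.
# The paired values below are made up for illustration.

def share_within(estimates, observed, tolerance):
    """Fraction of homes whose estimate is within +/-tolerance of observed usage."""
    flags = [abs(est - obs) / obs <= tolerance for est, obs in zip(estimates, observed)]
    return sum(flags) / len(flags)

def meets_pilot_criterion(estimates, observed):
    return (share_within(estimates, observed, 0.25) >= 0.70 and
            share_within(estimates, observed, 0.50) >= 0.90)

modeled  = [31000, 24000, 45000, 18000, 39000, 27000]   # hypothetical annual totals (kWh-equivalent)
observed = [26000, 23000, 30000, 19000, 36000, 21000]

print(share_within(modeled, observed, 0.25), meets_pilot_criterion(modeled, observed))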

There has been research on household energy use behaviors and how these can be represented in modeling (Parker et al. 2010; Hendron et al. 2004; Lutz et al. 1996)—but little indication, outside of detailed model technical documentation, of how these assumptions differ amongst models and how these differences affect model estimates.

How Energy Modeling Tools Fit into the Audit

Home energy audit and assessment programs often use energy modeling and reporting tools. For such tool-based audits, the tool influences the audit process: auditors must measure and record a predetermined set of house characteristics and input these into the tool to generate energy use estimates and recommendations. Depending on the tool, auditors may have more or fewer house characteristics to measure, more or less time to talk with the homeowner (Blasnick 2011), and more or less latitude in generating and customizing upgrade recommendations or in providing details on the conditions in the home. The tool will often determine the look and feel and content of the report the homeowner receives. Labels or energy scores are also a function of the model and tool. Upgrade recommendations may be generated by the tool or the auditor; cost and savings calculations, when present, typically are generated based on modeled estimates of household energy use by various systems and end uses.

Through these influences then, home energy audit tools directly and indirectly affect the homeowner experience and the information they receive, including scores or ratings and upgrade savings estimates, and how that information is presented. And, they constrain the auditor—how they interact with the homeowner, to what degree they focus on house or household, what recommendations they make, and how customized these recommendations are.

Basic Attributes of Tools Included in the Comparison

Our analysis addressed three different home energy modeling tools: the EPS Auditor tool from Earth Advantage Institute, the Home Energy Scoring Tool from the U.S. Department of Energy (DOE) and Lawrence Berkeley National Laboratory (LBNL), and Home Energy Saver Pro from LBNL. These tools were selected for comparison on pragmatic grounds, i.e., the ability to use the existing SCL Home Energy Audit Program data available for Seattle to address the interest of the project sponsor (DOE) in the Home Energy Scoring Tool and HESPro, and to achieve a breadth of comparison covering both asset and operational modeling capabilities. However, there are important differences between these tools; a high-level comparison of tool similarities and differences is presented in Table 34, and additional detail is provided in Appendix O.

Table 34: Similarities and differences between EPS Auditor, Home Energy Scoring Tool, and HESPro

Intended use or users
  EPS Auditor: Existing homeowners; real estate transactions (a)
  Home Energy Scoring Tool: Existing homeowners; real estate transactions (a)
  HESPro: Homeowners or auditors looking to add operational considerations to their Home Energy Scoring Tool results; homeowners looking to do self-assessment; auditors looking to do assessments

Depth / Level of Detail
  EPS Auditor: Modest set of model inputs; diagnostic audit—includes blower door and combustion safety testing (as appropriate) and encourages use of IR camera; provides a customized report
  Home Energy Scoring Tool: Modest set of model inputs; not billed as diagnostic, but capable of utilizing blower door results in modeling. Report is not customized by the auditor. Assessment duration is indicated at under 1 hour
  HESPro: Does not require a specific level of detail. Tool is capable of incorporating some diagnostics, and can incorporate house and operational inputs not included in the other models

Asset or operational
  EPS Auditor: Asset-based model, though some white goods and other large energy uses are included, and the auditor may customize the report with operationally-focused guidance
  Home Energy Scoring Tool: Asset-based, with scope limited to the house. White goods and other large energy uses are not considered
  HESPro: Operational modeling tool in that it accepts energy use behavior and appliance inputs (but not utility information)

(a) Note: While real estate transactions are listed as an intended target market for both EPS and the Home Energy Scoring Tool, this analysis focuses on the application of these tools for existing homeowners.


6.2 Approach

This comparison is composed of multiple analyses drawing upon various data streams, along with intermediary data processing and preparation steps. The model comparison process involved four sets of activities: data collection, data processing and preparation, tool and model runs, and the comparison analysis. These activities are depicted in Figure 14, with more detail provided below and in Appendices N-T.

Figure 14: Major sets of activities for model comparison


Data Collection

The project data collection activities are detailed in Chapter 2. The four major source data sets that were utilized to complete the modeling comparison analyses are presented in Table 35. The details of the data collection activities for each are referenced in the table.

Data Processing and Preparation

To enable consistent model runs and comparisons, it was necessary to perform additional processing and preparation tasks on these data streams. We developed a utility to convert EPS Auditor inputs and Energy Use Behavior Survey inputs for use in the Home Energy Scoring Tool, HESPro, and in their emulators. Weather and utility data were each processed into full-year periods synchronized with each other to allow model runs incorporating historical weather conditions, generating energy use estimates directly comparable to the full-year utility usage estimates. These data processing steps are described in detail in Appendix N.
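To give a flavor of these preparation steps (the actual converter and the tools' input schemas are described in Appendix N and are not reproduced here), the sketch below shows two representative operations using hypothetical field names and made-up data: mapping one tool's categorical input onto another tool's vocabulary, and prorating billing-period usage to a full-year total.

# Sketch only: hypothetical input mapping and simple annualization of billing data.
# Field names and categories are invented and do not reflect the tools' real schemas.

EPS_TO_HEST_WALL = {                       # illustrative mapping only
    "2x4 wood frame, no insulation": "wood_frame_r0",
    "2x4 wood frame, R-11 batt": "wood_frame_r11",
}

def convert_wall_type(eps_wall_type):
    # Fall back to a conservative default when no direct equivalent exists;
    # such defaults are one source of conversion error noted in the text.
    return EPS_TO_HEST_WALL.get(eps_wall_type, "wood_frame_r0")

def annualize(bill_periods):
    """bill_periods: list of (days_in_period, kwh_in_period) tuples.
    Prorates the observed usage to a 365-day year."""
    total_days = sum(days for days, _ in bill_periods)
    total_kwh = sum(kwh for _, kwh in bill_periods)
    return total_kwh * 365.0 / total_days

bills = [(29, 610), (31, 720), (30, 650), (31, 700)]   # made-up meter reads
print(convert_wall_type("2x4 wood frame, R-11 batt"), round(annualize(bills)))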


Table 35: Data sets utilized for the model comparison

Data Set / Total Potential Sample Size (fewer were used for analyses) / Additional Detail

• EPS Auditor Inputs and Results (technical house characteristics collected during SCL home energy audits; audit report outputs from the SCL Home Energy Audit Program): 1,355; see Chapter 2
• Energy Use Behavior Survey responses: 346; see Chapter 2
• Electric and natural gas utility usage and billing information: 251; see Chapter 2
• Historic hourly weather conditions in Seattle over the study period, garnered from two separate sources (Weather Analytics; University of Oregon Solar Radiation Monitoring Laboratory) to get a complete history for a basic set of inputs: N/A; see Appendix N


Model Runs

Various model runs were necessary to inform the model comparisons. These runs utilized both the original tools and "emulators" developed to replicate the results of the tools while allowing additional control over model inputs and outputs. These runs are described in Appendix N and summarized in Table 36.

Table 36: Modeling tools and emulators used in the analyses

EPS Auditor
  Description: Asset modeling tool used for Seattle City Light's Home Energy Audit Program. We did not repeat any model runs; instead, we referenced the results from the original audits.
  Model runs completed / used for: EPS vs. Home Energy Scoring Tool comparison of energy estimates; EPS vs. Home Energy Scoring Tool comparison of recommended upgrades; model comparison vs. utility-reported usage
  Additional detail: See Appendices N, P, Q, and S

Home Energy Scoring Tool
  Description: DOE's asset modeling tool. Online interface.
  Model runs completed / used for: Verification of the Emulator; EPS vs. Home Energy Scoring Tool comparison of recommended upgrades
  Additional detail: See Appendices N and S

Home Energy Scoring Tool Emulator
  Description: A tool we developed to replicate the results of the online Home Energy Scoring Tool while giving additional detailed end use estimates and allowing use of historical weather files. Utilizes the same DOE2.1E configuration as the Home Energy Scoring Tool and HESPro, in combination with a spreadsheet calculation tool.
  Model runs completed / used for: Verification against the online tool; EPS vs. Home Energy Scoring Tool comparison; model comparison vs. utility-reported usage
  Additional detail: See Appendices N, P, and Q

Home Energy Saver Pro (HESPro)
  Description: LBNL's operational modeling tool. Online interface.
  Model runs completed / used for: Verification of the Emulator
  Additional detail: See Appendix N

Home Energy Saver Pro Emulator (HESPro Emulator)
  Description: A tool we developed to replicate the results of the online Home Energy Saver Pro while allowing us to use historical weather files. Utilizes the same DOE2.1E configuration as the Home Energy Scoring Tool and HESPro, in combination with a spreadsheet calculation tool.
  Model runs completed / used for: Verification against the online tool; model comparison vs. utility-reported usage
  Additional detail: See Appendices N and Q


Model Comparison Analyses

Three primary model comparison analyses were completed using the model runs described above, and a fourth analysis was completed as a follow-up. These comparisons used the subsample of Seattle houses for which we had sufficient data for the analysis. Two additional analyses were completed which did not rely on the model runs but instead focused on other characteristics of the tools. The six analyses are described in Table 37, and additional details and results are provided in the appendices referenced.

Table 37: Six model comparison analyses completed

• EPS Auditor, Home Energy Scoring Tool, and Home Energy Saver Pro Feature Comparison: Evaluate how the tool characteristics are similar or different in terms of intended use, scope of inputs, assumptions, model results, generation of recommendations, presentation of outputs, and the role of the assessor or auditor in the process. (See Appendix O)
• Comparison of Home Energy Scoring Tool and EPS Asset Models: Evaluate how closely Home Energy Scoring Tool and EPS Auditor total, fuel, and end-use model estimates match for a set of Seattle households. (See Appendix P)
• Comparison of Model Estimates to Utility-Reported Usage: Evaluate how closely asset and operational models estimate utility usage for a set of Seattle households. (See Appendix Q)
• Assessment of Impact of Missing Home Energy Scoring Tool Inputs: Evaluate the impact of missing Home Energy Scoring Tool inputs on the model comparisons. (See Appendix R)
• Comparison of Upgrade Recommendations from EPS Audits and Home Energy Scoring Tool: Evaluate how closely Home Energy Scoring Tool and EPS Auditor upgrade recommendations match, in terms of which recommendations are made, as well as tool estimates of costs and savings. (See Appendix S)
• Comparison of Model Assumptions Regarding "Standard" Operation to Energy Use Behavior Survey Responses: Evaluate how well the assumptions and defaults utilized in the three modeling tools represent Seattle audit recipients by comparing these assumptions to responses to the Energy Use Behavior Survey. (See Appendix T)


6.3 Summary of Findings

Findings are pulled together thematically from the analyses referenced above, which are described in more detail in Appendices O-T.

Asset Tool Usage Estimates are Surprisingly Similar for Modeled Seattle Households

The review of EPS Auditor and the Home Energy Scoring Tool characteristics (Appendix O) indicates that the formulation of these tools is quite distinct in a variety of dimensions. However, when a set of 237 Seattle homes was modeled in both tools (Appendix P), energy usage estimates79—totals, by fuel, and by major end use categories—were generally quite similar. For example, a scatterplot for the comparison of total site energy use estimates from the Home Energy Scoring Tool versus EPS Auditor is presented in Figure 15. The Pearson correlation coefficient for this comparison was r=.89, and as the scatterplot shows, consistency was fairly high. The two tools' estimates were within +/-25% of one another for 86% of these homes. Differences between the tools' estimates were larger for houses estimated to be higher-consuming.

Figure 15: Estimated total site energy use for Home Energy Scoring Tool compared to EPS (n=237). Solid line represents perfect agreement. Dashed line represents best fit.
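For readers who want to reproduce this style of tool-to-tool comparison on their own data, the following is a minimal sketch (not the project's analysis code) of the two agreement measures quoted above: the Pearson correlation and the share of homes where the two tools' estimates fall within +/-25% of each other. The energy values are invented.

# Sketch: agreement between two tools' total site energy estimates, as in Figure 15.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

eps_kwh  = [28000, 41000, 19500, 33000, 52000, 24500]   # hypothetical EPS Auditor estimates
hest_kwh = [30500, 39000, 21000, 35500, 61000, 25000]   # hypothetical Home Energy Scoring Tool estimates

# Share of homes where the second tool's estimate is within +/-25% of the first tool's estimate
within_25 = sum(abs(h - e) / e <= 0.25 for e, h in zip(eps_kwh, hest_kwh)) / len(eps_kwh)
print(f"r = {pearson_r(eps_kwh, hest_kwh):.2f}, within +/-25%: {within_25:.0%}")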

This good consistency between energy use estimates from EPS Auditor and the Home Energy Scoring Tool provides some indication that different asset models in general may be capable of generating similar ratings when modeling the same house or set of houses. However, to the degree that ratings are sensitive to model assumptions regarding "standard occupant behavior," modeling tools relying on very different assumptions would not be expected to generate similar ratings. It is noteworthy that while the assumptions used to represent "standard occupant behavior" in EPS Auditor and the Home Energy Scoring Tool differed somewhat at the time of the analysis (see Appendix T), the model estimates still coincided reasonably well. Also, our comparison between EPS Auditor and the Home Energy Scoring Tool, due to the method we employed,80 excluded auditor measurement and interpretation variability—an important source of differences in cases where a house receives more than one assessment, as described in Chapter 5.

79 Though we did not compare ratings per se, we compared both total site energy and total source energy—surrogates for the rating systems used for EPS and for the Home Energy Scoring Tool, respectively.

Asset Tool Upgrade Recommendations are Dissimilar

We compared the "standardized" upgrade recommendations generated by the EPS Auditor tool and delivered to EPS audit recipients81 with the recommendations we generated using the Home Energy Scoring Tool, for 31 Seattle houses (Appendix S). We first compared the frequency at which specific recommendations were made for the 31 homes, and we next compared the details of recommendations in the four categories that EPS Auditor and the Home Energy Scoring Tool had most in common. Despite the similarity in asset model estimates of house energy usage given standard occupant behavior, the recommendations generated by EPS auditors and by the Home Energy Scoring Tool differ substantially, both overall and within each of the four recommendation categories that were most readily comparable (air sealing, attic insulation, floor or basement insulation, and duct insulation).

Table 38 and Table 39 present the percent of homes that received each recommendation from the Home Energy Scoring Tool and from EPS audits for the 31 homes. This comparison shows substantial differences. The EPS recommendations drew from a larger portfolio of upgrades, including appliance replacement, the addition of mechanical ventilation, and a variety of water heater upgrade options, but did not differentiate the timing of the replacement. EPS audits also recommended certain common upgrades—e.g. air sealing, attic insulation, duct sealing, and wall insulation—much more often than the Home Energy Scoring Tool. On the other hand, the Home Energy Scoring tool presented recommendations in two categories—“repair now”, and “replace later” (when it is time to replace or upgrade) and recommended that most or all homes “pick one with an ENERGY STAR label” when upgrading heating systems, water heaters, or windows.

Figure 16 summarizes the results of the comparison of upgrade recommendations within each of the four categories that coincided most: air sealing, attic insulation, floor or basement insulation, and duct insulation.
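A minimal sketch of this kind of tabulation follows (ours, not the project's analysis code): it counts how often each tool recommends a given upgrade category across a set of homes and how often the two tools coincide for the same home. Home identifiers and recommendation categories are invented for illustration.

# Sketch: recommendation frequencies and per-home coincidence between two tools.
eps_recs = {
    "home_01": {"air_sealing", "attic_insulation", "duct_sealing"},
    "home_02": {"attic_insulation", "wall_insulation"},
    "home_03": {"air_sealing", "attic_insulation"},
}
hest_recs = {
    "home_01": {"air_sealing", "attic_insulation"},
    "home_02": {"floor_insulation"},
    "home_03": {"attic_insulation"},
}

categories = set().union(*eps_recs.values(), *hest_recs.values())
n_homes = len(eps_recs)

for cat in sorted(categories):
    eps_count = sum(cat in recs for recs in eps_recs.values())
    hest_count = sum(cat in recs for recs in hest_recs.values())
    coincide = sum(cat in eps_recs[h] and cat in hest_recs[h] for h in eps_recs)
    print(f"{cat:<20} EPS {eps_count}/{n_homes}, HEScore {hest_count}/{n_homes}, coincide {coincide}")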

80 This comparison utilized one set of house measurements, which were then converted to inputs for the Home Energy Scoring Tool and HESPro. Therefore, this comparison is subject to errors from the conversion process but not to variability from multiple measurements of the same home. 81 This comparison includes only recommendations made to homeowners in the “Summary of Recommended Energy Upgrades” section of the EPS report. Auditor customized recommendations, deep retrofit options, and additional recommendation detail comments were often delivered in separate parts of the report and were not evaluated here.


Table 38: All Home Energy Scoring Tool recommendations for 31 homes. Note that some recommendations are presented as “repair now” while others are presented as “replace later”

Home Energy Scoring Tool report recommendation (percent of the 31 homes receiving the recommendation):

Repair now: These improvements will save you money, conserve energy, and improve your comfort now
  Attic: Increase attic floor insulation to [range from R-19 to R-60]. (42%)
  Basement/crawlspace: Insulate the floor above unconditioned space to at least [range from R-19 to R-38]. (39%)
  Air tightness: Have a professional seal the gaps and cracks that leak air into your home. (32%)
  Ducts: Add insulation around ducts in unconditioned spaces to at least R-6. (16%)
  Basement: Add insulation to walls to R-19. (13%)
  Seal ducts. (0%)

Replace later: These improvements will help you save energy when it's time to replace or upgrade
  Water heater: Pick one with an ENERGY STAR label. (100%)
  Furnace: Pick one with an ENERGY STAR label. (81%)
  Windows: Pick ones with an ENERGY STAR label. (61%)
  Boiler: Pick one with an ENERGY STAR label. (13%)
  Central Air: Pick one with an ENERGY STAR label. (6%)
  Heat Pump: Pick one with an ENERGY STAR label. (3%)
  Siding: Add insulating sheathing underneath it to R-5. (3%)

Table 39: All EPS Auditor tool recommendations for 31 homes. No distinction is made about doing now or when replacing equipment.

EPS Audit report recommendation (percent of the 31 homes receiving the recommendation):
  Seal air leaks to reduce leakage (air leakage rate remains above .35 ACHn). (81%)
  Insulate attic to R-49. (68%)
  Seal all seams and joints on ductwork and plenum with mastic paste. (55%)
  Dense pack uninsulated wall cavity with cellulose insulation. (48%)
  Replace refrigerator with an ENERGY STAR refrigerator. (45%)
  Wrap ducts with fiberglass insulation. (42%)
  Fully insulate floor joist cavity with fiberglass or cellulose (Savings assumes R30). (35%)
  Upgrade to condensing gas furnace. (35%)
  Upgrade to high efficiency windows. (32%)
  Replace washing machine with an ENERGY STAR washing machine. (29%)
  Replace dishwasher with a high efficiency dishwasher. (26%)
  Install photovoltaic panels. (23%)
  Install a tankless water heater. (19%)
  Install a high efficiency gas water heater. (16%)
  Install a solar water heater and use your current water heater as a backup. (16%)
  Install a high efficiency electric water heater. (13%)
  Install tight-fitting storm windows. (13%)
  Reduce air leakage rate below .35 ACHn and install mechanical ventilation. (10%)
  Install a solar water heater and new backup tank. (10%)
  Insulate floor cavity with spray foam. (6%)
  Install an ENERGY STAR heat pump. (6%)
  Install a ductless mini-split heat pump. (6%)
  Install mechanical ventilation to provide adequate make-up air. (3%)
  Sprayfoam 2x4-inch roof-deck cavity. (3%)
  Upgrade to condensing boiler and integrated domestic hot water heating system. (3%)
  Add rigid foam to the exterior of the house when re-siding. (Savings based on R5 rigid foam) (3%)
  Install a heat-pump water heater. (3%)


Figure 16: Summary of comparison of upgrade recommendations (n=31)

Air Sealing
  Most common recommendation (number receiving recommendation):
    Home Energy Scoring Tool: "Air tightness: Have a professional seal the gaps and cracks that leak air into your home." (N=10)
    EPS Auditor: "Seal air leaks to reduce leakage (air leakage rate remains above .35 ACHn)." (N=25)
  Number coinciding: 10
  Average estimated cost: Home Energy Scoring Tool $800-900 (a); EPS Auditor $400-2000
  Average estimated savings per year: Home Energy Scoring Tool $96; EPS Auditor $171 (b)

Attic Insulation
  Most common recommendation (number receiving recommendation):
    Home Energy Scoring Tool: "Attic: Increase attic floor insulation to..." R-19 (N=3), R-30 (N=2), R-38 (N=7), R-60 (N=1) (Total N=13)
    EPS Auditor: "Insulate attic to R-49." (N=21)
  Number coinciding: 11
  Average estimated cost: Home Energy Scoring Tool $1215 (a); EPS Auditor N/A (cost provided is per square foot)
  Average estimated savings per year: Home Energy Scoring Tool $132; EPS Auditor $132 (b)

Floor or Basement Insulation
  Most common recommendation (number receiving recommendation):
    Home Energy Scoring Tool: "Basement/crawlspace: Insulate the floor above unconditioned space to at least..." R-19 (N=1), R-25 (N=2), R-38 (N=9) (Total N=12)
    EPS Auditor: "Fully insulate floor joist cavity with fiberglass or cellulose (Savings assumes R30)." (N=11); "Insulate floor cavity with spray foam." (N=2) (Total N=13)
  Number coinciding: 7
  Average estimated cost: Home Energy Scoring Tool $1121 (a); EPS Auditor N/A (cost provided is per square foot)
  Average estimated savings per year: Home Energy Scoring Tool $173; EPS Auditor $95 (b)

Duct Insulation
  Most common recommendation (number receiving recommendation):
    Home Energy Scoring Tool: "Ducts: Add insulation around ducts in unconditioned spaces to at least R-6." (N=5)
    EPS Auditor: "Wrap ducts with fiberglass insulation." (N=13)
  Number coinciding: 5
  Average estimated cost: Home Energy Scoring Tool $890 (a); EPS Auditor $200-700
  Average estimated savings per year: Home Energy Scoring Tool $160; EPS Auditor $44 (b)

(The figure also included, for each category, a scatterplot comparing estimated savings for houses where recommendations from both tools coincide; the line in each plot represents perfect agreement.)

(a) Calculated from savings and payback information
(b) Only a range is provided in the EPS report


We can see, therefore, that these recommendations differ substantially in several respects:

• Which upgrades are available to be recommended
• Whether a particular upgrade is recommended or not
• Estimated costs of upgrades
• Estimated savings resulting from upgrades

Homeowners would therefore seem likely to get quite different upgrade information from an EPS audit compared to a Home Energy Score assessment. For example, while the Home Energy Scoring Tool gives fairly low estimates for potential savings from air sealing for these houses, with little variation, the EPS gives highly varied estimates, sometimes several times higher than those from the Home Energy Scoring Tool. Similarly, the degree of improvement recommended is often different—for example, the EPS Auditor attic insulation recommendation always specifies R-49, while the Home Energy Scoring Tool recommendations varied from R-19 to R-60.

We have no standard by which to determine whether the recommendations from one system are “better” than the other. The recommendation selection process was quite different—for EPS audits, the auditor follows a protocol, but has some latitude to select what recommendations to make from the limited set available within the EPS Auditor tool; the Home Energy Scoring Tool selects the recommendation based on feasibility and payback, without any influence of the assessor. And, legitimate differences in assumptions regarding upgrades (for example, different retrofitted insulation depth) likely contribute to many of the differences noted in Figure 16.

Also, for EPS, the auditor typically included in the report a set of “customized” recommendations— which may be more important to homeowners than the “standardized” recommendations analyzed here. On the other hand, the upgrade recommendations do provide important information and guidance for homeowners looking to improve their home, particularly the cost and savings estimates.

This analysis was based on a small set of homes (n=31) and the results are dependent, to some extent, on the assumptions made to translate inputs from EPS Auditor to the Home Energy Scoring Tool and to fill in Home Energy Scoring Tool inputs for which we did not have data. However, these observed differences between the recommendations generated by the two asset tools contrast with the similarity we observed between usage estimates, indicating that upgrade recommendations for a particular home may be particularly sensitive to the process and criteria used to generate the recommendations. That is, while estimates of total energy use are similar between the two tools, the resulting upgrade recommendations and savings estimates show rather startling differences. This finding is particularly important for homeowners motivated to save money or energy—these homeowners are likely to be less interested in a score or in replicating their utility bills, and more interested in the information provided by upgrade recommendations. The differences we found may indicate an opportunity for improvement of the audit tools.


Energy Use Behavior for Seattle Respondents Does Not Match Asset Model Assumed "Standard Occupant Behaviors"

We compared certain asset model "standard occupant behavior" assumptions used by EPS Auditor and the Home Energy Scoring Tool to the self-reported energy use behavior of the Seattle households who completed an Energy Use Behavior Survey. This comparison is a means for understanding how well the asset model assumptions represent these households. This comparison is not designed or intended as a critique of the particular assumptions employed by either EPS Auditor or the Home Energy Scoring Tool; rather, we are interested in how well this set of households can be modeled by any standardized set of occupancy and behavior assumptions, and to what degree household and behavioral heterogeneity might limit the ability of asset-based models to provide useful guidance to households interested in making energy upgrades.

For this particular set of households, responses were substantially different from asset model assumptions for “standard behavior:”

Average reported energy use behaviors differed from asset model assumptions.

• Washing machine usage was assumed in the Home Energy Scoring Tool to be 6 loads per week at various cycle temperature settings.82 Respondents reported 4 loads per week on average.

• Clothes dryer usage is assumed in the Home Energy Scoring Tool to be 5 loads per week.83 Respondents reported between 3 and 4 loads per week on average.

Behavior or operation was highly variable in many cases, and not well represented by an assumed fixed value or relationship.

• For both the Home Energy Scoring Tool and EPS Auditor, occupancy was estimated based on the number of bedrooms in the home, using similar but not identical calculations. These occupancy assumptions appear to represent the relationship in the data well. However, the actual variability of occupancy around this relationship was high—estimating occupancy based on the number of bedrooms in the home therefore only represents a portion of what would be captured by including actual occupancy information in the models. Self-reported occupancy is plotted against the number of bedrooms in the home in Figure 17, which also includes EPS and Home Energy Scoring Tool occupancy assumptions.

82 EPS Auditor assumptions for clothes washing are embedded in hot water usage estimates and based on occupancy and machine efficiency, and were excluded from this comparison. 83 EPS Auditor assumptions for clothes drying are based on occupancy (and in turn on the number of bedrooms in the home), and were excluded from this comparison.


Figure 17: Self-reported and model estimated occupancy vs number of bedrooms in home. Jitter was added to self-reported values (on both X and Y axis) to make density of points visible (n=312)

• Reported usage of large appliances, including dishwashers, washing machines, and dryers, also varied widely from household to household. The Home Energy Scoring Tool, which relies on a single assumed usage value, does not appear to represent many households well. EPS Auditor was not included in this comparison, as it uses a more complex algorithm to model large appliance hot water and fuel usage, based on the number of bedrooms in the home.

• The Home Energy Scoring Tool assumes, for Seattle, that someone is home during the day—translating to an increase in the modeled hot water usage. In this set of Seattle households, the majority of homeowners report that someone is home 50% of the time or more. However, more than 1/3 of households report being only occasionally home during the day. This substantial portion of households is not well represented by this assumption. EPS Auditor does not make a distinction of this nature.

Some common behaviors and energy uses are not modeled.

• Thermostat setback behaviors84 among respondents appear to be common and more diverse than represented in model assumptions and defaults. We looked at just daytime and overnight setbacks; 76% of respondents reported at least one of these, while 44% reported setting back their thermostat both during the day and overnight. These results are summarized in Table 40. The models vary, but assume either no setback or only a daytime setback. (A small sketch of this setback definition appears after this list.)

84 For this analysis, we defined a thermostat "setback" as a period where the temperature setting was lower during that period than both the period before and the period after, based on survey responses indicating the thermostat setting for four daily time periods: mornings (6am-9am), daytime (9am-5pm), evenings (5pm-10pm), and overnight (10pm-6am). These results therefore represent only a rough approximation of actual household behaviors. Additional details are provided in Appendix T.

Table 40: Thermostat setbacks reported in the energy use behavior survey (n=308)

Setback reported                          Percent reporting
Only during the day                       2%
Only overnight                            30%
Both during the day and overnight         44%

• Respondents frequently report using various types of supplemental heating systems. Over 60% (200 of 322) report at least sometimes using supplemental heating such as gas or wood fireplaces, portable electric heaters, baseboard heaters, other wall or floor heating systems, or heating blankets. While the tools will consider primary or secondary (in EPS Auditor) baseboard or wall heating systems, in most of these homes supplemental heating was not considered in the modeling.

• The presence of other large energy-using equipment is also quite common for these Seattle households. Examples include electric vehicle charging, kilns, saunas, air compressors, medical equipment, shop tools, dehumidifiers, heated greenhouses, pool pumps and heating systems, air filtration systems, sewage and well pumps, and elevators. Compiling results from the home's audit (EPS Auditor does consider many other large energy uses) and energy use behavior survey responses, 24% of households (77 of 322) had other large energy uses that could increase their utility usage beyond what would be expected from Home Energy Scoring Tool results. EPS audit estimates included one or more other large energy uses for about 2/3 of these households.
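The setback definition in footnote 84 can be made concrete with a small sketch (ours, not the survey analysis code); the thermostat settings shown are invented.

# Sketch of the footnote 84 definition: a period counts as a setback when its
# thermostat setting is lower than both the preceding and following periods,
# over the four survey periods. Periods are treated cyclically, so overnight
# is followed by the next morning.

PERIODS = ["morning", "daytime", "evening", "overnight"]

def setbacks(settings):
    """settings: dict mapping period name -> thermostat setting (deg F)."""
    found = []
    for i, period in enumerate(PERIODS):
        before = settings[PERIODS[i - 1]]
        after = settings[PERIODS[(i + 1) % len(PERIODS)]]
        if settings[period] < before and settings[period] < after:
            found.append(period)
    return found

household = {"morning": 68, "daytime": 62, "evening": 70, "overnight": 60}
print(setbacks(household))   # -> ['daytime', 'overnight']: setbacks both during the day and overnight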

Together these factors provide a strong case that the energy use behaviors of these households—and the types of non-standard equipment households might possess—are diverse. The asset modeling approach, which relies on an assumed “standard” set of behaviors and occupancy and may exclude other large or non-standard energy uses, fails to take these factors into account when advising households on energy efficiency.

To generate a first approximation of how variable household energy use behaviors translate into changes in modeled energy usage, a limited set of behavioral factors, translated from the energy use behavior survey, along with additional technical inputs,85 was run with the asset model and weather inputs for a subset of these houses (n=101). The resulting energy use estimates were compared to the asset+weather modeled energy use for the same homes—thereby isolating the effect of the modeled energy use behavior (Figure 18).

85 Due to limitations on the analysis, a handful of additional house inputs were included in these model runs along with energy use behavior inputs. These additional house inputs represent variables not modeled in the Home Energy Scoring Tool but which are modeled in HESPro, for which information was available.

Figure 18: Histogram of fractional change in model estimated total site energy use of a home when the household’s energy use behaviors and some appliance efficiency inputs are included in the model (compared to asset assumptions) (n=101 households)

When behavior and additional house factors were considered, model estimated total site energy use was 12% less on average than when these factors were not included in the models. Households whose modeled behaviors resulted in reduced total energy usage compared to asset model assumptions are labeled in Figure 18 "Adding behaviors to model reduces energy use vs. asset model"; 84 of the 101 respondents fell into this group. For this sample of homes, incorporating energy use behaviors (as reported) leads to a large difference in model estimates compared to asset "standard" behaviors, although we were unable to differentiate the contribution of behavior from the effect of the additional technical factors. This exercise represented and modeled only a small set of self-reported behaviors, and did so very simply; modeling additional behaviors, or the same behaviors at greater resolution, may change these results.
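The quantity plotted in Figure 18 is straightforward to compute; the sketch below (with invented values) shows the fractional-change calculation and the count of homes where adding behaviors reduces modeled use.

# Sketch: fractional change in modeled total site energy when reported behaviors
# (and a few extra house inputs) replace the asset model's "standard" assumptions.
asset_kwh    = [32000, 27500, 41000, 19000]   # asset + weather run, per home (invented)
behavior_kwh = [26000, 29500, 33000, 17500]   # house + weather + behavior run, per home (invented)

fractional_change = [(b - a) / a for a, b in zip(asset_kwh, behavior_kwh)]
mean_change = sum(fractional_change) / len(fractional_change)
n_reduced = sum(fc < 0 for fc in fractional_change)   # homes where adding behaviors lowers modeled use

print([round(fc, 2) for fc in fractional_change], round(mean_change, 2), n_reduced)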

Asset Models Substantially Under- or Over-estimated Usage For a Sizable Fraction of Homes

We compared asset model estimates to utility-reported natural gas and electricity usage in the year prior to their audit for 101 Seattle homes. Recognizing that asset-based models are not designed to predict actual usage of a household, as they do not consider operational factors, this comparison provided a baseline for determining how much the addition of operational factors—weather and occupant energy use behaviors—changed the model estimates. Asset model estimates and recommendations, when delivered as part of a rapid, low-cost assessment of a home, also merit scrutiny. Model estimates of energy usage inform the upgrade recommendations and savings estimates—when modeled energy usage differs from actual usage, it is likely that savings from completed upgrades will be over- or under-estimated.

We found EPS Auditor and the Home Energy Scoring Tool estimates to be moderately consistent with utility-reported usage, but they left a substantial proportion of the sampled homes with large differences between modeled and observed usage, as summarized in Table 41 and Table 42 below.

Table 41: Percentage of homes where EPS asset model estimated usage was substantially lower or higher than utility reported usage

(Model estimate is ... than utility-reported use)
                             More than     More than     More than     More than
                             50% lower     25% lower     25% higher    50% higher
Total Site Energy Use            0%            8%           37%           20%
Total Source Energy Use          0%           14%           33%           19%
Electricity Use                  7%           22%           37%           19%
Natural Gas Use                  3%           11%           47%           31%

Table 42: Percentage of homes where Home Energy Scoring Tool asset model estimated usage was substantially higher or lower than utility-reported usage

(Model estimate is ... than utility-reported use)
                             More than     More than     More than     More than
                             50% lower     25% lower     25% higher    50% higher
Total Site Energy Use            0%            8%           44%           27%
Total Source Energy Use          0%            8%           43%           25%
Electricity Use                  4%           15%           45%           30%
Natural Gas Use                  3%            8%           46%           32%
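For concreteness, the sketch below shows one way to produce the bins used in Tables 41-43 from paired model estimates and utility-reported totals. The paired values are invented, and we assume (consistent with the percentages quoted later in the text) that the 25% columns also include the homes beyond 50%.

# Sketch: share of homes whose model estimate is more than 25% (or 50%) below
# or above utility-reported usage. The 25% bins include the 50% homes.
def bin_shares(modeled, observed):
    diffs = [(m - o) / o for m, o in zip(modeled, observed)]
    n = len(diffs)
    return {
        "more than 50% lower":  sum(d < -0.50 for d in diffs) / n,
        "more than 25% lower":  sum(d < -0.25 for d in diffs) / n,
        "more than 25% higher": sum(d > 0.25 for d in diffs) / n,
        "more than 50% higher": sum(d > 0.50 for d in diffs) / n,
    }

modeled  = [31000, 24000, 45000, 18000, 39000, 27000]   # invented annual totals
observed = [26000, 23000, 30000, 19000, 36000, 12000]
for label, share in bin_shares(modeled, observed).items():
    print(f"{label}: {share:.0%}")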

Presuming that homeowners use the asset-based audit results that they receive to inform their decisions about home upgrades, model over- or under-estimates of usage may also translate into over- or under-estimated savings for particular upgrades. For cases where model estimates are substantially higher than the actual usage, apparently quite common in this set of homes, it is likely that households will not achieve the energy or energy cost savings from upgrades as predicted in the model.

For cases where model estimates are substantially below the actual usage, though less common in this set of homes, opportunities for upgrades may be missed and the models may underestimate savings. However, high utility usage relative to model estimates in these cases could be related to other large energy uses that are not within the scope of the modeling and wouldn’t necessarily see any improvement from the upgrades (for example, electric vehicle charging). EPS Auditor does consider certain other large energy uses, while the Home Energy Scoring Tool as currently implemented does not.

While we found that asset model estimates on average ran higher than actual usage, this varied by modeling tool and by fuel. Both tools substantially over-estimated natural gas usage (EPS Auditor estimates averaged 23% high and Home Energy Scoring Tool estimates were 27% high compared to utility-reported usage) while electricity usage estimates were much closer (EPS Auditor estimates averaged 1% low and Home Energy Scoring Tool estimates were 11% high). Both site and source estimates of total energy use were therefore high on average—consistent with what was found by Polly et al. (2011). However, in this case the high total estimates appeared to be mostly a function of high average natural gas estimates in the 68 homes (of 101 total) that used gas.

In general, these differences between model estimates and utility-reported usage are an indicator that the models may not provide reliable guidance to homeowners looking to make upgrade decisions. However, this analysis does not provide any indication of whether the model-generated asset scores are “correct”—the score depends on model assumptions and is only loosely related to actual utility usage.

Adding Weather+Behavior Improved Model Estimates, but Usage was still Under- or Over-Estimated for Many Homes

In addition to asset model estimates from EPS Auditor and the Home Energy Scoring Tool, we completed two additional sets of model runs:

• House+Weather Runs: Historical weather conditions were used in the simulation in addition to "asset" house technical inputs.

• House+Weather+Behavior Runs: A small set of household-reported energy use behaviors were used in the simulation in addition to a handful of house characteristics not accepted by the Home Energy Scoring Tool, historical weather conditions, and "asset" house technical inputs.

To account for weather in these comparisons, we chose to use historical weather strips for Seattle in the model runs, instead of the more common procedure of completing model runs utilizing “typical meteorological year” weather files86 and normalizing the utility billing data based on heating and cooling degree days. Our approach uses historical “actual meteorological year” weather strips we generated and relies upon the DOE2.1E simulation engine to adjust heating and cooling loads for the modeled homes according to these weather conditions. We constructed these historical weather strips from reported weather conditions in Seattle, synthesized into year-long “actual meteorological year” weather files.

Each approach to accounting for weather when comparing model estimates to utility bills has advantages and disadvantages:

• Weather normalization of utility usage data tends to rely exclusively on heating and cooling degree days and does not normalize for other important weather variables such as wind speed and solar radiation. Normalization also relies on a determination of the base temperature from which degree days are measured; this, in combination with the regression model fit, leads to variability affecting the determination of baseload. Additionally, these factors tend to be dynamic in actual homes—thermostat setpoints vary and help determine what the base temperature is, for example. (A simple illustration of this normalization procedure appears after this list.)

86 EPS Auditor, the Home Energy Scoring Tool, and Home Energy Saver Pro all utilize the TMY2 “typical meteorological year” weather representing typical weather for Seattle between 1961 and 1990 (Marion 1995).


• Running historical weather conditions through the building simulation addresses some of these concerns: besides hourly temperatures, we were able to include wind, solar radiation, and humidity information. Model inputs included information on thermostat settings and baseload uses—therefore, issues of base temperatures were accounted for directly within the modeling. On the other hand, these factors were buried within the modeling and are therefore not transparent. Additionally, the weather history was pulled from two separate data sources, and small gaps were present and had to be filled—possibly contributing minor discrepancies to outputs. Finally, we developed a special emulator so we could use custom weather files; this limited our use of historical weather conditions to Home Energy Scoring Tool and HESPro model runs. EPS Auditor was therefore excluded.
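For contrast with the simulation-based approach we used, the following is a minimal sketch of the degree-day normalization procedure described in the first bullet above; the base temperature, billing periods, and typical-year degree-day total are all illustrative assumptions, not values from this study.

# Sketch: regress billed usage on heating degree days at an assumed base
# temperature, then scale to a typical-year degree-day total.

BASE_TEMP_F = 65.0   # assumed base temperature; real analyses often fit or vary this

def heating_degree_days(daily_mean_temps_f):
    return sum(max(BASE_TEMP_F - t, 0.0) for t in daily_mean_temps_f)

def fit_usage_vs_hdd(period_hdd, period_kwh, period_days):
    """Least-squares fit of usage per day = baseload + slope * HDD per day."""
    x = [h / d for h, d in zip(period_hdd, period_days)]
    y = [k / d for k, d in zip(period_kwh, period_days)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    baseload = my - slope * mx
    return baseload, slope

def normalized_annual_kwh(baseload_per_day, slope, typical_year_hdd):
    return baseload_per_day * 365.0 + slope * typical_year_hdd

# Made-up billing periods: (HDD in period, kWh in period, days in period)
periods = [(420, 1450, 30), (310, 1180, 31), (120, 760, 30), (20, 540, 31)]
base, slope = fit_usage_vs_hdd(*zip(*periods))
print(round(normalized_annual_kwh(base, slope, 4500)))   # 4500 HDD = hypothetical typical year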

When weather and behavior (and a handful of additional house characteristics) were considered, the consistency of model estimates with observed utility usage improved. For example, the Pearson correlation coefficient increased from .58 to .68 for total site energy, the mean estimated total site energy use was 1.2% higher than utility bills instead of 19.3% higher, and the percent of modeled homes with total site energy greater than +/-25% off from utility bills dropped from 52% to 39%.

This confirms that the models better represented actual usage when basic operational considerations were included, as expected. Occupancy and behavior appear to be important in making modeling sense of a household’s energy use—consistent with other research (e.g., Cayla et al. 2010; Lutzenhiser and Bender 2008; Woods et al. 2010).

However, despite the improvements noted from including operational considerations in the modeling, for a substantial portion of the 101 homes studied, large differences between model estimates and utility-reported usage remain. These results are summarized in Table 43; additional details are provided in Appendix Q.

Table 43: Percentage of homes where HESPro house+weather+behavior model estimated usage is substantially higher or lower than utility reported usage

(Model estimate is ... than utility-reported use)
                             More than     More than     More than     More than
                             50% lower     25% lower     25% higher    50% higher
Total Site Energy Use            0%           11%           28%           13%
Total Source Energy Use          0%           19%           22%           10%
Electricity Use                  8%           26%           23%           11%
Natural Gas Use                  3%           13%           36%           24%

Compared to the asset model runs, adding consideration of operational factors decreased the number of households where model results overestimated utility-reported energy but increased the number where they underestimated it. Overall, though, estimates were close for more homes, with 61% of homes modeled within +/-25% of utility-reported total site energy use, while 48% of the Home Energy Scoring Tool asset model estimates were this close. This is partially the effect of considering the weather, but the majority of this difference appears to be associated with behavior and the additional "house" inputs. The Energy Performance Score 2008 Pilot report (Earth Advantage Institute and Conservation Services Group 2009) found that models estimated total energy use within +/-25% for 50% to 60% of homes—this is consistent with our findings for both asset modeling and the operational model runs.

We also found that, when considering operational factors with the HESPro runs, the average of model estimates was much closer to average utility-reported usage for these households than with the asset model runs using the Home Energy Scoring Tool. Again this varied by fuel. The average of the HESPro (including weather and behavior) estimates was 8% higher than average utility-reported natural gas usage, while the average of the Home Energy Scoring Tool (asset only) estimates was 27% high compared to average utility-reported natural gas usage. For electricity usage, the average HESPro estimate was 6% lower than average utility-reported usage while average Home Energy Scoring Tool estimates were 11% high. Both average site and source estimates of total energy use from HESPro with weather and behavior were quite close to average utility-reported usage for the tested households—1% high for site energy, and 3% low for source energy.

Asset modeling has recently been positioned as one element87 of a home energy assessment approach intended to provide low-cost and valuable, but lower-resolution, guidance to homeowners—including upgrade recommendations and savings estimates (Home Energy Score 2012). The substantial improvement in model estimates that we observed from including behavioral considerations suggests that there is an improvement in resolution when energy use behaviors are considered in the modeling.

While adding operational factors into the modeling runs does appear to improve the model estimates substantially, there remain a number of cases of large under- and over-estimation. This suggests that even for audits that consider household occupant behaviors, model guidance—particularly savings estimates for recommended upgrades—may still sometimes be quite misleading, at least for a portion of households.

Seattle Average Energy Score Listed on the EPS Scorecard is Similar to Average Score of Homes Receiving the SCL Home Energy Audit
We analyzed 1,435 Seattle audits and found a mean total estimated site energy use of 30,560 kWheq/year, with a median just 4% lower (29,380 kWheq/yr). This mean is 9% higher than the "Seattle Average" energy score of 28,000 kWheq/yr used as a reference point on the EPS Scorecard, lending credence to the expectation that the reference average is close to the mark.
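The comparison follows directly from the reported figures; a quick arithmetic check (values taken from the text above):

```python
mean_kwheq = 30_560      # mean estimated total site energy, kWheq/yr
median_kwheq = 29_380    # median estimated total site energy, kWheq/yr
seattle_avg = 28_000     # "Seattle Average" reference point on the EPS Scorecard

print(round(100 * (mean_kwheq / seattle_avg - 1)))    # 9  (% above the reference)
print(round(100 * (1 - median_kwheq / mean_kwheq)))   # 4  (% below the mean)
```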

6.4 Conclusions

Before stating our conclusions, we want to underscore that these results are based on subsets of homeowners participating in the SCL Home Energy Audit Program for whom data were available, and are not representative of either the SCL Home Energy Audit Program participants or Seattle households overall. Various assumptions were necessary for converting available data to model inputs, and data were limited in many cases. Even with these caveats, we feel that the results generally support a number of working conclusions:

87 The other element is the use of a very streamlined set of house inputs for the modeling.

 The relatively high consistency between EPS Auditor and Home Energy Scoring Tool usage estimates for total household energy usage, and therefore scores, provides support for the idea that different asset modeling tools can generate similar scores or ratings, particularly if they utilize similar assumptions regarding "standard occupant behavior."

 The upgrade recommendations generated using the two asset-based modeling tools appear to be quite sensitive to the particulars of the tools and to how they and the auditors generate upgrade recommendations. For homeowners looking for information on how to upgrade their home, which was common among the Seattle households we spoke to (Chapter 4), recommendations and savings estimates are important to achieving completed upgrades. Therefore, how these tools are used to select upgrades and estimate costs and savings may deserve additional attention.

 Model-estimated usage varies considerably from utility-reported usage for many modeled homes. This is the case for the asset modeling runs and, to a lesser degree, for models that include basic consideration of occupancy, energy use behaviors, and weather. Households with large differences between model estimates and actual usage likely receive upgrade savings estimates that are substantially higher or lower than they would get if the model estimates were close to their actual usage.

 Asset model estimates from both EPS Auditor and the Home Energy Scoring Tool substantially overestimated average total utility-reported energy usage for this set of homes, consistent with findings from Polly et al. (2011). When weather and occupant behavioral inputs were included in the modeling, HESPro average estimated usage was quite close to utility-reported usage for these homes. While this result could be coincidental and attributable to the particular occupancy and behaviors of this set of households, it suggests that assumptions regarding "standard occupant behavior" could contribute to the tendency of home energy models to overestimate utility usage.

 The difference between model energy usage estimates and observed utility bills suggests that asset modeling tools, if used to give guidance to households looking to reduce their actual energy bills, could benefit from more and higher-quality information. Reduction of measurement and interpretation variability and of data entry errors may help. Use of occupancy and behavior information may also help. Use of actual billing history may provide an alternative means for bounding savings estimates, for example, preventing cases where the model estimates savings from completed upgrades that exceed the energy actually used in the house (Khawaja and Koss 2007).

 Overall, we found indications that the information generated by these home energy audit modeling tools may be generating less-than-ideal guidance for homeowners, in terms of energy use estimates and possibly upgrade recommendations. When operational inputs were included in the modeling, estimates improved but still showed wide variability. The Energy Performance Score 2008 Pilot report suggested criteria for the minimum accuracy of tools used to generate Energy Performance Scores (Earth Advantage Institute and Conservation Services Group 2009): compared to weather-normalized utility usage, model-estimated total energy use should fall within +/-25% for 70% of homes and within +/-50% for 90% of homes (a sketch of how such a check could be applied follows this list). The EPS Auditor model comparisons we performed did not take weather into consideration, but the Home Energy Scoring Tool (asset+weather) and HESPro (house+weather+behavior) model runs fell short of these proposed criteria: Home Energy Scoring Tool (asset+weather) total site energy estimates were within +/-25% of utility-reported usage for 51% of homes and within +/-50% for 73% of homes, while HESPro (weather+behavior) runs were within +/-25% of utility-reported usage for 61% of homes and within +/-50% for 87% of homes, close to the proposed criteria. While we do not endorse their particular criteria, our findings support the implication that residential modeling tools should undergo validation to demonstrate sufficiency for their intended use, whether for producing asset home scores or for generating upgrade recommendations for current owners of a home. Similarly, Polly et al. (2011) have suggested the need for a rigorous methodology to "assess and improve the accuracy of whole-building energy analysis for residential buildings." Depending on how model outputs are to be used, validation need not be limited to comparison of model estimates against actual utility use, but may also include assessment of upgrade recommendations and savings estimates.
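As an illustration of how such a validation check could be applied, the sketch below is our own hypothetical example (not taken from the Pilot report or from any of the tools discussed); it tests whether paired model estimates and weather-normalized bills meet the two thresholds described above:

```python
def meets_pilot_criteria(modeled, normalized_bills):
    """True if >=70% of homes are within +/-25% and >=90% are within +/-50%."""
    errors = [abs(est - bill) / bill
              for est, bill in zip(modeled, normalized_bills)]
    within_25 = sum(e <= 0.25 for e in errors) / len(errors)
    within_50 = sum(e <= 0.50 for e in errors) / len(errors)
    return within_25 >= 0.70 and within_50 >= 0.90

# The HESPro (weather+behavior) runs above (61% within +/-25%, 87% within +/-50%)
# would narrowly fail this check; the asset-only runs fall further short.
```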

Recommendations for Improving Tools and How They Are Used

Based on these findings, we make the following recommendations for improving home energy audit tools:

 Pursue validation of home energy audit tools to ensure that models generate information that is sufficiently accurate and reliable for the intended uses and audiences, whether homeowner, homebuyer, or others, and whether producing asset ratings, generating upgrade recommendations, estimating savings from completing upgrades, or generating information for other purposes.

 More fully acknowledge the potential risks (to homeowner and to program) of using asset-based assessments to guide homeowners, and consider alternatives that incorporate occupant behavior.

o Assess the opportunity for operations-based home energy assessments to give customized recommendations on the most effective behavioral changes and potential savings estimates, a behavioral analogue to the technical upgrade recommendations currently provided by most assessments.

o Consider combining asset and operational functionality: provide an asset score AND operationally-informed recommendations to existing homeowners. Inform the operational assessment with self-reported key energy use behaviors, actual utility billing history, or both.


7. Conclusions

Overall Research in Perspective

The conventional perspective regarding home energy upgrades is that the technical potential for upgrades has not been achieved because homeowners face informational, initial cost, and other "barriers" which prevent them from taking advantage of the opportunities these upgrades present. This leads to policies and programs that focus on providing information and defraying up-front costs (Sanstad et al. 2010). Two recent reviews of the literature on homeowner decision-making and adoption of home energy retrofits highlight the limits of this perspective, provide a variety of other perspectives on the "problem" of lower-than-expected uptake, and suggest that home energy audit programs should adopt a more sophisticated view of homeowner decision-making (Sanquist et al. 2010; Sanstad et al. 2010). We compare the findings from our study of participants in the Seattle City Light (SCL) Home Energy Audit Program to select findings from these reviews and related literature in Table 44.

The question is, to what extent can the often ambitious objectives of home energy audit programs—the view of what homeowners should do or should want from the top-down perspective of those who finance, justify, and design programs—match the realities of what motivates people to upgrade, the retrofit supply chain, the practical state of the art of audit and modeling quality, and the private and public benefits and costs of completing these upgrades? Home energy audit programs may do very well within their limits, as the SCL Home Energy Audit Program seems to have done, but the limits may be appreciable. The theory of home energy audits—that they should revolve around house and system efficiency, promote technical change, focus on cost-benefit ratios, and that most homeowners should be interested—may assume that homeowners can be converted to this theoretical view more than is actually possible. Rather, it seems advisable to attend in more detail to what homeowners need and want. Our research underscores the importance of a shift toward homeowner perspectives, requiring not a simple repackaging of energy efficiency but a fuller appreciation of the position of the homeowner and the personal nature of homes. Based on the findings from this project, we offer recommendations for going forward.


Table 44: Findings from the SCL Home Energy Audit study compared to the literature on home energy audits and retrofits

Literature: While information on possible upgrades, costs, and savings are important factors for homeowner decisions, they are not a sufficient basis for intervention. Instead, homeowner decisions are much more heterogeneous (Sanstad et al. 2010).
SCL study: Most participating homeowners were interested in customized, detailed information. Many wanted better information on costs and benefits. But they expressed a variety of motives for seeking an audit and for completing upgrades. Most upgrade decisions did not appear to take the form of a simple economic cost-benefit investment decision but instead incorporated many other factors.

Literature: Cost matters to homeowner upgrade decisions but does not appear to form an "initial cost barrier." Rather, an inability to afford upgrades tends to systematically limit upgrades by low-income households (Sanstad et al. 2010).
SCL study: Most homeowners in this group paid out of pocket for the work they completed, even for amounts more than $10,000. Expense, as well as questionable return on investment, certainly deterred some investments, especially given that homeowners seemed to have expected more financial subsidies than were actually available at the time. Most of the homeowners we interviewed did not feel that it was "worth it" to take on debt to improve the energy efficiency of their home—indicating that loan programs, as a means for addressing the "initial cost barrier," may have limited salience for these middle to upper income homeowners.

Literature: Many retrofits are undertaken because of aesthetics, instinct, or emotions (Crosbie and Baker 2010; Sanquist et al. 2010; Wilson 2008).
SCL study: Many completed upgrades related to doing things that were easy to do, had clear consequences related to specific issues, or were lower-cost or lower-hassle.

Literature: Decisions tend not to take place at a single decision point but rather are part of a decision process (Wilson 2008).
SCL study: A variety of contextual factors affected decision-making. Cycles of equipment replacement appeared to influence appliance upgrade timing. Also, many of these homeowners seemed particularly active in completing energy upgrades prior to their audits, and many were already considering completing energy upgrades along with other home improvement activities.

Literature: Most completed upgrades tend to be lower cost, simpler measures such as air sealing (Sanquist et al. 2010, citing Stern 1985).
SCL study: The pattern of retrofits completed by Seattle respondents appears to be consistent with this finding, with half of homeowners who reported expenses for upgrades saying that they spent less than $1,500 to complete them and one in four saying they spent $200 or less.

Literature: Behavior matters in terms of baseline consumption and outcomes of retrofits (Sanstad et al. 2010).
SCL study: Our modeling results indicated that incorporating self-reported energy behaviors into home energy audit tool modeling could substantially increase the degree to which modeled total energy use matched observed bills. We could not test the extent to which this improvement could improve the quality of end use estimates, cost-effectiveness estimates, or upgrade recommendations. Homeowners seemed less interested in energy efficiency per se or in the "asset" assessment of their home than in improving their own living conditions and saving energy and money.

Literature: Uncertainty in savings predictions translates to a risk to homeowners choosing to "invest" in retrofits to generate savings (Sanstad et al. 2010).
SCL study: Our modeling findings underscore that savings predictions generated by models are subject to large uncertainty, in that the models tested often substantially differed in what recommendations they gave and how much savings could be expected. Also, the model estimates of total consumption did not match bills very well, especially when evaluated without behavioral variables. The EPS audit reports received by participants often reflected wide ranges in expected savings as well as in expected retrofit costs; sometimes the expression of uncertainty confounded action.

Literature: Middle to upper income, higher educated, and Caucasian ethnicity households are more likely to participate in programs (Sanquist et al. 2010) and to retrofit (Sanstad et al. 2010).
SCL study: Survey respondents as a group tended to be considerably higher-income and more highly educated than typical for Seattle, though there was still appreciable diversity in demographic characteristics and in what homeowners wanted from the audit and retrofits.


In the Seattle audits we examined, the auditor appeared to often function as an expert agent and advisor, and seemed to have an important influence on what homeowners did or did not do. Compared to presenting homeowners with standardized recommendations selected solely on estimated costs and energy savings, skilled and engaged auditors may provide critical additional value by being better able to judge the multi-dimensional nature of home energy upgrades—comfort, hassle, risk, safety, reliability of savings estimates, the present circumstances and plans of the homeowners, and so on—than software can, while perhaps offering inspiration and personal guidance to the homeowner. Additionally, based on our modeling tools analysis, home energy audit modeling tools do not necessarily live up to expectations in generating specific, scientific, and accurate guidance for homeowners. Until modeling tools are rigorously validated against performance criteria, this factor lends further support favoring auditors as the primary source of guidance for the homeowner. Positive auditor-homeowner interactions were by design and not accidental in this program. However, the auditor role ended with delivery of the home energy audit report, suggesting that homeowners could benefit from continued support as they complete upgrades, whether by the auditor or another function, for example an “energy advisor” or “energy advocate” (Van de Grift and Schauer 2010).

Diagnostic testing was also a memorable and compelling element of these audits for many homeowners, particularly the blower door testing and infrared imaging. These portions of the site visit appeared to effectively engage the homeowner, highlight heat loss, and identify problems that homeowners were more likely to act on. Air sealing was recommended to a high majority of households (92%). Air sealing activities, including caulking, weather-stripping, and patching gaps, were the most commonly completed type of recommendation as well, even though most homeowners have undoubtedly heard these activities recommended before.

We also found that an asset-oriented audit may not align well with all homeowners' primary interests. The asset-based home rating was interesting enough to most homeowners, and is something that almost all claimed they would want to see when buying a home or would reveal when selling a home. Few of the homeowners we surveyed and interviewed were planning on selling their home in the near term, and most seemed more interested in making home improvements to improve their own living conditions than in directly affecting the market value of their home. Interviews indicate that people who were motivated to make changes (or had already decided to) and just wanted to know what to do were less likely to be motivated by an asset rating itself than those seeking to compare their home to others. More important, because they assume standardized usage of the home, asset-based recommendations and cost and savings estimates may be quite different from what would be recommended if how the homeowner actually used the home were considered. Household use of energy is highly variable; good recommendations for frugal users, for example, may be much different than those for liberal users. Our modeling results indicated that accounting for actual use—whether through bills or through operational data—might lead to important changes in recommendations and savings estimates, as others have also noted (e.g., Khawaja and Koss 2007). Additionally, some homeowners seemed interested in receiving advice on their energy use behaviors and operation of the home. Operationally-focused audits could provide the opportunity to offer this type of specific advice, e.g., when to use a portable heater rather than the central heater, how much a household could save by turning down the heat at night, or how much a given appliance uses. It may also be possible to combine asset and operational functionality within a home energy audit tool: retaining the asset rating while utilizing operational inputs to generate recommendations, potentially enabling higher-quality and broader-spectrum guidance for homeowners.
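As a concrete illustration of the billing-history bounding idea attributed above to Khawaja and Koss (2007), the hypothetical sketch below caps a modeled savings estimate at the energy the household actually bought; the function name and the end_use_share parameter are illustrative assumptions, not features of EPS Auditor, HESPro, or the Home Energy Scoring Tool:

```python
def bounded_savings(modeled_savings_kwh, billed_kwh, end_use_share=1.0):
    """Cap modeled savings at the billed usage attributable to the affected end use."""
    ceiling = billed_kwh * end_use_share    # the most that could possibly be saved
    return min(modeled_savings_kwh, ceiling)

# A model predicting 9,000 kWh of heating savings in a home whose bills imply only
# about 6,000 kWh of heating use would be capped at 6,000 kWh.
print(bounded_savings(9_000, billed_kwh=12_000, end_use_share=0.5))   # -> 6000.0
```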

Many homeowners appeared to find the guidance provided by the audits quite useful, especially when auditors provided customized or prioritized recommendations. Still, uncertainty in retrofit savings and cost estimates, difficulties in going about completing some retrofits, the high cost of completing some of the recommendations, and unremarkable expected payback are all reasons that many retrofits were not done. Assertions that a particular retrofit would or could be cost-effective or that it would increase energy efficiency were just two among several factors that appeared to influence homeowner decisions to complete upgrades. Relatively actionable upgrades that resolved tangible problems such as leaks, comfort issues, or safety concerns, as well as smaller, lower-cost, or do-it-yourself upgrades, seemed to have more natural appeal. This may reflect entirely reasonable choices by homeowners who were unwilling to take on risk from uncertain returns on larger upgrade investments and who instead chose upgrades that were inexpensive and easy to do or that promised more direct benefits.

Those who did undertake upgrades seemed pleased with them, although the non-energy benefits often seemed more tangible and may outweigh energy- and money-saving benefits, at least in the short term. Reductions in utility bills due to upgrades are difficult to positively identify given the natural variability in utility bills, but they are also difficult to disprove; homeowner satisfaction is therefore not a reliable indicator of whether actual energy savings met homeowner, or program, expectations. The utility consumption data we collected were not sufficient to statistically test the degree of energy savings. We generally found good-quality work on the completed upgrades that were inspected for the project. It is unlikely, therefore, that the quality of the upgrade work substantially compromised the benefits that most homeowners saw from upgrades, as has been noted in some other home energy audit programs (Shapiro 2011; NREL 2010).

Demographically, the group of surveyed homeowners tended to be considerably higher-income and more highly educated than other homeowners in the area, but there were still substantial differences in household characteristics and in what various homeowners appeared to want from audits. While many expressed sentiments in alignment with typical program promotion, e.g., energy efficiency, many did not. Other participants seemed to be interested in diagnosing and solving concrete problems such as high bills or comfort issues, or in shaving off monthly costs. Many auditors appeared to tailor their discussions to the wants and needs of homeowners, as they perceived them.

Finally, this research drew largely upon a generally aware, enthusiastic, often highly-educated, and affluent slice of Seattle homeowners—those voluntarily seeking out a home energy audit and who were subsequently willing to talk to us at length. People who do not sign up for a home energy audit may behave quite differently. If home energy audits are to be expanded, there is a research need to look at the circumstances and expectations of households who do not currently participate in home energy audit programs, and to understand their goals and decision-making rationale.

By many criteria, the SCL Home Energy Audit Program led to successful outcomes, with participants indicating that they were pleased with their involvement in the program. In many instances, these audits led to quality upgrades that improved the condition of participants' homes, and they may have led to lower energy consumption in upgraded homes. Considering the increased national interest in home energy audits and the ambitious claims as to what home energy audits can and should do, our major overarching impression is that the home energy audit and retrofit "problem" may be less about information than about over-ambitious expectations of what audits can reasonably do and about how compelling conventional recommendations for upgrades are. In short, the lukewarm interest in home energy retrofits (Palmer et al. 2011) might be less about "barriers" that can be overcome with reasonable levels of policy intervention than about the fact that, all things considered, the potential benefits are strongly compelling in only a limited number of situations.

What we suggest, then, is that in planning home energy audit programs in general, it is useful to consider how much of what is being offered makes good sense and is compelling to potential participants. This perspective requires taking a more home- and owner-centered view than programs appear to typically adopt. It may also mean a harder look at the influence of personal interactions with a trusted expert such as an auditor or energy advisor; the quality, accuracy, and customization of the guidance being offered; in what cases diagnostic testing is warranted; and when asset or operational assessments are most useful. Of course, homeowners do not necessarily know what is possible or what will inspire them, and certain elements—the auditors themselves, or the blower door test—may sometimes be transformative.


Key Findings

Homeowner Circumstances and Motivation

Survey respondents represented a variety of demographic circumstances, but overall were wealthier, more educated, and older than typical for Seattle. The program was not widely advertised; less than 1% of eligible customers completed an audit. This underscores our overall impression, from talking to homeowners and auditors, that participants were often particularly interested in energy consumption or energy efficiency. The characteristics of the homeowners that were surveyed are consistent with reported participation in previous home energy audit programs (Sanquist et al. 2010). The extent to which results from studying participant populations should be extrapolated to apply to non-participants is a typical challenge of program-oriented research.

Overall, these homeowners appeared to be quite motivated and oriented towards completing upgrades when entering the program, though some may have been more “curious” than predisposed to complete upgrades. When asked, after their audits, about their motivation for signing up for the audit, homeowners expressed a variety of reasons for undertaking the audit that indicated a predisposition to upgrade, including a desire to reduce energy costs (26%), improve energy efficiency (23%), address comfort problems (13%), or contribute to environmental sustainability (10%). However, a substantial portion of homeowners (23%) appeared to be largely motivated by curiosity about how their home worked or by the discount on the audit cost; these homeowners may not have been oriented to making a big investment. Nearly three-fourths of the homeowners had already done some energy upgrades to their home; many of these were looking for what to do next, though some wanted to figure out why they were not saving as much from those upgrades as they expected.

For the mostly middle-to-upper income households receiving these audits, “financing” was not stated as a major consideration. Interviews early in the audit process indicated that most households would pay out of pocket for any upgrades they would complete. Surveys after retrofits had been completed bore this out: few respondents reported taking out a loan to pay for the energy upgrades they had completed. Rather, most (83%) said they used cash or savings, and most of the rest said they paid for the work with credit. Those who took out a loan said that they had spent at least $10,000, some much more, while some paid cash even for jobs over $10,000.

Audit Site Visit—The Homeowner Experience

The auditor was integral to the homeowner's overall reaction to the audit. Many homeowners identified the importance of face-to-face discussions with the auditor, and appeared to be energized by the auditor's enthusiasm. Homeowners frequently reported that the discussion with the auditor at the end of their visit was the most informative part of the whole process. The audit site visit was designed to allow time for these interactions. Conversely, a few households felt that their auditor was not sufficiently oriented to their own circumstances. For many households, the home energy audit may be more about interactions with the auditor than the report or detailed calculations. However, the auditor's role typically ended with the delivery of the audit report and response to any follow-up questions—auditors reported limited post-audit follow-up, in line with the SCL program design.


Blower door testing and infrared thermography seemed particularly compelling to the homeowner, and may have motivated higher levels of weatherization and local air sealing, and possibly led some homeowners to think of energy use in their homes in a different and more sophisticated way. These diagnostic techniques make invisible energy flows and leakage problems tangible and offer a good spectacle. Also, after upgrades are completed, blower door and combustion safety testing provide useful checks to make sure that upgrades resolve—and do not create—health and safety problems.

How Scores were Perceived

While program-participant Seattle homeowners found their home energy performance scores to be interesting, only a modest fraction of the audit participants appeared to be drawn by the opportunity to get an energy or carbon rating for their home. Less than half of the interview respondents said that they knew they would receive such a rating, and the rating may not have played a strong role in the marketing of these audits. The homeowners we talked to did not necessarily represent typical consumers of home ratings or scores, as the majority were not planning on selling their home in the near future. Instead, auditors thought the scores helped the homeowner make sense of their report. SCL's program design was a deliberate balance of providing a comparable asset-based home energy score with a report providing more tailored recommendations for upgrades.

When asked to imagine that they were buying or selling a home, Seattle respondents, who were familiar with the scores after receiving an audit, consistently expressed an interest in seeing the scores of homes they were thinking about buying (95%). Respondents also indicated that they would be willing to share their home's score with potential buyers (82%). However, these homeowners were rarely involved at the time in selling their home or buying a new one, so the real estate context for this use of the scores was hypothetical.

When asked about the home purchase or sales process, real estate professionals saw qualified potential for ratings to provide information useful for the home buying decision. The real estate professionals we interviewed expressed concern that efficiency is not generally an advantage for existing homes compared to new homes, and also distinguished between marketing the energy efficiency features of a home and marketing the efficiency of the home as a whole.

How Reports and Recommendations were Perceived by Homeowners

Auditors often provided extensively customized upgrade recommendations and other audit report content, and were encouraged to do so by Seattle City Light and the audit report format. Homeowners appreciated the customization, and those who received auditor-customized recommendations were more likely to complete more upgrades than those who did not. Auditors frequently developed this separate list of recommendations, tailored to what they perceived as the needs of the homeowner, often including a personal note to the homeowners. Seventy-five percent of respondents to one of the surveys received this type of list, and in two-thirds of these lists the recommendations were prioritized.

Still, many homeowners expressed the desire for more information and guidance from the reports. Some homeowners expressed an interest in getting more recommendations, particularly achievable, low-cost suggestions, or a greater level of detail on their home's issues, such as inclusion of infrared photos and other testing results. Some homeowners asked for more practical support on how to complete recommended upgrades, or more information on what upgrade subsidies or other financing incentives were available. A few households felt that the recommendations they received were too standardized, or not actionable for various reasons. Conversely, some homeowners said that the report was too detailed—indicating that not all homeowners necessarily wanted the same information from their audit.

The audit reports that homeowners received often gave very broad ranges for cost and savings estimates, which led to frustration for some auditors and homeowners. On the other hand, broad ranges may be more reflective of real modeling uncertainty than point estimates would be. In some cases the ranges provided to Seattle homeowners may have been unnecessarily broad (e.g., encompassing saving nothing and saving all end use costs). Still, both auditors and homeowners often found the estimates useful as a starting point, and doing without them does not seem to be a credible option. Typically, the EPS audit report provides a single number for the estimated fuel use after upgrades and for the approximate annual savings in dollars. However, the audit report implemented for SCL provided ranges instead of a single number for estimated fuel costs after upgrades and for the approximate annual savings in dollars after upgrades.

Upgrades Completed after the Home Energy Audit

Homeowners seemed to prefer actionable, do-it-yourself, problem-solving, interesting, and lower-cost upgrades, though some households did bigger upgrades, too. In particular, homeowner action seemed focused on solving problems, such as plugging leaks or fixing safety problems, as compared to more abstract, less-visible energy efficiency upgrades. In addition, these types of problem-solving upgrades may be easier to complete for many households. Of the homeowners we talked to at our latest survey point (an average of 8 months after the audit), 43% said that they had not yet completed any recommendations, although many of these (67%), and many of those who had completed at least one recommendation (52%), still had plans to do upgrades in the future. Homeowners often cited costs, weather, family schedules, and waiting to bundle the job with other improvements as reasons for waiting, as well as, especially for white goods, waiting until the appliance reached the end of its life. Nearly three in every four homeowners said that they had already completed upgrades prior to their audit, possibly leaving a reduced set of compelling retrofits for auditors to recommend and homeowners to complete.

The blower door and infrared thermal imaging offered to participants seemed to make air sealing recommendations particularly appealing, based on our discussions with survey participants, while insulation upgrades, particularly wall insulation, were completed less often, frequently due to expense, disruption, and technical challenges. Appliance upgrades appeared to depend more on whether the appliance otherwise needed replacement. The most common recommendation provided to homeowners was air sealing (89%), followed by attic insulation (71%). Sixty percent received a recommendation to replace their dishwasher, refrigerator, or washing machine. Over half of audited households also received recommendations to insulate walls (61%) or install a high efficiency or tankless water heater (54%). Less frequent, but still common, were recommendations to seal or insulate ducts, replace the heating system, insulate the floor, or add storm windows or high efficiency windows.


We conducted a post-retrofit assessment of 50 households who had completed upgrades, in order to assess the quality of the completed work. Based on this sample, the upgrades that homeowners completed generally appeared to be of good quality. Where quality issues were found, they mostly reflected a lack of attention to detail or missed opportunities in air sealing or insulation, some attributed to contractors and some to do-it-yourself upgrades. For these homes, the potential benefits from upgrades, whether energy, comfort, or otherwise, do not appear to have been substantially degraded by implementation. Two safety issues were also found after upgrades: one home where replacement of suspect equipment failed to resolve the combustion safety issue found in the original audit, and another home whose rate of air leakage dropped below 0.35 ACHn (natural air changes per hour) after upgrades, low enough to raise ventilation concerns. These two cases (4%), while unusual in this sample, highlight that there are risks inherent in completing energy upgrades that can potentially be mitigated by "test-out" diagnostic assessments after retrofitting work has been completed.
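For context on the 0.35 ACHn figure, the sketch below uses a common rule-of-thumb conversion from a blower door measurement at 50 Pa to an approximate natural air change rate by dividing by an "N-factor"; the factor depends on climate, building height, and shielding, and the value used here is only an illustrative assumption, not the procedure used in these audits:

```python
def approximate_achn(cfm50, house_volume_ft3, n_factor=17.0):
    """Rough natural air changes per hour inferred from a blower door test at 50 Pa."""
    ach50 = cfm50 * 60.0 / house_volume_ft3   # air changes per hour at 50 Pa
    return ach50 / n_factor                   # approximate natural infiltration rate

# A home tightened below roughly 0.35 ACHn may warrant mechanical ventilation
# or further assessment as part of a post-retrofit "test-out".
if approximate_achn(cfm50=900, house_volume_ft3=18_000) < 0.35:
    print("Air leakage below 0.35 ACHn; check ventilation")
```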

The primary tangible short-term benefits homeowners received from completed upgrades may often differ from their stated motivations for completing upgrades. At the time of the Retrofit Survey, three quarters of the homeowners who had completed upgrades expressed satisfaction with them, and reported improved comfort in their house along with the expectation of long-term cost and energy savings, a sense of accomplishment, and occasional co-benefits such as reduced street noise, reduced dust, and improved protection against pests. Less than half of these households indicated that they had noticed some (30%) or a lot (10%) of energy savings, while others reported that it was too early to tell (11%). A few expressed disappointment with the lack of energy savings. This differs somewhat from the reasons homeowners gave for completing the upgrades they did—which focused first on cost and energy savings (67%), and secondarily on comfort (28%) and other motivations. For most households, the study period covered only 6 to 12 months after the audit, so we could assess only short-term rather than longer-term reactions to the audits and upgrades. Gathering follow-up utility data would enable assessment of the energy savings from these upgrades.

Changes in Energy Use Behaviors

Though the audits as formally designed addressed homeowner behavior only peripherally, a quarter of respondents reported that their household changed their energy use behaviors in reaction to the audits. For these asset-based audits, behavior change was not a major objective of the audit, though some recommendations for behavior change were included in "no or low-cost strategies" portions of the home energy audit report. In addition to conservation-oriented changes in household energy management, a second type of behavior change could potentially occur in response to technical changes in the home, e.g., increasing thermostat set-points once a more energy-efficient furnace is installed. Few homeowners mentioned this kind of behavior change, including take-back effects, though accurate tracking of behavior change is notoriously difficult to achieve.

Modeling Tools—Scores and Recommendations

Using a circumscribed set of model inputs, we found that EPS Auditor and the Home Energy Scoring Tool gave energy use estimates that were reasonably consistent with each other for a set of homes modeled in both tools. This finding supports the idea that different asset tools can generate comparable asset scores or ratings for houses, especially when using a similar scope of inputs and similar assumptions regarding occupancy and energy use behaviors. There is no consensus on what energy uses are "in scope" or what occupancy and energy use behavior assumptions to use in asset modeling and scoring tools. While the assumptions used to represent "standard occupant behavior" differed somewhat between EPS Auditor and the Home Energy Scoring Tool at the time of the analysis, the model estimates still coincided reasonably well. Our analysis did not consider sources of variation in model results such as auditor measurement and interpretation differences, and the limited model inputs we used likely caused model estimates to be somewhat more consistent with each other than they would have been if a full set of model inputs had been available for the Home Energy Scoring Tool.

For a small set of homes, we compared the upgrade recommendations generated by auditors using the EPS tool with recommendations we generated for the same house using the Home Energy Scoring Tool. Despite the consistency in modeled energy use estimates noted above, the upgrade recommendations, costs, and savings estimates were quite different between these two asset-based tools. That is, a homeowner would be likely to receive different recommendations from an audit using EPS Auditor than from an assessment using the Home Energy Scoring Tool. We have no basis to determine whether either approach gave “better” recommendations by any particular criteria, and assessing the source of these differences, e.g., cost-effectiveness criteria or cost assumptions, was beyond the scope of our analysis. The auditor-customized recommendations also provided with most SCL audit reports were not included in this comparison.

For 101 homes, we compared audit modeling tool estimates of total energy use to utility-reported energy consumption. While the tools that we considered estimated group-average energy use moderately closely, we found large differences between model estimates and reported usage for many homes, with larger differences for asset-based model estimates than for estimates that included inputs on occupancy and basic energy use behaviors. Asset-based tools, such as EPS Auditor (used for these Seattle audits) and the Home Energy Scoring Tool, do not consider occupancy and household energy use behaviors beyond applying standardized assumptions, and are not necessarily intended to reliably predict actual energy usage. However, these tools do use model-generated usage estimates to select upgrade recommendations and to calculate upgrade savings estimates. Homeowners using these savings estimates for upgrade decisions risk being misinformed if models significantly over- or under-estimate the savings they would see from completing upgrades. For EPS audits, personal interactions with the auditor and their customization of the report may often overshadow or qualify this model-generated guidance.

Follow-Up Research

The data collection and analysis we have completed for this project create an opportunity to assess the degree of energy savings achieved by participating households and to relate those savings to the retrofits completed. This would require collection and analysis of utility consumption data for the period from October 1, 2011 to September 30, 2012 for the limited number of households that both reported completing upgrades and for whom we have utility data, and preferably, given statistical considerations, would include additional natural gas utility data collection for households for which we were not able to obtain data within this study period.

Additional Research Questions

Our findings suggest several open questions that, if better understood, could help home energy audits and assessments better speak to households:

 While we found substantial diversity in terms of what households wanted from their audits, this research focused on what appeared to be a relatively narrow slice of homeowners. The population as a whole may have much more diverse inclinations regarding energy efficiency. Could research pursuing household segmentation with respect to home energy efficiency be used to refine home energy audits and better tailor programs to meet these diverse homeowner (and non-homeowner) needs?

 We found that some households were interested in better operating their home and optimizing their energy use behaviors—not just in improving the efficiency features of their home. Do home energy audits provide a viable opportunity to provide households with specific behavioral guidance, based on incorporating operational and behavioral inputs into the modeling, or encouraging auditors to actively consider these issues? Do some household segments have a greater opportunity for savings, or are some more willing and able to take these actions than others? Would the addition of behavioral information and guidance be too much information to present in the audit, or would it trigger only transient changes?

 We found indications that the guidance generated or influenced by the modeling tools—particularly upgrade savings estimates, but also the upgrade recommendations themselves—may present an opportunity for improvement and a need for more rigorous validation. This appears to apply to both asset and operational modeling. How much can the information and guidance to households be improved by:

o Reducing auditor measurement and interpretation variability and input error?
o Improving model calculations or assumptions?
o Considering household behavior in the modeling?
o Considering non-energy benefits and downsides?

We are optimistic that research on these topics—and others—will strengthen policies and programs to help deliver the best home energy audits possible and to further support efforts to achieve homes that provide good indoor environments and reduced energy use.


References

Allcott, H. 2010. “Social norms and energy conservation.” June 9. MIT and NYU. LINK

Blasnick, M. 2011. “Simplified Calculation Methods & Modeling Limitations.” Presentation for ACI National Home Performance Conference CHAL1, March 30, 2011.

[BPI] Building Performance Institute, Inc. 2011. BPI-2400-S-2011: Standardized Qualification of Whole House Energy Savings Estimates. Consensus Document, November 30, 2011-Version 2.

Cataldo, S. 1998. “Massachusetts Audits: More Smiles than Savings,” Home Energy Magazine. July/August.

Cayla, J., B. Allibe, and M. Laurent. 2010. “From Practices to Behaviors: Estimating the Impact of Household Behavior on Space Heating Energy Consumption.” ACEEE Summer Study on Energy Efficiency in Buildings France (2010).

Crosbie, T., and K. Baker. 2010. “Energy-efficiency interventions in housing: learning from the inhabitants.” Building Research and Information 38(1): 70 – 79.

Deru, M., and P. Torcellini. 2007. Source Energy Factors - Source Energy and Emission Factors for Energy Use in Buildings. NREL-38617. LINK

Earth Advantage Institute and Conservation Services Group. 2009. Energy Performance Score 2008 Pilot: Findings and Recommendations Report. Prepared for Energy Trust of Oregon. LINK

Fairey, P. 2010. “Time-of-Sale Energy Labeling of Homes: A Concept Paper.” Home Energy Magazine. July/August.

Farhar, B.C. 2000. Pilot States Program: Home Energy Rating Systems and Energy-Efficient Mortgages. National Renewable Energy Laboratory. Golden, CO. LINK

Frondel, M. and C. Vance. 2011. “Heterogeneity in the effect of home energy audits: Theory and Evidence.” SFB 823 Discussion Paper. Nr. 41/2011.

Fuller, M., C. Kunkel, M. Zimring, I. Hoffman, K. Lindgren Soroye, and C. Goldman. 2010. Driving Demand for Home Energy Improvements: Motivating residential customers to invest in comprehensive upgrades that eliminate energy waste, avoid high bills, and spur the economy. Report LBNL-3960E. Berkeley, CA: Lawrence Berkeley National Laboratory, Environmental Energy Technologies Division. LINK

Glickman, J. 2011. Home Energy Score Partnership Opportunities. U.S. Department of Energy. Presentation to NASEO members on November 15, 2011 conference call. LINK

Guyon, G. 1997. “Role of the model user in results obtained from simulation software program.” In Proceedings of the International Building Performance Simulation Association. Prague. LINK


Hendron, R., R. Anderson, C. Christensen, M. Eastment, and P. Reeves. 2004. "Development of an Energy Savings Benchmark for All Residential End-Uses." SimBuild 2004 Conference, Boulder, CO. NREL/CP-550-35917, National Renewable Energy Laboratory.

Hendron, R., and C. Engebrecht. Oct. 2010. "Building America House Simulation Protocols", National Renewable Energy Laboratory.

Hirst, E., L. Berry, and J. Soderstrom. 1981. “Review of Utility Home Energy Audit Programs.” Energy 6(7):621-630.

“Home Energy Saver: Engineering Documentation. Home Energy Scoring Tool for Asset Rating.” 2012. Lawrence Berkeley National Laboratory. March 9, 2012. [https://sites.google.com/a/lbl.gov/hes-public/home-energy-scoring-tool]

“Home Energy Score.” 2012. U.S. Department of Energy: Energy Efficiency & Renewable Energy. March 9, 2012. [http://www1.eere.energy.gov/buildings/homeenergyscore/homeowners.html]

Khawaja, S. and P. Koss. 2007. “Building better weatherization programs.” Home Energy Magazine. March/April. LINK

Lutz, J.D., X. Liu, J.E. McMahon, C. Dunham, L.J. Shown and Q.T. McGrue. 1996. Modeling Patterns of Hot Water Use in Households. Berkeley, CA: Lawrence Berkeley National Laboratory. LBNL-37805. November.

Lutzenhiser, L., and S. Bender. 2008. “The ‘Average American’ Unmasked: Social Structure and Differences in Household Energy Use and Carbon Emissions.” Proceedings of the 2008 ACEEE Summer Study on Energy Efficiency in Buildings. American Council for an Energy Efficient Economy, Washington DC.

Manuel, J. 2011. “Avoiding Health Pitfalls of Home Energy-Efficiency Retrofits.” Environmental Health Perspectives 119(2):A76-A79.

Marion, W. and K. Urban. 1995. User’s Manual for TMY2s: Typical Meteorological Years. National Renewable Energy Laboratory. June. LINK

MetaResource Group. 2012. Home Energy Performance Scores: Efforts to Date with Modeling Tool Comparison and Summary of Key Issues. January 16, 2012. Prepared for Energy Trust of Oregon, Inc. LINK

Mills, E. 2002. Review and Comparison of Web- and Disk-based Tools for Residential Energy Analysis. Prepared for U.S. Department of Energy, EERE Building Technologies Program. LBNL-50950. September 5. LINK

National Renewable Energy Laboratory [NREL]. 2010. Summary of Gaps and Barriers for Implementing Residential Building Energy Efficiency Strategies. August. Building Technologies Program. Building America. Golden, CO. LINK


“National Residential Efficiency Measures Database.” n.d. National Renewable Energy Laboratory. March 22, 2012. http://www.nrel.gov/ap/retrofits/about.cfm

Palmer, K., M. Walls, H. Gordon, and T. Gerarden. 2011. Assessing the Energy-Efficiency Information Gap: Results from a Survey of Home Energy Auditors. October. Resources for the Future. Washington, DC. LINK

Parker, D., P. Fairey, and R. Hendron. 2010. “Updated Miscellaneous Electricity Loads and Appliance Energy Usage Profiles for use in Home Energy Ratings, Building America Benchmark Procedures and Related Calculations.” Florida Solar Energy Center report FSEC-CR-1937-10. Prepared for the U.S. Department of Energy, Building America Program.

Pérez-Lombard, L., J. Ortiz, R. González, and I.R. Maestre. 2009. “A review of benchmarking, rating and labelling concepts within the framework of building energy certification schemes.” Energy and Buildings 41(3):272-278.

Polly, B., N. Kruis, and D. Roberts. July 2011. Assessing and Improving the Accuracy of Energy Analysis for Residential Buildings. Prepared for Building America, Building Technologies Program, U.S. Department of Energy.

Pryne, E. 2011. “The Seattle-area housing market’s big plunge.” The Seattle Times. January 28. LINK

Robison, D. 2012. Energy Performance Score Modeling Comparison. Prepared for Energy Trust of Oregon by Stellar Processes, Inc. Appendix B to Home Energy Performance Scores: Efforts to Date with Modeling Tool Comparison and Summary of Key Issues. January 18, 2012. LINK

Shapiro, I. 2011. “10 Common Problems in Energy Audits.” ASHRAE Journal 53(2):26-32.

Sanquist, T., A. Sanstad, and L. Lutzenhiser. 2010. Selected Review of Literature Pertinent to the Question: What Can Be Done To Increase The Adoption Rate Of Home Energy Retrofits? Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory, and Portland State University.

Sanstad, A., R. Diamond, T. Sanquist, and L. Lutzenhiser. 2010. “U. S. single-family homeowners’ decision-making regarding energy retrofits: a literature review.” Draft July, 2010. Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory, Portland State University, July 1.

SENTECH, Inc. 2010. Review of Selected Home Energy Auditing Tools: In Support of the Development of a National Building Performance Assessment and Rating Program. November 2. Prepared for the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy.

Stephens, B., E. Carter, E. Gall, M. Earnest, E. Walsh, D. E. Hun, and M. C. Jackson. 2011. “Home energy- efficiency retrofits.” Environmental Health Perspectives 119(7):a283-a284.

Stern, P. 1985. Energy Efficiency in Buildings: Behavioral Issues. Washington, DC: National Academy of Sciences.


Tachibana, D. 2010. Residential Customer Characteristics Survey 2009. Seattle City Light. Seattle, WA. LINK

Arimura, T.H., S. Li, R.G. Newell, and K. Palmer. 2011. Cost-Effectiveness of Electricity Energy Efficiency Programs. NBER Working Paper No. 17556.

U.S. Department of Energy. 2010. Audit Report: The State of Illinois Weatherization Assistance Program. OAS-RA-11-01. October. Washington, DC. LINK

Van de Grift, S., and L. Schauer. 2010. “A Hand to Hold: a Holistic Approach to Addressing Barriers in the Home Retrofit Market.” Proceedings of the 2010 ACEEE Summer Study on Energy Efficiency in Buildings, 2-305. Washington, D.C.: American Council for an Energy-Efficient Economy.

Wilson, C. 2008. “Understanding and Influencing Energy Efficient Renovation Decisions.” Unpublished doctoral dissertation, University of British Columbia.

Woods, J., L. Lutzenhiser, and M. Moezzi. 2010. “How Much Does Household Behavior Count?” Poster Presentation at the 2010 ACEEE Summer Study for Energy Efficiency in Buildings (17 August, 2010).

Woods, J. 2003. “Oh Yea: Imputing Forgotten Responses in a Multi-Year Survey of Conservation Behavior.” IDEAS, Economic Research Department of the Federal Reserve Bank of St. Louis.


Appendix A: EPS Scorecard


Appendix B: EPS Energy Analysis Report


Appendix C: Sample Home Energy Score Report
From: http://apps1.eere.energy.gov/buildings/publications/pdfs/homescore/energy_label_sample.pdf (Home Energy Score, April 23, 2012)


Appendix D: Auditor Interview Guide

Intro

Thank you for taking the time to help us with this research. We are hoping that these interviews will help us develop a general understanding of auditors’ experiences with homeowners and their reactions to the Energy Performance Scorecard and audit report.

We won’t reveal the names of the people we talk with, but we may use something you say to make a point in our summary report. You can skip over questions if you would rather not provide an answer for some reason.

Auditor Experience and View of Homeowners

1. First off, how long have you been doing home energy audits?

2. In all, about how many did you do in 2010?
o Were they all similar to the type of audit you conduct for the Seattle City Light EPS program?
o How do they differ?

3. Thinking of the EPS audits you did in 2010 for the Seattle City Light program, how often did homeowners 'follow along' while you audited their home?
o What are the advantages and disadvantages for you when they do?

o Do homeowners benefit from following along?

o Do you actively encourage homeowners to follow along?

o And do they see homeowner energy bills? How do they use them?

4. What do you think is the most valuable part of the SCL audit for homeowners?

o What about the specific audit recommendations?

o The savings calculations? Do homeowners value this information?

o What about the estimates of what the upgrade would cost?


5. As I understand it, the EPS audit generates a score and suggests recommended upgrades based on the house itself, assuming average occupancy and behavior, while the actual people living in these homes may differ from the model’s averages.

o Do you think that homeowners typically understand this distinction?

o And do you ever tailor your recommendations to fit the current occupants rather than for the house in general? If so, how do you do this?

o Do you ever talk to people and get the sense that they won't be able to do any of your recommendations? Ya know, they don't seem to have the money or time? In these cases what do you do … do you talk about behavioral changes they might try, or minor do-it-yourself things like weather stripping?

IF NEEDED: By taking into account the occupants, their behaviors, HVAC settings, and the like when recommending which upgrades make the most sense.

6. How, if at all, will Scorecards be useful to those in the home audit business? How do you, or how might you, use them?

Homeowners and the Audit Report

7. After getting their EPS report, do homeowners ever call you with follow-up questions or comments?
o [If YES] About what percentage called you back for any reason last year?

o And what did they typically call about? [Probes: clarification, more information, specific advice, or what?]
. Are many of these questions from people trying to understand their Scorecard? Y/N/DK Are there any questions on the scorecard that come up most frequently?
. Are many of these questions from people trying to understand their report? Y/N/DK Are there any questions on the report that come up most frequently?

o How much time does responding to a call back generally take?


8. Can you think of any additional type or level of communication (during audit, after report) that would be beneficial for homeowner understanding of recommendations?

Homeowner Use of Audit Report Related to Retrofit Decision Making

9. We know that different things motivate homeowners to remodel. But do you have any ideas about how to motivate people to do more energy-efficiency upgrades or remodeling?
o IF NEEDED: How does the audit information help?

10. Is there anything that many homeowners do that you think they shouldn’t be doing in terms of energy-efficiency retrofits or upgrades? Any common mistakes?

We’re about finished, but before we wrap this up…

11. Do you, or does your company, also have the capacity to do contracting work to implement recommended upgrades?
 No – the company only does audits → skip to End
 Contact also personally does contracting work → skip to Q12
 Someone else in the company does the contract work

Continue:

Who would be the best person for me to call to find out about contracting work that may have been done for SCL audit households?

NAME(s) [ck spelling]: ______

Any specific specialty (insulation/HVAC/Plumbing/Whole House, etc.): ______

Phone (if different): ______

And/or Email: ______

12. [IF Auditor also does contracting work]
o What factors do you think make it more likely that a homeowner will complete most or all of the upgrades recommended from the audit?

o About how many SCL audit participants did you do contracting work for during 2010? _____

o How did you and these homeowners use the audit to make choices about which recommendations to undertake?


o Have you ever done contract work for a household basing energy upgrade choices on an audit that you did not perform?
. [IF YES] How do you manage those situations?

13. In general, do you see the audit as an effective sales tool for remodeling/retrofit services?
o [IF YES] Which part or parts of the audit process are most effective in selling contracting services? What about the homeowner being present during the audit, the EPS Scorecard, the EPS report, other?


Appendix E: Auditor Interviews Results – Presentation to Team, 3/2/11


Appendix F: Real Estate Professionals Interview Guide

ID:____

Contact name: ______

Company name: ______

Phone number: ______

Date: ______

Intro: Hello, I'm ____ with Research Into Action in Portland. We're conducting research on behalf of the U.S. Department of Energy. We're exploring what realtors think about offering “energy-efficiency” labels on single-family homes that are on the market.

SCREENING Q: We’re focusing on residential sales – I’d like to confirm that you work in the residential market.

Yes  continue

No  discontinue and thank

Do you have time to talk? This will take only 20-30 minutes. I'll keep your identity and responses confidential. If you're too busy right now, I'd be happy to schedule a time that will work better for you.

Call back (Date and time) ______

Background: Let me start with some introductory information. Recently, there have been proposals that home energy rating “labels” be offered for houses on the market throughout the US. These ratings would be based on a professional home energy audit. They might be mandatory or voluntary. Such labels would be included in an RMLS listing or shared with the homebuyer. The rating would show buyers the home's expected energy use and use the home selling process to encourage energy-efficiency retrofits that could make the US housing stock more energy-efficient.

In some of these questions, I'll ask you to differentiate between new homes and existing homes, and home buyers and sellers.

Let’s start with how energy efficiency is, or isn’t, being used to market homes.

1. How long have you been in residential home sales? [Years:___] a. Do you sell primarily new homes, existing homes, or both? [Collect percentages]


2. Are realtors using information on energy efficiency to market homes? [Note: RMLS listings on the Internet include “Green Certification” and “Energy Efficiency Features” information when it is available.] [YES/NO] a. IF YES: How, if at all, has this changed in the past year or two?

b. Does the industry tend to use EE info when marketing a newly built home? [YES/NO] Why is that?

c. Does the industry use EE information when marketing existing homes? [YES/NO] Why is that?

3. Do you use any specific resources to help you explain the potential connections between a home's energy features and its market value? [YES/NO]

a. [If YES] What is most useful? b. What, if any, additional information would you and other realtors find useful to help you explain the potential connections between a home's energy features and its market value?

4. Is it safe for me to assume that virtually all buyers ask you about the home's current energy bills? [YES/NO]
[IF NO] What percentage of the time do your buyers want to know about the utility bills? [Get a number]
Do you find that buyers make the connection between utility costs and the energy-efficiency (features or rating) of the house?

5. How, if at all, do you discuss the value of an energy-efficient versus a non-energy-efficient home to potential homebuyers? a. In your experience, when are the best times to talk with buyers about this?

6. When your clients who are selling a home are thinking about doing some upgrades to better market their homes, does making energy-efficiency upgrades (instead of non-EE upgrades) even come up? [YES/NO]
a. [If YES] Please tell me what these discussions are like.
b. [If not addressed] What do you do to encourage your clients to do energy-efficiency upgrades in the homes they’re selling?


7. What, if anything, do you do to encourage buyers of existing homes to consider energy-efficiency upgrades?
a. [If YES] When do these discussions with clients typically happen?

Now I’d like to ask you about home energy labeling. I'm referring to various kinds of labels, such as the Energy Performance Score (EPS), the Home Energy Score (HES), along with ENERGY STAR ratings and "green" certificates such as LEED.

8. From your perspective as a realtor, are there any advantages to the real estate industry of using energy and/or carbon scores to market new homes? [YES/NO]
a. [IF YES] What are they?
b. Are there any disadvantages from an industry perspective of using energy and/or carbon scores to market new homes? [YES/NO]
i. [IF YES] What are the disadvantages?

9. Again, from your perspective, are there any advantages to the real estate industry of using labeling to market existing homes? [YES/NO]
a. [If YES]: Can you describe those advantages?
b. Will home labeling create any disadvantages from an industry perspective? [YES/NO]
i. [If YES]: What are they?

10. Now, thinking about homebuyers, what benefits, if any, does home energy labeling offer to potential buyers of a newly built home?
a. Does energy labeling offer the same or different benefits to buyers of existing homes? [Same/Different]
b. [If Different] Please explain benefits to buyers of existing homes.

11. What about disadvantages? Does home energy labeling present any disadvantages to any market segments or groups of buyers of newly built houses? [YES/NO] [If YES], please explain.
a. Do these labels present the same or different disadvantages to buyers of existing homes? [Same/Different]
b. [If Different] Please explain.

12. And what about the sellers of existing homes? From their perspective, how might labeling benefit or hinder the sale of their home?

13. Overall, do you think that requiring all homes to have an energy rating (such as EPS or HES) would help improve the energy efficiency of US homes? [YES/NO] [Why/why not?]


14. In the absence of mandatory ratings, what impact do voluntary labels (such as ENERGY STAR and LEED) have on improving the energy efficiency of US homes?

a. Do these generally affect the new-homes market, and not the existing-homes market?

15. EPS and HES provide energy and carbon information. What information do you think labels should provide?

16. As I mentioned before, there seem to be quite a few labels and scores in the marketplace (including ENERGY STAR, EPS, HES, LEED). Do you have a preference? [YES/NO] Please explain. Do the different types of home labels make it easier or more confusing for homebuyers to decide if they want to buy a particular home? [Probe: New/existing]
a. What would work better? Please explain.

We’re about done. I have just four more questions.

17. Do you identify yourself as a "green" real estate agent? [Open end] [Do customers seek you out for that?] [How do they know you have those capabilities/interests?]

18. About what percent of the houses you sell do you market as "green"? [Get a number] a. How has that changed in the last 5 years? b. Why?

19. Can you estimate the percent of your residential customers who ask about a home's energy features and energy consumption? [Get a number] a. How has that changed in the last 5 years? b. Why?

20. Is there anything else you’d like to add?


Appendix G: Real Estate Professionals Interviews Presentation 8/16/11


Appendix H: Homeowner Pre-Audit Interview Guide

Q46 Hello, my name is _____. I'm calling from Research Into Action on behalf of the City of Seattle Office of Sustainability and Environment. You recently signed up for a discounted home energy audit through Seattle City Light. The City of Seattle is interested in knowing why people like you are interested in an energy audit.

Q47 I'd like to talk to the person in your household who knows the most about the Seattle Home Energy Audit you signed up for. Is that you?

 Yes (1)  No, someone else [record information] (2)  Don't know [terminate] (3)

Q48 Alternate Contact info:

Info: Name (1) / Phone (2) / Best time to contact (3)

Q5 First, can I verify that you are over the age of 18?

 Yes (1)  No [Ask to speak to someone else] (2)

Q4 Our discussion will take about 10-12 minutes. Is this a good time to talk?

 Yes, continue (1)  No, reschedule: (2) ______

Q6 Have you already had the Seattle City Light energy audit performed on your house?

 Yes, When: (1) ______ No (2)

Q7 If they already had the audit, switch to the post-audit guide or schedule a time to call back.

Q8 Before we start I’d like you to know that this interview is voluntary and your responses are confidential. We won't report anything you say in a way that could identify you. And if you’d like, we can skip over questions.

Q9 Why did you sign up for a home energy audit? [Probe: What were you hoping to get out of the energy audit? Are there any specific issues with your house that you are hoping to fix or improve (comfort, energy use, air quality, "performance")?]


Q10 After the energy audit, you'll receive an energy performance score that compares your home to the average home in Seattle. Were you aware of this?

 Yes (1)  No (2)

If Yes is selected, then skip to Q11 (Was this part of your motivation to sign up?). If No is selected, then skip to Q13 (How did you hear about the Seattle City Light Home Energy Audit?).

Q11 Was this part of your motivation to sign up for the home energy audit?

 Yes (1)  No (2)

Q12 [Elaboration if yes] Why were you interested in getting an energy performance score for your home?

Q13 How did you hear about the Seattle City Light Home Energy Audit? [Do not read options]

 Friend [Did the friend also get an audit?] (1) ______ Neighbor [Did the neighbor also get an audit?] (2) ______ Family members [Did they also get an audit?] (3) ______ Utility website (4)  Home energy auditor (5)  Other: (6) ______

Q14 How energy efficient do you consider your house to be now, that is, the structure and heating or cooling systems?

Q15 Are you able to maintain comfortable temperatures during winter and summer?

 No, can't keep cool in summer (1)  Never able to keep comfortable (2)  No, can't keep warm in winter (3)  Always able to stay comfortable (4)  Other (5) ______

Q16 Do you consider your current energy costs to be high, low, or reasonable?

 High (1)  low (2)  Reasonable (3)

Q17 Have you previously tried to improve the house's energy efficiency by upgrading equipment or the structure?

 Yes (1)


 No (2)

If Yes is selected, then skip to Q18 (What have you done?). If No is selected, then skip to Q20 (Which would you say is most true of your household?).

Q18 What have you done?

 Replaced lights with CFLs (1)  Weather stripping (2)  Added insulation (3)  Other: (4) ______

Q19 How did this turn out? [Probe: Did you see a reduction in your energy bills? Any change in comfort?]

Q20 Which would you say is most true of your household?

 Everybody in my household tries hard to conserve energy (1)  Some people in my household try hard (2)  Nobody in my household is concerned with saving energy (3)  Other: (4) ______

Q21 How do you think your household's energy use might compare to other households in Seattle that are similar, in terms of size and age?

 Greater (1)  The same (2)  Less than (3)  Don't know (4)

Q22 Would you say your household is "very", "somewhat" or "not at all" concerned about reducing its carbon emissions- or carbon footprint?

 Very (1)  Somewhat (2)  Not at all (3)  Don't know (4)

Q23 Now let's talk a bit about home remodeling and renovations

Q24 Have you done any home remodeling or renovation in the last five years?

 Yes (1)  No (2)  Don't know (3)


Q25 If yes, what did you do? About how long ago did you complete the renovations/remodel? How did it turn out? [Good results, good experiences with the contractors? Desired results achieved?]

Q26 In your household, who typically completes home repairs that involve ladders and tools?

 A household member (1)  A handyman/contractor (2)  Other: (3) ______

If A household member is selected, then skip to Q27 (Does the 'handy' person in your household have experience completing any energy-related improvements in the past?). If A handyman/contractor or Other is selected, then skip to Q30 (If doing something recommended by the energy audit requires hiring a contractor, how might you go about finding one?).

Q27 Does the 'handy' person in your household have experience completing any energy-related improvements in the past?

 Yes (1)  No (2)  Don't know (3)

Q28 [If yes] What types of energy improvements have they done?

Q29 [If yes] Is the 'handy' person in your household likely to do some of the improvements recommended by the home energy audit?

Q30 If doing something recommended by the energy audit requires hiring a contractor, how might you go about finding one?

Q31 Based on what you know today, are there any projects you are interested in doing to improve the energy efficiency or comfort of your home? [If yes,] What kinds of projects are you interested in doing? If there are lots of them, just tell me the top three.


[Choose up to three (click up to 3). For each measure selected, ask: How long have you been thinking about doing this? (<1 yr / 1-2 yrs / 2+ yrs / DK) Do you have specific plans for doing this - if yes, when will it be done? (<1 yr / 1-2 yrs / 3-5 yrs / 6-9 yrs / 10+ yrs / DK)]

1. Replace aging/inefficient appliances
2. Install efficient heating/cooling system
3. Install efficient hot water system
4. Install insulation, air sealing, weather stripping
5. Install new windows
6. Install solar electric/hot water system
7. Recycle refrigerator/freezer
8. Other
9. Other
10. Not interested in any upgrades

Q32 Overall, why are you interested in these particular improvements? [What benefits do you expect to see?] [Probe: Financial or non-financial benefits (comfort, health, value)]

Q33 I don't need any specifics here, but in general- how would you pay for these types of home energy improvements?

 Finance (1)  Out of pocket (2)  Both financed AND out of pocket (3)  Other: (4) ______

Q34 Do you know if any utility rebates or tax credits are being offered for energy improvements?

 Yes (1)  No (2)

Q35 If the energy audit recommends several upgrades, how likely is it that a rebate would affect your decision about which home energy improvements you’ll do?

 Not at all likely (1)  Somewhat likely (2)


 Very likely (3)

Q36 Optional comments:

Q37 Are tax credits likely to affect what upgrades you’ll choose to do?

 Not at all likely (1)  Somewhat likely (2)  Very likely (3)

Q38 How many years have you lived in this house?

 Years (use decimals, no fractions): (1) ______ Don't know (2)

Q39 Approximately how many years do you plan to stay in this house?

 (1) ______ 1-2 years (2)  3-5 years (3)  6-9 years (4)  more than 10 years (5)  Don't know (6)

Q44 Including yourself, please tell me how many people in your household are in the following age categories:

Number in household:
4 and under (1) / 5 to 17 (2) / 18 to 24 (3) / 25 to 34 (4) / 35 to 44 (5) / 45 to 54 (6) / 55 to 64 (7) / 65 to 74 (8) / 75 and over (9) / Refused (10)

Q45 And which category do you fall into?

Q41 After you’ve had your home energy audit, we’d really like to hear what you think of the results. May we call you back then?

 Yes (1)  No (2)


 No, if no could we call another household member? (3) ______

Q42 If you'd like us to call someone else, who should we call? [Get name and number]

Q43 Thank you so much for taking the time. Have a nice day.


Appendix I: Homeowner Pre-Audit Interview Closed-Ended Responses

This section contains response frequencies for closed-ended interview questions. Questions for which there is no closed-ended data do not appear. Similarly, response options with a frequency of zero do not appear. Unless otherwise specified, all questions have 35 responses.

Q10. After the energy audit, you'll receive an energy performance score that compares your home to the average home in Seattle. Were you aware of this?

No: 19 (54%)
Yes: 16 (46%)

Q11. Was this part of your motivation to sign up for the home energy audit? (n=16)

No: 10 (63%)
Yes: 6 (38%)

Q13. How did you hear about the Seattle City Light Home Energy Audit?

Family members [Did they also get an audit?]: 3 (9%)
Friend [Did the friend also get an audit?]: 7 (20%)
Neighbor [Did the neighbor also get an audit?]: 3 (9%)
Other: 16 (46%)
Utility website: 6 (17%)

Q15. Are you able to maintain comfortable temperatures during winter and summer?

No, can't keep cool in summer: 2 (6%)
No, can't keep warm in winter: 5 (14%)
Always able to stay comfortable: 20 (57%)
Other: 14 (40%)

Q16. Do you consider your current energy costs to be high, low, or reasonable? (n=31)

High: 22 (71%)
Low: 2 (6%)
Reasonable: 7 (23%)


Q17. Have you previously tried to improve the house's energy efficiency by upgrading equipment or the structure? (n=34)

No: 8 (24%)
Yes: 26 (76%)

Q18. What have you done? (n=26)

Replaced lights with CFLs: 12 (46%)
Weather stripping: 4 (15%)
Added insulation: 8 (31%)
Other: 23 (88%)

Q20. Which would you say is most true of your household?

Everybody in my household tries hard to conserve energy: 25 (71%)
Some people in my household try hard: 6 (17%)
Other: 4 (11%)

Q21. How do you think your household's energy use might compare to other households in Seattle that are similar, in terms of size and age?

Greater: 9 (26%)
Less than: 11 (31%)
The same: 7 (20%)
Don't know: 8 (23%)

Q22. Would you say your household is "very," "somewhat," or "not at all" concerned about reducing its carbon emissions – or carbon footprint?

Very: 17 (49%)
Somewhat: 13 (37%)
Not at all: 2 (6%)
Don't know: 3 (9%)

Q24. Have you done any home remodeling or renovation in the last five years?

No: 12 (34%)
Yes: 23 (66%)


Q26. In your household, who typically completes home repairs that involve ladders and tools?

A handyman/contractor: 12 (34%)
A household member: 11 (31%)
Other: 12 (34%)

Q27. Does the 'handy' person in your household have experience completing any energy-related improvements in the past? (n=11)

No: 3 (27%)
Yes: 8 (73%)

Q31. Based on what you know today, are there any projects you are interested in doing to improve the energy efficiency or comfort of your home? What kinds of projects are you interested in doing? If there are lots of them, just tell me the top three.

For each measure, counts: Interested (Selected); How long have you been thinking about doing this? (<1 yr, 1-2 yrs, 2+ yrs); Do you have specific plans for doing this - if yes, when will it be done? (<1 yr, 1-2 yrs, 3-5 yrs, DK):

Replace aging/inefficient appliances: Selected 5; Thinking 1, 1, 3; Plans 0, 3, 0, 1
Install efficient heating/cooling system: Selected 8; Thinking 3, 3, 2; Plans 5, 1, 0, 1
Install efficient hot water system: Selected 5; Thinking 3, 0, 2; Plans 1, 1, 0, 2
Install insulation, air sealing, weather stripping: Selected 19; Thinking 7, 3, 9; Plans 11, 0, 0, 6
Install new windows: Selected 12; Thinking 1, 2, 9; Plans 5, 1, 0, 4
Install solar electric/hot water system: Selected 4; Thinking 2, 2, 0; Plans 2, 0, 1, 1
Recycle refrigerator/freezer: Selected 0; Thinking 0, 1, 0; Plans 1, 0, 0, 0
Other: Selected 7; Thinking 2, 2, 2; Plans 3, 1, 0, 1
Other: Selected 2; Thinking 0, 1, 1; Plans 0, 0, 0, 2
Not interested in any upgrades: Selected 7; Thinking 0, 0, 0; Plans 0, 0, 0, 0

Q33 I don't need any specifics here, but in general, how would you pay for these types of home energy improvements?

Both financed AND out of pocket: 6 (17%)
Finance: 3 (9%)
Out of pocket: 19 (54%)
Other: 7 (20%)

Q34. Do you know if any utility rebates or tax credits are being offered for energy improvements? (n=34)

No: 13 (38%)
Yes: 21 (62%)

Q35. If the energy audit recommends several upgrades, how likely is it that a rebate would affect your decision about which home energy improvements you’ll do? (n=32)

Not at all likely: 1 (3%)
Somewhat likely: 8 (25%)
Very likely: 23 (72%)

Q37. Are tax credits likely to affect what upgrades you’ll choose to do? (n=34)

Not at all likely: 6 (18%)
Somewhat likely: 12 (35%)
Very likely: 16 (47%)

Q39. Approximately how many years do you plan to stay in this house?

3-5 years: 2 (6%)
6-9 years: 8 (23%)
More than 10 years: 21 (60%)
Don't know: 4 (11%)

Q41. After you’ve had your home energy audit, we’d really like to hear what you think of the results. May we call you back then?

Yes: 35 (100%)


Appendix J: Homeowner Post-Audit Interview Guide

Q1 Hello, this is ______. I’m calling from Research Into Action on behalf of the City of Seattle Office of Sustainability & Environment. You recently received a discounted home energy audit through Seattle City Light. The City of Seattle is interested in what you think about the home energy audit. We are hoping to understand what people think of their home energy audit results.

Q84 Did this respondent complete a pre-audit interview?

 Yes (1)  No (2)

Answer If Did this respondent complete a pre-audit interview? No Is Selected Q2 Are you the person in your household who is most familiar with your Home Energy Audit results?

 Yes (1)  A different individual is the best contact [RECORD NAME and NUMBER IN THE CALL LIST] (2)  Refused (3)

Q4 It takes about 15-20 minutes to go through this information. Is this a good time to talk?

Answer If Did this respondent complete a pre-audit interview? No Is Selected Q3 Can you verify that you are over the age of 18?

 Yes (1)  No [Ask to speak to someone else] (2)

Q5 Before we start I’d like you to know that your responses to this interview are confidential. And if you’d like, we can skip over questions.

Q6 Program records show that a home energy audit was conducted at [ADDRESS]. Is that right?

 Yes, correct address (1)  No, discontinue the interview (2)

Q7 Have you gotten your home energy audit results yet?

 No. When do you expect them? (1) ______ Yes (2)

Q8 Have you had a chance to read though the audit report yet?

 No. When do you plan to do this? [Tell them you will call back after that date] (1) ______ Yes (2)


Answer If Did this respondent complete a pre-audit interview? No Is Selected Q10 What motivated you to sign up for the home energy audit? (What were you hoping to learn? Were you looking to resolve any particular issues?)

Q11 Did the audit meet your expectations?

 Yes (1)  No, Why not? (2) ______

Q12 Is there anything you didn’t get out of the audit that you expected to?

Q13 Did you accompany the auditor during the audit process?

 Yes (1)  No (2)

Q14 What did you think of it? (Lengthy or complicated process, any surprises?)

Q15 What, if any, new things did you learn about your home as a result of the audit? [energy, air quality, comfort, or performance related issues, safety issues, other)

Q16 Did you learn about these things from talking to the auditor or from the audit report itself? [Check all that apply]

 Talking with the auditor (1)  From Scorecard or report (2)  All of the above (3)  Don't remember (4)

Answer If Did this respondent complete a pre-audit interview? No Is Selected Q17 Before getting the audit, had you done anything to your home to reduce energy use? If so, what had you done? (Tried to do an audit, replaced all bulbs with CFLs, weather-stripped, added insulation…).

 Yes (1)  No (2)  Don't know (3)

Answer If Did this respondent complete a pre-audit interview? No Is Selected Q74 If so, what had you done? (Tried to do an audit, replaced all bulbs with CFLs, weather-stripped, added insulation…).

Q18 Overall, would you rate the people in your household as “very,” “somewhat” or “not at all” concerned about how much energy you use at home?

 Very (1)  Somewhat (2)  Not at all (3)


 Don't know (4)

Q19 Just to review, the home energy audit report has two parts, the Energy Performance Scorecard comparing your house to the average Seattle house, and the Energy Analysis Report with specific energy-efficiency recommendations. Let’s start with your reactions to the Scorecard. First, your home’s energy score was displayed on the left-hand side of the Scorecard…

Q20 If you recall, how did your home compare to the average Seattle home on the Scorecard (on the front page of your audit report)?

Q21 Did your home’s Energy Score surprise you? [If yes] What was surprising about it?

 Yes. What was surprising about it? (1) ______ No (2)  Don't know (3)

Q22 Does your Energy Score make any difference in the upgrades or home repairs you might want to do?

 Yes: optional comments (1) ______ No (2)  Don't know (3)

Q23 There was also a Carbon Score on the Scorecard -- on the right hand side the Scorecard. What, if anything, does this score mean to you?

Q24 Would these Energy and Carbon scores be useful to you during the following occasions? If so, how would they be useful?

[For each item: Useful, Yes or No? If yes, how so?]
1. When making remodeling decisions
2. When selling your home
3. Would it be useful to you to see the energy and carbon scores of a home you were considering buying?

Q25 Overall, do you think it would be helpful if every home had an Energy Performance Score? How would it be helpful?

Q26 Now let’s talk about the energy audit report – the recommendations …


Q27 Did going through the audit process or reading the report cause any change in how you think or feel about your home? How is that? (Motivated, frustrated, satisfied, other)

Q83 “What do you think of the report? [Probes: What information was most useful? What was least useful]

Q75 Was any information missing, something that you would like to see in the report?

 Yes, please specify: (1) ______ No (2)  Don't know (3)

Q29 The report you received after the energy audit included a set of recommended upgrades specific to your home. These might have included things like insulating walls, attics, or floors, air sealing, heating system replacements, and other actions you could take to improve the energy efficiency of your home.

Q30 In general, did these recommendations make sense to you?

 Yes (1)  No (2)  Don't know (3)

Q76 Were any confusing or surprising?

Q31 Do the recommendations help you to know what you could do in your home to reduce your energy use and carbon footprint?

Q32 Did your energy auditor give you a prioritized list of upgrades? If so, what did they recommend doing first?

 Didn't prioritize (1)  Prioritized, what was first? (2) ______ Prioritized but can't remember first on list (3)  Don't remember (4)

Q33 Have you completed any of the recommended upgrades listed in the report?

 Yes (1)  No (2)

If No is selected, then skip to Q34 (Thinking about the recommendations in your audit report, which ones might you take?). If Yes is selected, then skip to Q42 (What recommendations have you completed?).


Q34 Thinking about the recommendations in your audit report, which ones might you take? [Record recommendations in table below then follow up about each]

[List up to six measures. For each: How likely are you to do this? (Definitely will / Very likely / Not sure / Probably not / Definitely not / Don't know) For those measures that are definitely or very likely: When are you likely to do this?]

Q35 Why might you make this/these upgrade(s) over others? (What benefit? Cost?)

Q36 How will you get upgrades done - do yourselves, hire a contractor, do some yourself and hire the rest done?

 Do it yourself (1)  Hire a contractor (2)  Depends on the measure (3)  Don't know (4)

Q37 [If contractor for any] Do you know a contractor that can do this type of work? [If not] How would you go about finding one?

Q38 Are there any recommendations in the report that you’re not likely to take?

 Yes (1)  No (2)  Don't know (3)

If Yes is selected, then skip to Q77 (Which ones? Why not?). If No or Don't know is selected, then skip to Q39 (When it comes to making decisions about home upgrades...).

Q77 Which ones? Why not?


Q39 When it comes to making decisions about home upgrades like the ones recommended in the audit report, would you say that someone else in your household will play...

 A major part in the decisions (1)  A minor part in the decisions (2)  No part in the decisions (3)  Don't know (4)  Refuse (5)

Q40 Compared to you, how familiar do you think they are with the audit results and upgrade recommendations?

 Not at all familiar (1)  Less familiar than you (2)  About the same level of familiarity (3)  More familiar than you (4)  Don't know/refuse (5)

Q41 Thinking again about any upgrades you might complete, have you thought about how you would pay for these? Would you? [Check all that apply]

 Pay out of pocket (1)  Finance with a loan (2)  Don't know/refuse (3)  Other: (4) ______

If Pay out of pocket, Finance with a loan, Don't know/refuse, or Other is selected, then skip to End of Block.

Q42 What recommendations have you completed?

Completed measures: [Record up to ten measures. For each: How long ago did you do this? (in months) How long was this? (in weeks)]

Q43 If you had not had the audit, would you have done that [any of those] upgrade[s]? Which would you have done even without the audit recommendation?

 Yes: which would you have done without the audit? (1) ______ No (2)

Q44 Why did you decide to do this/these particular upgrades, and not others? (What factors/benefits were most important in your decision?)

Q45 Did you do these upgrades yourself, hire a contractor, or some combination of the two?

 Yourself (1)  hired a contractor (2)  Combination of both (3)  Other, please specify: (4) ______

If Yourself is selected, then skip to Q46 (Why did you decide to do the work yourself rather than hiring a contractor?). If Hired a contractor is selected, then skip to Q48 (Why did you hire a contractor?). If Combination of both is selected, then skip to Q47 (What did a contractor do and what did you decide to do yourself?).

Q46 Why did you decide to do the work yourself rather than hiring a contractor? [Knew how and had the time/tools, less costly than a contractor, don’t know a contractor, other-specify]

If Q46 (Why did you decide to do the work yourself rather than hiring a contractor?) is displayed, then skip to Q53 (Again I don’t need specifics here, but how did you cover the cost of these upgrades?).

Q47 What did a contractor do and what did you decide to do yourself? Why did you decide to do some of the work yourself?

Q48 Why did you hire a contractor?

Q52 Approximately, how much did this [these] upgrade(s) cost?

Q50 Do you remember the names of the contractor(s) you used? [Collect names of companies and contractors if possible]

Q51 How did you find the contractor(s)? [Was it difficult at all?]

Q53 Again I don’t need specifics here, but how did you cover the cost of these upgrades? Out of pocket/financed via a loan?

 Out of pocket (1)  Financed (2)


 Some combination (3)  Other: (4) ______ Refused (5)

Q54 Since completing the upgrades(s), have you noticed any difference in your home? (PROBES: comfort, hotter/colder, air quality, etc.).

Q55 Have you changed how you use energy in response to these differences? (For example, changed thermostat settings/heating or cooling use, changed hot water use, etc.)

Q56 While making the upgrades recommended in the audit report, did you do any other work – things that weren’t in the audit report? IF YES, describe:

Q57 Are you thinking about doing any other recommended upgrades? [Record recommendation in table below and follow up:]

[Write in up to seven possible upgrades. For each: How likely are you to do this? (Definitely will / Very likely / Not sure / Probably not / Definitely not) When do you think you might do this? (Year)]

Q58 Do you think that it is worth taking out a loan to do these recommended upgrades? Why or why not? [For energy or non-energy benefits; to reduce carbon emissions or not] Probe: Would you want to finance all of these recommendations or just some of them? Which ones?

Q79 Probe: Would you want to finance all of these recommendations or just some of them? Which ones?

Q59 Is there anything else you'd like to add about what could be changed or added to the home energy audit process, the scorecard, the report, or anything else- that might help you complete the upgrades recommended for your home?

Q78 Interviewer note: if the respondent has already completed a pre-audit survey, you can skip the already completed demographic information.


Q61 [DO NOT READ] Record gender. If you can't tell: "Because the quality of phone connections sometimes makes it difficult to tell, I have to ask you your gender. Are you male or female?"

 Male (1)  Female (2)

Q62 Of the people who regularly live in your home, do any…[Read options 1-3, pausing after each to allow for a response, Select all that apply]

 Have home remodeling skills (1)  Fix objects and appliances (2)  Have time for home maintenance (3)  None of the above (4)  Don't know (5)  Refused (6)

Answer If Did this respondent complete a pre-audit interview? No Is Selected Q63 How many years have you lived in this house?

 Less than a year (1)  Number of years: (2) ______ Don't know (3)  Refused (4)

Answer If Did this respondent complete a pre-audit interview? No Is Selected Q64 Approximately how many years do you plan to stay in this house, based on current plans?

 Number of years: (1) ______ For the rest of my life (2)  Less than a year (3)  Don't know (4)  Refused (5)

Answer If Did this respondent complete a pre-audit interview? No Is Selected Q65 Including yourself, how many people currently live in your household?

 One (R lives alone) (1)  Enter number (2) ______ Refused (3)

Answer If Did this respondent complete a pre-audit interview? No Is Selected Q66 Including yourself, please tell me how many people in your household are in the following age categories:


Number in household:
4 and under (1) / 5 to 17 years (2) / 18 to 24 years (3) / 25 to 34 (4) / 35 to 44 (5) / 45 to 54 (6) / 55 to 64 (7) / 65 to 74 (8) / Over 75 (9) / Refused (10)

Answer If Did this respondent complete a pre-audit interview? No Is Selected
Q68 And which category includes you?

 18 to 24 (1)  25 to 34 (2)  35 to 44 (3)  45 to 54 (4)  55 to 64 (5)  65 to 74 (6)  Over 75 (7)  Refused (8)

Q69 Please stop me when I reach the category that best describes your yearly total household income before taxes in 2010. IF NEEDED: Your best estimate is fine. READ OPTIONS UNTIL STOPPED

 Less than $15,000 (1)  $15,000 to less than $25,000 (2)  $25,000 to $35,000 (3)  $35,000 to $50,000 (4)  $50,000 to $75,000 (5)  $75,000 to $100,000 (6)  $100,000 to $150,000 (7)  $150,000 and more (8)  Don't know (9)  Refused (10)

Q70 Which of the following groups best identifies you? Be sure to collect Hispanic heritage [Check all that apply]

 Asian or Asian American [inc. Chinese, Filipino, Japanese, Asian Indian, Korean, Vietnamese] (1)  White or Caucasian (2)  Black or African-American (3)


 American Indian or Alaskan native (4)  Native Hawaiian or Pacific Islander (5)  Spanish, Hispanic, or Latino (6)  Other, multiracial: (7) ______ Don't know (8)  Refused (9)

Q71 What is the highest level of education you have completed?

 Less than 12th grade (not high school graduate) (1)  High school graduate or GED (2)  Some college or other post-secondary education (3)  Associates Degree or Technical degree (AA or AS) (4)  Bachelors Degree (BA, BS, AB) (5)  Some post- graduate (6)  Master's Degree (7)  Other professional or doctoral degree (8)  Don't know (9)  Refused (10)

Q60 Thanks so much for telling me about your audit experience. Would it be OK if I contact you again in a couple of months just to check about any recommended work you may have completed?

 Yes (1)  No (2)  Maybe (3)

Q72 Thank you so much for taking the time.

Q82 This is the last page, please make sure all data is correct before submitting the survey.


Appendix K: Homeowner Post-Audit Interview Closed-Ended Responses

This section contains response frequencies for closed-ended interview questions. Questions for which there is no closed-ended data do not appear. Similarly, response options with a frequency of zero do not appear. Unless otherwise specified, all questions have 31 responses.

Q84. Did this respondent complete a pre-audit interview? (n=21)

Yes: 13 (62%)
No: 8 (38%)

Q11. Did the audit meet your expectations? (n=30)

Yes: 26 (87%)
No, Why not?: 4 (13%)

Q13 Did you accompany the auditor during the audit process? (n=30)

Yes: 28 (93%)
No: 2 (7%)

Q16. Did you learn about these things from talking to the auditor or from the audit report itself? (n=29)

Talking with the auditor: 10 (34%)
From Scorecard or report: 4 (14%)
All of the above: 17 (59%)

Q17. Before getting the audit, had you done anything to your home to reduce energy use? If so, what had you done? (n=17)

Yes: 13 (76%)
No: 4 (24%)

Q18. Overall, would you rate the people in your household as “very,” “somewhat,” or “not at all” concerned about how much energy you use at home? (n=30)

Very: 16 (53%)
Somewhat: 13 (43%)
Don't know: 1 (3%)

Q21. Did your home’s Energy Score surprise you? [If yes] What was surprising about it? (n=29)

Yes. What was surprising about it?: 13 (45%)
No: 14 (48%)
Don't know: 2 (7%)

Q22. Does your Energy Score make any difference in the upgrades or home repairs you might want to do? (n=29)

Yes: 19 (66%)
No: 7 (24%)
Don't know: 3 (10%)

Q24. Would these Energy and Carbon scores be useful to you during the following occasions? If so, how would they be useful?

When making remodeling decisions? Yes: 19, No: 5
When selling your home? Yes: 19, No: 7
Would it be useful to you to see the energy and carbon scores of a home you were considering buying? Yes: 25, No: 2

Q75. Was any information missing; something that you would like to see in the report? (n=29)

Yes: 11 (38%)
No: 15 (52%)
Don't know: 3 (10%)

Q30. In general, did these recommendations make sense to you? (n=28)

Yes: 28 (100%)

Q32. Did your energy auditor give you a prioritized list of upgrades? If so, what did they recommend doing first? (n=28)

Didn't prioritize: 5 (18%)
Prioritized, what was first?: 19 (68%)
Prioritized but can't remember first on list: 2 (7%)
Don't remember: 2 (7%)

Q33. Have you completed any of the recommended upgrades listed in the report? (n=28)

Yes: 9 (32%)
No: 19 (68%)

Q36. How will you get upgrades done - do yourselves, hire a contractor, do some yourself and hire the rest to be done? (n=19)

Do it yourself: 2 (11%)
Hire a contractor: 11 (58%)
Depends on the measure: 6 (32%)

Q38. Are there any recommendations in the report that you’re not likely to take? (n=18)

Yes: 10 (56%)
No: 7 (39%)
Don't know: 1 (6%)

Q39. When it comes to making decisions about home upgrades like the ones recommended in the audit report, would you say that someone else in your household will play... (n=17)

A major part in the decisions: 10 (59%)
A minor part in the decisions: 3 (18%)
No part in the decisions: 4 (24%)

Q40. Compared to you, how familiar do you think they are with the audit results and upgrade recommendations? (n=15)

Not at all familiar: 2 (13%)
Less familiar than you: 6 (40%)
About the same level of familiarity: 5 (33%)
More familiar than you: 1 (7%)
Don't know/refuse: 1 (7%)

Q41. Thinking again about any upgrades you might complete, have you thought about how you would pay for these? Would you… (n=18)

Pay out of pocket: 13 (72%)
Finance with a loan: 4 (22%)
Don't know/refuse: 1 (6%)
Other: 1 (6%)

Q43. If you had not had the audit, would you have done that [any of those] upgrade[s]? Which would you have done even without the audit recommendation? (n=9)

Yes: 6 (67%)
No: 3 (33%)

Q45. Did you do these upgrades yourself, hire a contractor, or some combination of the two? (n=9)

Yourself: 2 (22%)
Hired a contractor: 6 (67%)
Other, please specify: 1 (11%)

Q53. Again, I don’t need specifics here, but how did you cover the cost of these upgrades? Out of pocket/financed via a loan? (n=9)

Out of pocket: 5 (56%)
Financed: 2 (22%)
Other: 2 (22%)

Q61. Record Gender (n=29)

Male: 15 (52%)
Female: 14 (48%)

Q62. Of the people who regularly live in your home, do any … (n=27)

Have home remodeling skills: 7 (26%)
Fix objects and appliances: 8 (30%)
Have time for home maintenance: 15 (56%)
None of the above: 10 (37%)

Q63. How many years have you lived in this house? (n=17)

Less than a year: 1 (6%)
Number of years: 16 (94%)

Q64. Approximately how many years do you plan to stay in this house, based on current plans? (n=17)

Number of years: 11 (65%)
For the rest of my life: 5 (29%)
Don't know: 1 (6%)

Q65. Including yourself, how many people currently live in your household? (n=17)

One: 4 (24%)
Enter number: 13 (76%)

Q68. And which [age] category includes you? (n=16)

25 to 34: 2 (13%)
35 to 44: 4 (25%)
45 to 54: 1 (6%)
55 to 64: 4 (25%)
65 to 74: 4 (25%)
Over 75: 1 (6%)

Q69. Please stop me when I reach the category that best describes your yearly total household income before taxes in 2010. IF NEEDED: Your best estimate is fine. (n=28)

Less than $15,000: 1 (4%)
$35,000 to $50,000: 2 (7%)
$50,000 to $75,000: 6 (21%)
$75,000 to $100,000: 6 (21%)
$100,000 to $150,000: 5 (18%)
$150,000 and more: 5 (18%)
Don't know: 2 (7%)
Refused: 1 (4%)


Q70. Which of the following groups best identifies you? Be sure to collect Hispanic heritage [Check all that Apply]. (n=28)

Asian or Asian American [inc. Chinese, Filipino, Japanese, Asian Indian, Korean, Vietnamese]: 5 (18%)
White or Caucasian: 25 (89%)
Spanish, Hispanic, or Latino: 1 (4%)

Q71. What is the highest level of education you have completed? (n=27)

Some college or other post-secondary education: 2 (7%)
Associates Degree or Technical degree (AA or AS): 2 (7%)
Bachelors Degree (BA, BS, AB): 8 (30%)
Some post-graduate: 4 (15%)
Master's Degree: 7 (26%)
Other professional or doctoral degree: 4 (15%)

Q60. Thanks so much for telling me about your audit experience. Would it be OK if I contact you again in a couple of months just to check about any recommended work you may have completed? (n=28)

Yes: 27 (96%)
Maybe: 1 (4%)


Appendix L: Post-Audit Survey Questions and Responses

Closed-Ended Frequencies

This section contains responses to closed-ended survey questions. Questions for which there is no closed-ended data do not appear. Similarly, response options with a frequency of zero do not appear. Open-ended responses appear in the following section. Unless otherwise specified, all questions have 134 responses.
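As a purely illustrative aside (not part of the original study), the following minimal Python sketch shows one way frequency summaries following these conventions could be reproduced from a raw survey export. The file name post_audit_survey.csv and the column name q4_familiarity are hypothetical placeholders, and the pandas-based approach is an assumption, not a description of how the study team actually tabulated the data.

    import pandas as pd

    def summarize_closed_ended(df, column):
        # Tabulate counts and percentages for one closed-ended question.
        # Blank (non-)responses are excluded, zero-frequency options never
        # appear, and percentages are computed over the number of valid
        # responses (n), matching the conventions stated above.
        responses = df[column].dropna()
        counts = responses.value_counts()
        n = int(counts.sum())
        summary = counts.to_frame(name="Count")
        summary["Percent"] = (summary["Count"] / n * 100).round().astype(int)
        return n, summary

    # Hypothetical usage:
    # df = pd.read_csv("post_audit_survey.csv")
    # n, table = summarize_closed_ended(df, "q4_familiarity")
    # print(f"(n={n})")
    # print(table)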

Q2. Next, I will read some statements regarding the energy auditor's visit to evaluate your home. Please use a scale of 1 to 5, with 1 meaning that you "strongly disagree" and 5 meaning that you "strongly agree”.

Counts by rating, 1 (Strongly disagree) to 5 (Strongly agree), then Don't Know and Refused:

The energy auditor clearly communicated with me during the home visit: 0, 1, 4, 24, 100; Don't Know 4; Refused 1
The energy auditor did a complete review of my home: 1, 2, 5, 31, 94; Don't Know 1; Refused 0
I felt comfortable asking the energy auditor questions about my home: 0, 0, 2, 12, 120; Don't Know 0; Refused 0
I have confidence in the energy auditor's work and recommendations: 2, 5, 7, 31, 86; Don't Know 2; Refused 1

TIME. Thinking about when the energy auditor was at your home, which of the following best describes your amount of involvement with the process. By 'involvement' we mean how much time you spent following along and observing the process. Were you involved… (n=131)

The entire time: 36 (27%)
Most of the time: 41 (31%)
Some of the time: 45 (34%)
None of the time: 7 (5%)
I was not home: 2 (2%)

Q3. Did you receive the report, including the scores, results, and recommendations for your home?

Yes: 124 (93%)
No, did not receive anything: 7 (5%)
No, received the report and not the score card or vice-versa: 1 (1%)
Don't know: 2 (1%)

Q4. How familiar would you say you are with the scores, results, and recommendations from the energy audit? (n=124)

Not familiar at all: 9 (7%)
Slightly familiar: 20 (16%)
Pretty familiar: 64 (52%)
Very familiar: 30 (24%)
Refused: 1 (1%)

Q6. How did your home's energy score compare to the average Seattle home? Was it... (n=114)

Worse than average: 47 (41%)
About average: 22 (19%)
Better than average: 28 (25%)
Don't know: 17 (15%)

Q7a. If you were selling your home, would you be willing to show potential homebuyers the energy score for your home? (n=114)

No: 14 (12%)
Yes: 94 (82%)
Don't know: 6 (5%)

Q7b. If you were buying a home, would you like to see the energy score for homes you are looking at? (n=114)

No: 4 (4%)
Yes: 108 (95%)
Don't know: 2 (2%)

Q8. As you may have noticed, the Score Card provides both an Energy score and a Carbon score for your home. Would you say that the energy score is more important, the carbon score is more important, both are equally important, or neither are important? (n=114)

The energy score more important: 42 (37%)
The carbon score more important: 2 (2%)
Both are equally important: 60 (53%)
Neither are important: 3 (3%)
Don't know: 7 (6%)

Q9. Next, I'm going to ask you some questions about the energy audit report, and the upgrades recommended in the report. In terms of providing useful information about the current condition of your home, did you find that the report had... (n=114)

Little to no useful information: 4 (4%)
Some useful information: 29 (25%)
Lots of useful information: 80 (70%)
Don't know: 1 (1%)

Q10. Regarding the recommended energy upgrades to your home, would you say that you understood... (n=114)

Some of the recommendations: 7 (6%)
Most of the recommendations: 32 (28%)
All of the recommendations: 75 (66%)

Q11. Have you completed any of the upgrades recommended from the energy audit? (n=114)

No: 53 (46%)
Yes: 61 (54%)

Q12. Considering all of the upgrades recommended by the home energy audit, which upgrades are you most likely to do? (n=53)

Enter Response (please specify): 51 (96%)
Don't know: 2 (4%)

Q12a. What is appealing to you about these particular upgrades?

Easy to do myself/ourselves: 10 (19%)
Can reduce home energy costs: 27 (51%)
Keep home at a more comfortable temperature: 15 (28%)
Improve the air quality of home: 1 (2%)
Increases home's value: 2 (4%)
Avoid or Reduce Waste: 3 (6%)
Other (Please Specify): 25 (47%)
Nothing: 1 (2%)
Don't know: 1 (2%)
Refused: 1 (2%)

Q12b. Are these upgrades something that... (n=53)

You would hire a contractor to do: 26 (49%)
Someone in your household would do: 11 (21%)
Some you would do yourselves, others you would hire a contractor: 12 (23%)
Other (Please Specify): 3 (6%)
Refused: 1 (2%)

Q12b1. Do you know specifically of any contractors who perform these kinds of upgrades? (n=38)

No: 12 (32%)
Yes: 26 (68%)

Q12b2. How did you find out about them?

From your energy auditor: 12 (46%)
Newspaper, television or radio: 2 (8%)
From a friend or colleague: 4 (15%)
Phone book: 1 (4%)
Internet: 6 (23%)
Other (Please Specify): 10 (38%)
Don't know: 1 (4%)

Q12c. What is the likelihood that you will do at least one of these particular upgrades? (n=53)

Definitely will not: 1 (2%)
Not sure: 7 (13%)
Probably will: 8 (15%)
Definitely will: 37 (70%)

Q12c1. Approximately how many months from now do you expect to start the first upgrade?

Less than 1 month: 5 (11%)
1 to less than 3 months: 12 (27%)
3 to less than 6 months: 8 (18%)
6 to less than 12 months: 9 (20%)
More than 1 year: 3 (7%)
Other (Please specify): 2 (4%)
Already started one or more upgrades: 4 (9%)
Don't know: 1 (2%)
Refused: 1 (2%)

Q13. Considering all of the upgrades recommended by the home energy audit, which upgrades are you least likely to do? (n=53)

Enter Response (Please specify): 40 (75%)
Don't know: 13 (25%)

Q13a. What is NOT appealing to you about these particular upgrades?

Won't reduce home energy costs: 3 (8%)
Too expensive: 26 (65%)
Other (Please Specify): 22 (55%)
Nothing: 3 (8%)

Q14a1 – Q14e1. Next, I'm going to ask you about the recommendations you have completed and how long ago they were done. What is the first recommendation you have completed? (n=61)

Enter Description of 1st Recommendation: 61 (100%)
Enter Description of 2nd Recommendation: 42 (69%)
Enter Description of 3rd Recommendation: 22 (52%)
Enter Description of 4th Recommendation: 11 (50%)
Enter Description of 5th Recommendation: 2 (18%)

Q14a2 – Q14e2. Approximately how many months ago was that completed? (n=61)

Less than 1 month: 28 (46%)
1 to less than 3 months: 52 (85%)
3 to less than 6 months: 53 (87%)
6 to less than 12 months: 3 (5%)
Refused: 2 (3%)

Q15. Did you or someone in your household do the upgrade(s), or did you use a contractor? (n=61)


Used a contractor: 23 (38%)
Someone in your household did: 28 (46%)
Some upgrades done yourselves, and some done by a contractor: 9 (15%)
Other (Please Specify): 1 (2%)

Q15a. What contractor(s) did you use? (n=32)

Enter contractor company and/or name (please specify): 29 (91%)
Don't know: 2 (6%)
Refused: 1 (3%)

Q15. How did you find out about your contractor(s)?

From your energy auditor: 13 (41%)
From your utility (utility company or utility bill): 2 (6%)
Newspaper, television or radio: 1 (3%)
From a friend or colleague: 11 (34%)
Internet: 5 (16%)
Other (Please Specify): 10 (31%)
Don't know: 1 (3%)

Q16. Next, I have a few questions regarding the overall audit process. How much would you say you learned about each of the following from your home energy audit? The current condition and energy performance of your home.

Count  Percent
Not much   5   4%
Some   26   19%
A lot   103   77%

Q16d. How much did you learn about the amount it would cost to improve your home's energy performance?

Count  Percent
Nothing   14   10%
Not much   19   14%
Some   51   38%
A lot   48   36%
Don't know   2   1%

Q16e. (How much did you learn about) the energy upgrade options that are best for your home?

Count  Percent
Nothing   2   1%
Not much   8   6%
Some   43   32%
A lot   80   60%
Don't know   1   1%

Q16f. (How much did you learn about) the resources that are available to help you complete upgrades, such as contractors, loans, or other experts?

Count  Percent
Nothing   16   12%
Not much   29   22%
Some   63   47%
A lot   24   18%
Don't know   2   1%

Q16b. (How much did you learn about) the amount of money you could save on energy bills by improving your home's energy performance?

Count  Percent
Nothing   8   6%
Not much   12   9%
Some   69   51%
A lot   44   33%
Don't know   1   1%

Q16c. What other benefits did you learn about that could result from improving your home’s energy performance?

Count  Percent
Keeping home at a more comfortable temperature   21   16%
Reducing drafts in home   12   9%
Improving air quality of home   6   5%
Fixing safety issues (radon, combustion safety, inadequate a…)   3   2%
Fixing moisture issues   2   2%
Improving value of home   9   7%
Save money/Lower bills   25   19%
Better for environment/Reduces carbon-footprint   13   10%
Other (Please specify)   36   27%
NONE   36   27%
Don't know   15   11%


Q17. Overall, can you identify anything that you didn't get out of the program that you hoped to when you first signed up for it?

Count  Percent
Enter Response (please specify)   61   46%
Nothing   71   53%
Don't know   1   1%
Refused   1   1%

Q18. What could be changed or added to the audit, score card, or report that might help you decide about making changes to your home?

Count  Percent
Enter Response (please specify)   80   60%
Nothing   43   32%
Don't know   11   8%

PRIOR. Had you completed any energy upgrades to your home prior to getting the energy audit?

Count  Percent
No   36   27%
Yes   95   73%

Q19. Did you already have any plans in mind for specific energy upgrades when you had the energy audit?

Count  Percent
No   53   40%
Yes   81   60%

Q19a. Have you changed your plans as a result of the audit? (n=81)

Count  Percent
No   35   43%
Yes   46   57%

Q19b. Have you made any new upgrade plans as a result of the audit? (n=53)

Count  Percent
No   17   32%
Yes   34   64%
Don't know   2   4%

Q19c. What did you learn from the audit that caused you to change your plans? (n=80)


Count  Percent
Enter Response (please specify)   78   98%
Nothing   2   3%

Q20. Did going through the home energy audit cause you or other household members to change how you use energy?

Count  Percent
No   99   74%
Yes   33   25%
Don't know   1   1%
Refused   1   1%

Q20a. What changes have been made? (Multiple Responses Allowed) (n=33)

Count  Percent
Turned off lights   10   30%
Turned off appliances/electronics   6   18%
Turned down heat   11   33%
Turned off heat   2   6%
Washed clothes in cold water   1   3%
Other (Please Specify)   24   73%

Q20b. What about the home energy audit experience caused these changes? (Multiple Responses Allowed) (n=33)

Count  Percent
Seeing your home's Energy or Carbon score compared to the average   2   6%
Actions were suggested by the auditor   5   15%
Learning about the problems with/performance of my home   9   27%
Seeing the 'Low cost or no cost strategies' listed in the report   5   15%
Other (Please Specify)   19   58%

SEX. Record R's gender, as observed.

Count  Percent
Male   79   59%
Female   55   41%

HEAR. How did you hear about the home energy audit?

Count  Percent
Utility billing insert (e.g., Seattle City Light insert)   39   29%
Newspaper, television or radio   10   7%
Family member   4   3%
Friend or colleague   30   22%
Contractor   4   3%
Internet   20   15%
Realtor   1   1%
Home show   1   1%
Seattle City Light (in general)   15   11%
Other (please specify)   20   15%
Don't know   9   7%

SKILL. Of the people who regularly live in your home, do any…

Count  Percent
Have home remodeling skills   57   43%
Fix objects and appliances   60   45%
Have time for home maintenance   90   67%
None of the above   31   23%

RES. How many years have you lived in this house?

Years   Count   Percent
1   5   4%
2   7   5%
3   4   3%
4   7   5%
5   6   4%
6   6   4%
7   3   2%
8   3   2%
9   4   3%
10   5   4%
11   8   6%
12   3   2%
13   4   3%
14   7   5%
15   4   3%
16   4   3%
17   5   4%
18   4   3%
19   1   1%
20   6   4%
21   3   2%
22   2   1%
23   2   1%
24   2   1%
25   2   1%
26   1   1%
27   1   1%
28   1   1%
30   3   2%
33   1   1%
34   1   1%
35   3   2%
40   1   1%
41   2   1%
43   1   1%
44   1   1%
45   1   1%
56   1   1%
Less than a year   8   6%
Refused   1   1%

STAY. Based on current plans, approximately how many years do you plan to stay in this house?

Years   Count   Percent
2   7   5%
3   1   1%
4   2   1%
5   18   13%
6   2   1%
7   3   2%
8   3   2%
10   27   20%
15   7   5%
20   13   10%
21   1   1%
25   1   1%
30   6   4%
40   2   1%
Less than a year   2   1%
For the rest of my life   27   20%
Don't know   12   9%

HHMS. Including yourself, how many people currently live in your household?

Count  Percent
One (R lives alone)   14   10%
2   47   35%
3   32   24%
4   27   20%
5   8   6%
6   3   2%
8   1   1%
Refused   2   1%

Q23A. Including yourself, please tell me how many people in your household are in the following age categories: (n=120)

Number in age category: 0 / 1 / 2 / 3 / Refused (Count and Percent of households)
4 yrs or younger:   100 (83%) / 10 (8%) / 8 (7%) / 0 (0%) / 2 (2%)
5-17:   70 (58%) / 36 (30%) / 10 (8%) / 2 (2%) / 2 (2%)
18-24:   99 (83%) / 14 (12%) / 3 (3%) / 2 (2%) / 2 (2%)
25-34:   109 (91%) / 7 (6%) / 1 (1%) / 1 (1%) / 2 (2%)
35-44:   80 (67%) / 19 (16%) / 19 (16%) / 0 (0%) / 2 (2%)
45-54:   60 (50%) / 31 (26%) / 26 (22%) / 1 (1%) / 2 (2%)
55-64:   79 (66%) / 25 (21%) / 13 (11%) / 1 (1%) / 2 (2%)
65-74:   101 (84%) / 12 (10%) / 5 (4%) / 0 (0%) / 2 (2%)
75 & older:   108 (90%) / 7 (6%) / 3 (3%) / 0 (0%) / 2 (2%)

R_AGE. Which of the following age groups are you in?

Count  Percent
25 to 34   4   3%
35 to 44   32   24%
45 to 54   43   32%
55 to 64   33   25%
65 to 74   14   10%
75 years and older   6   4%
Refused   2   1%


INCOM. Please stop me when I reach the category that best describes your yearly total household income before taxes in 2010. - IF NEEDED: Your best estimate is fine.

Count  Percent
$15,000 to less than $25,000   1   1%
$25,000 to less than $35,000   4   3%
$35,000 to less than $50,000   9   7%
$50,000 to less than $75,000   14   10%
$75,000 to less than $100,000   21   16%
$100,000 to less than $150,000   39   29%
$150,000 or more   27   20%
Don't Know   1   1%
Refused   18   13%

RACE. Which of the following groups best identifies you? (Multiple Responses Allowed)

Count  Percent
White or Caucasian   121   90%
Black or African-American   4   3%
Asian or Asian-American   7   5%
American-Indian or Alaskan Native   2   1%
Spanish, Hispanic, or Latino   3   2%
Multi-Racial Other (Please Specify)   1   1%
Refused   1   1%

EDU. What is the highest level of education you have completed?

Count  Percent
Less than 12th Grade (not a high school graduate)   1   1%
High School Graduate or GED   1   1%
Some College or Other Post-Secondary Education   8   6%
Associates Degree or Technical Degree (AA or AS)   4   3%
Bachelors Degree (BA, AB, BS)   51   38%
Some Post-Graduate   4   3%
Master's Degree   45   34%
Other professional or doctoral degree   19   14%
Refused   1   1%


Open-Ended Responses

This section contains responses to open-ended survey questions. Where multiple respondents made the same comment, the number of respondents is indicated in parentheses following the comment.

Q1. Thank you. Your responses will be kept completely confidential and your participation is voluntary. You can stop at any time and skip any item you don't want to answer. In general, why did you decide to sign up for the energy audit of your home?

Response Due to the high amount of electricity that my records show that I use, even though I think I'm being responsible, not turning lights on and things like that. Even though I have the family programmed and cooperating with watching usage and using power strips, I couldn't seem to put a dent in my total energy use. I pay bills online and saw the reference to the energy audit, and thought it might be a good way of discerning how the heat is dissipating in the house. The house was built prior to the 1970s with no thought to energy conservation at that time. It's on a hillside, no attic or basement. I thought maybe the audit would get me closer to finding out what I need to do. The auditor provided the information I needed, especially the information that came later by email. Energy and cost efficiency Even though we had changed over to gas heat, the electric company said we were using energy above average for pier homes that were using electric heat. Find out how drafty my house was For the environment and saving costs High electricity bills (2) High power bills and temperature differentials I always felt that I had cold spots and I wanted to check those out. I am an energy freak. My background is architecture and I am concerned with energy use. I am interested in saving the earth's resources, and saving money. It seemed like a good value for the money. I did it to save money and to try to use less energy. I had a $1,900 energy bill one month, when a normal bill should be about $200. I had an oil furnace that was giving me problems. Before changing to gas, I decided to get the energy audit. I had excessive dust. I had taken a number of steps myself and I thought that somebody who does this regularly would know more steps to take. I had to change my furnace. I was trying to decide whether or not to go with electrical or natural gas. I needed to know how it would affect my home and which one was worth it. I have an older home and was curious to how it was doing. I have wanted to do an energy audit for a long time. I like the discount that was given on the audit. I wanted to make my home more energy efficient and comfortable. I hoped to improve my energy consumption, reduce it, reduce my footprint, and save money. I just moved into the house. It has an oil furnace, so I wanted to make it as energy efficient around insulation and other items as I possibly could, having this albatross of a furnace around my neck. I just wanted to know more about my home's efficiency. I kept getting letters from the city comparing my energy use to my neighbors. I knew that my house was not efficient. I had gone through a self-audit that the utility provided, but I didn’t get enough answers. I knew that our house was pretty inefficient and that there were a lot of things that needed to be done but I had no idea where to start and so I wanted to find a way to plan what we could do that would be cost-effective.


Response I knew that we needed energy saving work done. I knew there was very little insulation in the ceiling and there was probably next to nothing in the crawl space. I wanted to see how I could use my funding more efficiently and not throw the heat away. I knew there were measures we wanted to take on the house, but did not know what specific things to do. I know my house is not as efficient as it should be and this is the first affordable modern audit available. I saw it offered and we were interested in being energy efficient, so we made an exploratory call to the city and it was pretty easy. I signed up because my house felt a little drafty. I signed up for the audit because there was a rebate and I thought they could answer some questions about how efficient our home heating was. I signed up in the hopes that I could figure out ways to save money on my energy and hopefully help the environment. I signed up to make the house more efficient and save energy and money. I signed up to reduce costs, save money, and save energy. I thought it was a good idea because money is always factor. What do you do first? I want to improve the energy efficiency of our home. I wanted additional information about my home. I wanted information that might tell me things I could do to lower our energy use. I wanted my home to be comfortable in the winter. I wanted to be more efficient and have it cost me less money to heat my home. I wanted to find out about energy leakage in my home. I wanted to find out if there was a way to both save money and help the environment. I wanted to find out what needed to be done to improve my energy efficiency. I wanted to get help to prioritize making improvements. I wanted to have my house insulated for some time and I was looking for rebates and deals that would help with that process. I wanted to know how efficient our home was. I wanted to make sure I was being efficient and to reduce my bills. I wanted to see if I could save some money. I was concerned about my high heat bills. I was curious to see how our house performs. I was curious to see what it would show after receiving very high energy bills. I was curious. I wanted to save energy and wanted to know what they would come up with. I was interested in determining whether or not the home was energy efficient. I was interested in potential savings. I was interested in saving energy and money. I was just curious to how our home could be improved a little bit. I was looking to remodel and wished to address any problems during the remodel. I was not having good heating and I just put in new windows. I was recommended by City Light to have it done because I have high electrical bills. I was watching the electrical bills for years. In general, the house is very cold in the winter and the electrical bill is very high. I wanted to reduce the bills but also make the house more comfortable. I wished to cut down on energy costs and increase the comfort of the home. I am also concerned about wasting energy and conservation of the environment.


Response I wished to find out the most cost effective improvements. I wished to improve the value of my home, reduce energy and save money. In the bigger picture, I am concerned about the environment and climate change. I would like to tighten up house energy usage and conserve energy. Improve both the heating and cooling of our home and to take advantage of the education that the audit would provide and the subsidy by the company City Lights. Increase the energy efficiency of my home Interest in conserving energy It is an ancient house that needs a lot of help. It was a good deal. It is good for the environment. It is an old house and we thought we might be able to save some money. We are into being as green as possible. It made good dollars-and-cents, a good offer for a fourth of the cost, and a good way to get up to speed immediately on where the priorities should be for what would be most cost-effective strategies to tackle, for quick fixes and planning ahead for long-run options. It seemed like a good idea to try to make the house as efficient as possible. It was a good deal money-wise and also something that we had talked about doing for some time. It was a new house and from the records shown from the previous owners it seemed like they were going through a lot of oil in the winter. It was free and sounded like a really good idea. It was free, and we knew we needed insulation. We figured since it was free and available. We would check what else might be worth doing at the same time. It was inexpensive, we had not done it before, and we had recently done a lot of work on our house and wanted to see what could have been done and what more we could do. It was part of a campaign in which we had a rain garden installed. It's part of my job to understand better the programs that are going on. Also, I wanted to see if I could improve the efficiency of the home and save some money. It's the first home I've ever owned and I wanted to know more about it. I don't know much about what to do. Looking for ways to save energy My electric bill is ridiculous. My heating bills are too high and I wanted to reduce the costs. My heating system and energy bills are not good. I wanted to see what I could find out. My wife actually signed it up and she misunderstood what the price would be. She thought it would be a good price. My wife and I have been working for several years to reduce our energy footprint and we know a lot of information. We wanted an outside party to evaluate how well we have done and see what else is needed. My wife wanted me too. We have an old house and just wanted to verify that what we suspected as far as energy efficiency. One of my friends had recommended doing it. Our bills are really high. Our electric bills were higher than our neighbors. Our home is an old bungalow that we have fixed up already. In spite of that, it's still drafty, so that was the main reason. The price of the audit was good with the city offering the rebate and I wanted to learn more about my house. Our house is ten years old. We wanted to improve our use of energy. Our house tends to be cold. We have a lot of widows and I was not sure about what places in the house should be worked on first.


Response Reduce energy usage and to improve the comfort of our home Save energy Save energy and help the climate Save money Save money in heating costs Save some energy The audit was a discounted price and I wanted to know what it would show. The heating costs were so high. The landlord took care of it. The utilities are expensive. There was a discount available through our utilities company. The house is also old and is not efficient and I knew that my bills were too high. This house is over 80 years old and I have only lived in it for a few years. I was unsure which walls were insulated or not and hoped the energy audit would help me figure that out so that I could know what steps I could take to improve the energy efficiency of the house. To check out how the energy is working inside the house, how we can improve saving heat, and keep the value of the house up To get a better understanding of what my home energy value is and to see what I can live with and live without To get an energy audit To see if there was anything basic that could be done to help me out with the efficiency To see if there were any cost effective things we could do to save energy To try to reduce my heating bills. I am also involved in the energy field, so I am very aware of issues of global warming all around. We are in an older home and we have been trying for years to maintain the efficiency and temperature. We are looking at doing some remodeling and want to incorporate some of the recommendations into the remodel. We felt like the heating system was generating enough heat, but there were still some comfort issues and drafts. We wanted to figure out how to best use our money to address energy issues. We had already done some things and wanted to know more specifically what other things we could do. We had always thought it was a good idea and the fact that most of the audit was going to be paid for helped us to make up our minds. We had just purchased a home and thought it would be a good time to do it, to help us prioritize improvements. We had made some improvements already to our home and we were not sure where to go next. We had one for another townhouse and had some work done and it showed a lot of improvement so we decided to have one done to our own home. We have a couple of cold rooms and wanted to know why. The house is new, and we want to save energy. We have a now 101 year old house so it was not built with energy efficiency in mind and we are trying to be responsible and save money. We have an older house and I know that it is not as efficient as it could be and I was looking for solutions. We have not been able to heat our house efficiently. We just bought the house last April and we wanted to identify things we could take care of to reduce the cost of the house. We also want to do the right thing. We try to be environmentally conscious, so to reduce the cost of our household and the cost of the city, the overall use of energy. We are in a public utility so we were keeping overall costs down. We just wanted our home to be more energy efficient and we knew we didn't know how to go about doing it.


Response We learned that there was a rebate available and felt that was a good idea. We own an old home and there's a lot we don't know about it. We thought the energy audit might help us understand more about the history of our home. It was really for saving energy, to see if there was something major we could do that would be cost effective. The audit seemed very reasonably priced and affordable. We purchased a home that is about 50 years old and it was obvious that it was not very energy efficient. We wanted to find out what needed to be done to reduce our use of energy. We were looking for validation that things like replacing the windows would help and to see if anything was missing. We wanted to have the audit before we made choices for the remodel. We wanted to know what energy we were losing from the home; air intake and heat loss. We were doing a remodel and wanted to take advantage of the program. We were getting ready to replace a roof on our house and we thought that it would be timely to address energy bills and undertake work.

Q2d1. Why don't you feel confident about the auditor's work or recommendations?

Response After he left I asked him some follow-up questions, which he never answered. It took quite a long time to receive the report. I don't have any reason to doubt the data, but he struck me as a bit unreliable. After the inspection was over, I realized that our chimney flute was open during the audit. So that means the measurements and estimations that she made were inaccurate. He never sent me any type of record on it. He was supposed to e-mail me, but never did. He was really in a big hurry. He had something else scheduled so he kind of had to get out of here very quickly. She really recommended nothing and we live in a hundred year old house. The recommendations were all pretty vague. I did not feel he was accurate. He was very detailed about some leaks and vague about others. Too much of it was boiler plate, standard paragraphs and not specific enough. They were not very specific. They said that everything was in normal range. They did not target any specific window, socket, or really say whether the walls or attic were insulated or not. They did not comment on the percentage of heat that might be lost due to the fireplace. I did not feel like it was worth it. The report could have read from a self-help book, it did not seem like it came from an expert.

Q2d2. Do you have any other comments about the energy auditor's work or recommendations?

Response Great about offering follow-up contacts and services He came back once we had the work completed and he spent some time beyond just looking. He really seemed to care that we got what we were looking for and that we understood everything. He did a very good job of communicating. He did an excellent job. He said we need some insulation on top of the house, recommended a company, and said if we wanted he would volunteer to come watch the guys do it and make sure they did it right. I would recommend him anytime. He was very nice, did a tour, let me watch, then sat down for a couple hours while writing things up and explained everything to us. I would hire him again anytime. He exceeded my expectations in terms of his deliberateness, the level of detail, the follow-up, the rational explanations of what was happening. He found something that I had somebody come to look at in our furnace system. There was a duct that wasn’t connected and was causing hot air to spill into the attic. He spotted that and we had it fixed. He seemed very knowledgeable, and was not at all pushy in terms of a sales approach. It was just informative


Response without any pressure. He seems very qualified and professional. I just want to check on some of the recommendations. He took a bunch of infrared photos and it would have been great to get copies of those. He was a pro, like Ghostbusters. He blinded me with science. He was a very personable and polite gentleman. He was absolutely thorough. He was awesome. He was complete. He was excellent. He was great to work with. He was great. The report reflected the comments that the auditor made during the survey. He was really helpful. He made himself available even after he sent us the report if we had any questions. He was terrific. He was very helpful and made himself available long after the audit and we have asked him questions since then. He was very professional and I would recommend him without any reservations. He was very thorough, knowledgeable, and very educational, in the sense that he imparted a lot of good information that we didn't already know. He was very friendly and professional. We would highly recommend him. He was very tolerant of my four-year old. His verbal recommendations and the information I learned from him in person was really, really helpful. The form I got from the city was not. I think the form was just too restrictive, especially these old houses are unique, and I didn't find that the characteristics of my house fit into the format of the printed audit that I got. I don't understand the information on my City Light account. My auditor did not give me much information. I expected somebody who was going to go into the back corners of the attic to check out the moisture situation. I felt like I could have done the energy audit because it was so simple. I felt they were top-notch. I guess I expected more information about little things like caulking around the windows. There basically was nothing I could do other than get a fireplace insert. I wish there were more things I could do. I never got the report from him. He did not turn off the furnace before he tested the air flow pressure. There was a switch to flip in the furnace. I never received a copy of the audit. They did do a good job and informed me well, but never sent the promised copy. I really looked into finding someone that I thought would be really good. I think that he was a very good communicator, explained things well and honestly, and had a very good technical understanding of what he was doing. I thought he was efficient. I thought he was really professional. He came back after some of the work was done and tested to see and make sure that it was effective and put together a list of items for the contractor to do. I thought it was very professional and he gave us a lot of good projects to work on. I thought that most of it was very good, a few things were overzealous but overall it was good. I thought there would be more follow-up, like a call. I wanted more detail. I wanted them to document more specifically the air leakage locations instead of a general recommendation. I was impressed and I liked him as well. I was impressed with the results. It is all rather confusing as to what to do, but it is still nonetheless very helpful.


Response I was very enthusiastic about the auditor's work. I was very impressed. I would have liked a local contractor name along with the costs and benefits of each recommendation. I would have liked that there was recognition of where my house is, who my energy providers are and what types of rebates, costs, and programs are possible for my personal case. I would have liked them to do a little more on the appliances. I would recommend him to anybody. I think he did a really good job. It really did not come back with recommendations that were useful. Our electrical bill stills runs high. It really didn't identify all of the electrical sources that we could cut down on other than light bulbs, and our heat is oil. It was very well done. It would have been nice to have a list of where I could get the things he was suggesting, where would be a good resource to do that, like specialists. The energy auditor was very knowledgeable about his recommendations and work. The energy auditor was very thorough and spent several hours doing the audit. He exceeded my expectations. The fundamental flaw, in my view, of the audit is that the recommendations are not tied to solid estimates to get the work done. The range is too wide and the information is not adequate for a contractor to do an estimate to get the work done. Virtually every contractor I talked to would not do it based on the audit, but wanted to do their own walk through. The only thing that they recommended was a heat pump; one on the main floor and one upstairs. Every auditor seemed to have some business, that they were going try and sell me something. There's no way to get around that, but that was how it was. The recommendation was for a heat pump. We do not have central heating so we cannot have a heat pump. The report could have been clearer about the work recommended. I was satisfied overall. The two auditors were very professional. The work he did appeared to be quite good, certainly he was congenial and answered questions. There were a couple of things that he said that I doubted and I thought were not right. They took longer and were very thorough. They were absolutely wonderful, very professional, and had a lot of information and suggestions for me. They were great. They were personable and accommodating. It was a pretty cool experience. They were very clear, thorough, nice, and the process was very easy. They were very creative and approached it with a very open mind. They were great at figuring-out how to "keep the outside in and the inside out." They were very thorough and their recommendations were really good. Very knowledgeable, seemed like he invested in equipment We loved it. It was awesome. We researched him before we selected him through Angie's list. He was highly recommended and we appreciated his knowledge and report.

Q5. What was your reaction to the energy score?

Response A little surprised with one of them, related to how much energy was lost through the walls of the house. Otherwise, nothing too surprising. All approximately what I expected to find, because I was there while they were doing it. No surprises. All were approximately what I expected to find. I was there while they were doing it. No surprises.


Response Annoyance As I recall, the overall score wasn't good. That's because we have oil heat, which sort of damned us from the get go. I felt bad about that, like there was nothing I could do. We learned that we have no insulation in the house. Confusion Disappointed, but not that surprised Energy score was kind of like whatever. The score did not matter much to me. Gratitude I accepted it as the truth. I was happy because there were not issues to work on. I did not do very well, but my house was built in 1892, so it was not much of a surprise. I did not place much stock in the energy score because a lot of it was calculated based on things he looked up rather than things he actually calculated. I did not believe it, to be honest. I was more interested in comparing our big old leaky house to other big old leaky houses, rather than small new houses. I expected it. I felt it was probably right on target. I felt kind of lost towards the end. It would have been nice to get recommendations for specific contractors and people to help us do the recommendations. I felt like it was too low given how I live. I found them useful. I guess on the one hand I was surprised that our electricity consumption was not the biggest source of energy waste. I thought that unplugging lights, turning off our computers would have a measureable effect. I found out that it was insignificant in relation to the affect that fixing areas where I need to insulate more would have. I had no reaction, it is what it is. I kind of expected what they told me. I know that I keep the house up. I have re-built things and have done roofing. I felt like the house was in very good shape but there are places that I could never look. I knew the insulation was good because the houses that were affected by the airplane noise got new windows and insulation replaced. But the score was not anything that I did not expect. I learned that we needed to improve our insulation. I looked at and understood that it was an average number. I really do not remember. I thought it was pretty accurate. I was a little bit surprised. I did not even realize that in the crawl space there is no insulation. I was a little surprised, but not too much. I expected a bad score. I was a little surprised. I was a little surprised. I thought we were a bit better than we were because we have done a fair amount of energy improvements over the years. I was appalled. It is a hundred year old house and yet I was surprised by how high the score was. I was happy that I did well compared to other homes and a little sad that other homes did so poorly. I was not surprised about the score. I was surprised where the heat was going out. Heat was leaking through the basement. I was not surprised since we knew that our house was in need of upgrades. I was not surprised, I agreed with it. I was not surprised; it was generally a favorable score. I was not surprised. (2) I was not that surprised.


Response I was not too surprised. I was pleased. I was quite pleased. It could have been much better or perhaps much worse. I was satisfied. I was slightly surprised. I was somewhat surprised in certain ways. I was surprised at how inefficient my home was. I thought it was in much better shape. I was surprised it was so bad. I was pleased with the level of details in the report. I was surprised it was so good. I was surprised that it was higher than we thought it was going to be. I couldn't tell you the exact number, but it was higher than the number they are recommending homes move towards. I was surprised that my house did well because it is kind of old. I was surprised that the energy score was so high. I was surprised that we did much better than Seattle Public Utilities indicated we would. I was surprised. My home uses a limited amount of energy. I was surprised. My score was better than the average. I was very disappointed in my energy score. Clearly there is much to be done. I recycle about 80% of the air in my house. I wasn't surprised because I knew that it is an old drafty house that loses a lot of energy. I expected it to be bad like it was. I wasn't surprised, but it wasn't very good. It sort of confirmed my beliefs that the house was pretty leaky and not efficient. I wasn't surprised. In general, it was what we expected. Indifference It confirmed our feelings that the house was not energy efficient. It confirmed what I suspected. It could have been better, but I was not surprised. It didn't mean very much to me, because a score is a comparison. I don't have a recollection of the score right now. The score itself didn't mean much, but the details mattered a lot. It opened my eyes to what the house needs. It seemed like it all made sense. There was some enlightening information. It validated some things I had thought about. I was surprised when they used their infrared thing and there was some real energy loss. I was also surprised that we could use more insulation in our attic. It was about in range of what I thought, although I thought I would be a little lower in energy because insulation and energy work was done earlier. As it turns out, he did not do the job he was supposed to do. It turned out our insulation work was done with R19, when he should have used R30. It was about what I expected. (2) It was alright. It was appropriate. It was better than I had expected. It was better than I thought it would be. It was disappointing, but not surprising. It was excellent. I'm glad we have it so we know what the energy is like in the house.


Response It was expected. We knew we had problems. It was interesting, but difficult to know what to do with it. It was lower than I thought it would be. It was maybe what we thought. It was not surprising because the auditor had gone over it with me while visiting the house. It was pretty much what I expected. I would say it was useful. It was pretty much what I had expected, though we did better on the air infiltration aspect. It was reasonable, probably where it should be. It wasn't very valuable to me. It was very helpful information. Somewhat overwhelming, we have a hundred year old house. It was what I expected. It was what I expected. I thought I would be maybe a little better insulated. It was what I had suspected because we have a fairly small house and have done a number of things to insulate. It was what I suspected. Kind of what I expected My reaction to the energy score was interesting. I had a beef with the way the data was given to us. I was surprised that all they did was show us an average, not our specific data. It was invalid for our home. I would like to get our actual data. Neutral No particular reaction, about what I expected. I was interested to read it but no particular reaction beyond that. No surprises. It was actually a little better than I expected. Not a surprise, it was expected Not surprised (4) Our home performance was lower than I expected it to be. Our reaction was that it more or less confirmed what we already suspected but it was good to get specific information. Positive Surprised Surprised, I thought it would be worse. That really we couldn't benefit much from the minor changes that were recommended The energy score was about what I expected. They were what we expected knowing the last updates of the house. We also had stayed in the house for a full- year and knew that the house is colder in the winter. We had several things to do, and that was what I was trying to get done. It was fine. We were a bit below average, which surprised me. We were doing better than I thought we were. We were not surprised because we know the house well. It was a benefit to see it as a score. We were surprised. The house was built to pretty high energy standards and we have pretty low energy bills but he rated the house differently. Well, one can always do better.

Q8a. Why is that more important to you?


Response Concerned about the environment, global warming, think globally act globally. Energy score is more important to us because it's dollars and cents. I guess the other one matters too, there may be some correlation, but not sure how. Energy score is the amount of power I'm consuming. It impacts both the carbon score, and the amount of money I could be saving. I am not up to date with why the carbon score is important. I can do more about the energy score than the carbon score. I do not pay a lot of attention to my carbon footprint. I do not trust or understand the carbon score. I get tired of this carbon stuff. It is a term I do not really understand. It has never been put in a term one can truly measure to me. I look at it as a more specific and direct cost. I think it is something that I understand more. I think it ties in more to things we can control. It costs me money. It directly reflects the construction of the home and savings we might receive from doing the improvements. It feels more accurate. It has a more direct, tangible impact back on the costs and expenses. Carbon emissions are just one component of a home's overall environmental impact. It has more personal application. It has to do with how much I'm going to have to pay for oil. It is more immediately affecting the dollar costs. It is more important because it relates to carbon and it also relates to the costs. It is much more representative of my house and reflective of what is important to me. The carbon score is based on where the energy comes from but there is no guarantee that that is what my carbon footprint really is. It is saving me money. It is something I feel more in control of and also it relates more directly to my bill as I perceive it. It more directly translates into costs, home energy bills. It reflects costs. It related directly to the pocketbook. It seems like it is more important long-term wise. It seems to be a more accurate reflection of reality without further education about the carbon score. It translates easier into dollars. It makes more sense from a household budget point of view. It's a greenhouse, world, climate-changing thing. While energy is important, some energy sources are better than others. Money in the pocket. One is a surrogate for the other. I can do something about the cost. Saving money That determines the amount of energy loss we are having or not. That was the thing we could do something about. The bills are really high. The carbon one was not that high; maybe because I was more focused on the energy part. The carbon score rates a very low risk issue. The energy score is more important for economy and home comfort. The energy score is important for the comfort of my home. It is related to the carbon score. The energy score is where I think I can make the impact. I can do something more about it with my house.


Response The energy score relates more directly to energy costs for my home. The energy score relates to actual money you pay. It is hard to quantify how much carbon would cost you. The energy will save us money, but the carbon score does not seem to. Those are things that we can detect the difference. I do not have confidence that carbon scores are something that we can really impact on. To not utilize so much natural resources We use oil to heat our home and are concerned about the cost of doing so.

Q12. Considering all of the upgrades recommended by the home energy audit, which upgrades are you most likely to do?

Response Add insulation to the walls or add a solar water heating system Adding insulation in the attic, insulating the ventilation ducts Additional insulation in the attic and basement area Air leakage issues Air sealing in the attic, insulation in the attic, blown insulation in the walls and replace washing machine At this stage, I am not likely to do any of the upgrades. Attic insulation and stopping the cold air leaks Caulking inside rooms, insulation in the crawlspace Change light bulbs Fireplace insert He suggested putting perimeter insulation between foundation and floor joists, insulating wall and space between joists in attic, and then closing off one area of my attic. I had not considered these and thought they were very useful suggestions. Heating system, sealing heat loss because the ceiling is open to the roof causing heat loss, and new roofing I am most likely to add some insulation in the storage room. When we eventually replace the roof, we will check in to how to better insulate it. I also want to improve the windows of the downstairs by putting in double glazed windows. I am most likely to do the insulation in the crawlspace, switch to a ductless heating system that has a unit outside of the house, replace the heat pump, add solar heating panels, and weatherstripping. I am not sure. There are issues with all of them. I am concerned with their impact to the home and moisture issues. It is complicated. I don't have recommendations to complete. I might need to do minor caulking on the roof and fix the air exhaust vent. The air exhaust vent is leaking a little. I will do the duct sealing and the attic insulation. I would add insulation to the bedroom and living room. I would like to do air leakage recommendations, duct ceiling recommendations and ceiling and attic insulation. I would like to do all of them. I am a single mom with one income, so it depends what I get help with. I would like to fix the crawlspace holes and seal the duct work. I am losing thirty percent of my heat by the crawlspace. I would like to fix the insulation in the attic and fill in the chimney. I would like to fix the opening in the attic. I'll try to do all of them, probably the cheaper ones first like attic insulation. The main reason I haven't done the big ones, like replacing the furnace, is they are going to cost a lot of money. Insulate my crawlspace and the attic


Response Insulating the attic, blocking open spaces in between the attic and the living space Insulating the crawlspace, attic, and downstairs Insulating, weatherstripping, and closing holes in the floors around the radiators Insulation (3) Insulation and air sealing for the attic Insulation for the attic, sealing the windows and replacing the furnace Insulation in the attic and crawlspace Insulation in the outside walls Insulation of the attic Insulation upgrade in the attic, replacing the furnace from an 80 to a 90% efficient, and sealing up the door and windows Insulation, plugging air leak gaps and window upgrades Lower cost options for ceiling and air loss, rather than more costly insulation and so on. Cracks, crevices, little escape places, rather than having whole floor in crawl space insulated. These are low investment and high payback options. It depends on finding out more about these weatherization dollars I just found out that they are offering in our area from those Obama programs. New furnace and caulking the windows Probably just the weather sealing because the others are longer term and more expensive. Probably none because virtually all of the recommendations were not able to generate savings in energy to any significance. Our house is fairly new and generally pretty energy efficient. Reduce airflow upstairs, to reduce loss of heat and movement of air upstairs Replace the furnace Sealing for air-exchange, replacing the furnace and heating ducts, weatherstripping, and purchasing an Energy Star dishwasher and refrigerator Sealing the ducts. Specific issue for cold rooms, we have decided what to do about them. Some of the things having to do with leaks in the frame We are most likely to finish insulating the attic and sealing certain access points in the attic and crawl space. We are most likely to insulate the attic; however we need to get it tested first. That will probably slow us down. We are losing the most energy at the pull-down attic. We are not very enthusiastic about changing the entrance because it will interfere with us entering and exiting the attic. We bought pipe insulation and ordered new bay window, but that is not yet installed. It will be an ordeal because of installation and dry rot. We will also do insulation as recommended on floor in the attic and the ceiling in the cellar, and get a balloon for the fireplace. We will do all of them. We will upgrade the insulation in the attic, seal air leakage, change our hot water heater, and change our dimmable lights to LEDs. We will work on the insulation and sealing the house.

Q12a. What is appealing to you about these particular upgrades?

Response
Age of furnace means it is due for replacement in next 5 years
Ducting was free, and solving the problem we hope.
Electric bill is twice what others pay for larger houses.
Heat always rises
I think that making the recommendations will require additional work. I'm not ready to make changes.
Improve efficiency of home
It will be quick to fix at low cost.
Low up-front costs with higher returns. Monthly bills are a drag, up-front costs are expensive. Oil heat is very expensive, and it's 1000 or 600 each time they fill the tank.
Not too expensive
Not very expensive and useful
Reduce energy use, our carbon footprint, and lower the cost of heating the house.
Replace older equipment
Return on investment, both time and money.
Save energy and reduce carbon output
Saving energy
The lower-cost improvements were more appealing.
They are cheap.
They are straight forward and don't require any major capital expenditure.
They are the least expensive of the repairs.
They have the best cost-benefit.
Those made the report something glad that I did.
To reduce our carbon footprint
We are losing heat.
We had not considered them, but they make a lot of sense. We had always thought it is cold in those particular areas.

Q12b. Are these upgrades something that… Other (Please specify):

Response
I will get a contractor friend to teach me how to do insulation.
I would ask my neighbors.
It depends on what Seattle Housing can do for me, they will probably use contractors.

Q12b2. How did you find out about them? Other (Please specify):

Response
As part of the recommendations from the remodel we are currently doing
Flyers at Home Depot
Found them from the list and did research before choosing auditors; chose same contractors who did the audit.
I am a real estate agent. It is part of my job.
I have known my handyman for over 20 years.
I went to an energy fair and they had a table that talked about the audits. I later went to a meeting at Shoreline City Hall that was sponsored by City Light.
Puget Sound Energy recommended them over the phone.
Puget Sound Energy's list
Researching solar hot water heaters and Puget Sound Solar
We are in the industry.

Q12c1. Approximately how many months from now do you expect to start the first upgrade? Other (Please specify):

Response
Summer (2)

Q13. Considering all of the upgrades recommended by the home energy audit, which upgrades are you least likely to do?

Response
Air exchanger
All of them
Crawlspace insulation
Full floor insulation
Furnace upgrade
He suggested replacing our oil with geo-thermal recirculating heat pump. I am doing all of them. I would say that none of them are least likely.
I am least likely to take the siding off and put foam board under the siding.
I am not likely to replace the appliances, CFL lights, and windows.
I am not planning on doing any of the upgrades.
I will not do the hybrid water heater and window upgrades.
I will not likely replace all the windows.
Installing solar thermal systems
Insulate the crawlspace
Insulate the walls (2)
Insulation everywhere
Insulation in the basement
Insulation under the floor in the crawlspace.
LEDs
Lighting recommendations
None (2)
None, I would like to do all of them.
Outer wall insulation
Replace my furnace
Replace our water heater with a heat pump
Replace the refrigerator
Solar panels
The in-line hot-water heater
The roof because we have insulation in the roof already, but it is not as efficient.
There were not any. They all were relatively easy and made a lot of sense, so I probably will do most of them.
They didn't recommend upgrades that were really not possible, for instance I have a 1940s house, so it is probably not very possible to spray foam into the walls. What they suggested are very doable, and they didn't suggest anything undoable. We will need to put off replacing storm windows with double pane.
They only gave me one and I am planning on doing that.
Wall insulation
Water heater replacement, heat pump
We are least likely to put on solar photovoltaic.
We will not do work around the fireplace this year.
Window coverings
Windows and extra foam insulation

Q13a. What is NOT appealing to you about these particular upgrades? Other (Please specify):

Response
A lot of work
Dirty job
Fairly invasive
Getting into the crawlspace and handling insulation upside down
I already have window coverings up and I would have to redo them all.
I will need to use an electric water heater and for the windows, it is not cost effective. It would require a lot of modifications for it to work.
Lack of clarity in the report
Long return on investment
Longer payback time
Low payback
Overkill; they measured humidity after we had a week of heavy rain and we were right on the border, in a typical situation I don't think we need it.
Poor cost-benefit
That would require gutting the house or punching holes into every wall.
The quality of energy lights are not as good quality as regular lights.
There are no others.
There is a very thick siding on the walls and there is some insulation in it already. There is no return on upgrading.
They may not be too effective.
We have brick exterior and plaster interior walls, so it will not be easy to do. We have insulation in there already.
Who wants to put insulation in? Getting them done is not appealing, but they are not hard.

Q14a1. Next, I'm going to ask you about the recommendations you have completed and how long ago they were done. What is the first recommendation you have completed?

Response
Added insulation to the attic
Additional attic and wall insulation
Attic insulation
Balloon for the chimney
Better insulated our doors
Blocked off air that was seeping into attic
Caulking
Caulking and insulating the basement
Caulking on the back door
Ceiling caulking
Changed all the light bulbs in the house to compact fluorescent, vented the dryer out of the attic, CO detector, testing knob and wire tubing
Changed out a door slider
Crack sealing
Ducts sealed and insulated
Energy efficient bulbs
Enhancing the attic insulation
Fixed ductwork joints in the attic
Fixed the air duct that was leaking air
Fluorescent lights
Gas furnace
Had my new triple-pane windows sealed properly. We just installed them before audit, but they had not been caulked properly and were leaking all around the edges.
He found asbestos where we needed to seal the seams and fill in insulation. I have already had someone come over here from an asbestos company and they made further recommendations.
Improve insulation in crawl space
Improving attic insulation
Insulated attic
Insulated attic hatches
Insulated roof
Insulating around whirlpool and master bath
Insulating the crevices in the attic
Insulating the receptacles
Insulating water pipes
Insulation added to the attic and all of the walls that did not already have insulation in them
Insulation added to the walls and to the attic
Insulation around the attic door and trying to put weather insulation around our outdoor doors
Insulation in the basement
Insulation in the crawl space
Insulation of furnace room
Insulation of the house
Insulation under the kitchen
Insulation work
Made our door close properly
Mini split heat pump
New insulation around the windows
New insulation below our house
New windows in the whole house
Plugged up holes and gaps using foam
Plugging some air-gaps in the basement walls
Redo the crawlspace, put down a new vapor barrier, rodent traps
Replaced and upgraded furnace
Sealing around the windows and doorframes
Sealing points of leakage in the attic and basement
Switched the light bulbs to energy efficient
Tent on the attic access
Upgrading weatherstripping around several doors
Ventilated the bathroom fans
Wall insulation
Weather sealing upstairs in the attic
Weatherizing interior and exterior doors
Weatherstripping doors (2)
Weatherstripping in the attic

Q14b1. What is the second recommendation you have completed?

Response
Add vents or roof jacks to the attic
Added attic door
Added insulation to crawlspace
Air seal and weatherize under floor air ducts
Air seal the roof, walls, and windows
Air sealing in several parts of house, especially the basement
Caulking around the places where I put paneling up
Ceiling, doors, windows, and chimneys
Changed 80% of light bulbs to CFLs and LEDs
Chimney balloon and everything related to air and ceiling insulation
Fixed holes in the attic
Fixed penetrations into attic
Floor installation in one area
Furnace
Installed fluorescent light bulbs
Insulate rim joist
Insulating a wall and space in downstairs basement
Insulation of the water heater
New cat door
New furnace
New insulation put around the joints of the pipes, venting around the house was caulked and sealed, water pipes wrapped
New weatherstripping around all of our doors
Replaced all of the exterior bulbs
Replaced door lintels in basement
Replaced light bulbs (2)
Replaced windows
Sealed and caulked vents and opened areas
Sealed the house, the leaks and insulated the walls in the attic and garage
Sealed the light fixtures
Sealing air leaks (2)
Sealing around windows and chimney
Underside of the floor insulated
Updated our crawlspace, insulation, and duct work
We added caulking up to the attic and in a front closet below the attic stairs. This was where they found the most significant flow of air out of heated portion of house into unheated portion.
Weatherstripping (2)
Weatherstripping around doors
Weatherstripping downstairs, including the door to the basement
Weatherstripping in ceiling and attic
Windows retrofitted

Q14c1. What is the third recommendation you have completed?

Response
Ducts cleaned
Energy efficient fans in the bathrooms
Estimate done for weatherizing doors and windows
Fixed a sliding door that was leaking air
Foam-seal air entry points
Furnace was short-cycling, so I had it serviced and replaced filter, and installed a new programmable thermostat. I am now on my third replacement programmable thermostat.
House insulation around heat ducts
Installed a glass door on fireplace
Installing energy efficient windows
Insulated space by furnace in basement
Insulating and improving insulation in parts of the basement
Insulation in the attic and exterior laundry room
Insulation inside the house to isolate warm areas from cold areas
Knob and tube wiring removed from the attic
Removed extra refrigerator
Replaced furnace filter
Sealed air leaks in crawlspace
Sealing air space in the attic
Sealing leaks from the attic
There was water in the crawlspace which got resolved. We changed the slant of the roof which diverted downspout. We made a new cover for the crawlspace which prevents rainwater from draining into it.
Weatherstripping
Weatherstripping backdoor

Q14d1. What is the fourth recommendation you have completed?

Response
Added blown insulation to the walls
Added insulation to the attic
Attic, cupboards, and chimney caulked, sealed, and insulated
Closing off downstairs fireplace flue, sealed and insulated it, added a chimney cap
Estimates done for sealing and insulating remaining ducts downstairs
Got an Energy Star washing machine
Insulate hot water pipes
Insulated foundation wall separating habitable space from crawl space
Replaced light bulbs with compact fluorescents
Replaced regular light bulbs with energy-saving ones
Use ceiling fan when cooking

Q14e1. What is the fifth recommendation you have completed?

Response Added air sealing to double hung windows

Q15. Did you or someone in your household do the upgrade(s), or did you use a contractor? Other (Please specify):

Response I used a contractor for the heat pump. The air leaks we have been doing ourselves.

Q15b. How did you find out about your contractor(s)? Other (Please specify):

Response
Added air sealing to double hung windows
Costco
I had used them before. (2)
Puget Sound Energy and some kind of network of contractors that recommended each other
They have always done the work on our furnace and they came to fix something that they should have already taken care of.
Took bids on furnace. Researched potential matches using city's audit list of prescreened contractors before choosing a local auditor.
Used same contractor previously for work at home, and got recommendations from existing contractors for new kinds of work.
We saw ads in Puget Consumer Co-Op Natural Market Newsletter.
We have a handyman on call.
Word of mouth plus bids

Q16c. What other benefits did you learn about that could result from improving your home's energy performance? Other (Please specify):

Response
Dust reduction
Energy conservation
General understanding of some of the issues involved like recommended amount of air to pass through house.
Hot water venting safety, technical things like vacuum fan and air pressure gauges demonstrating cold drafts through normal heat exits
I also learned that the investments in the repairs would cost too much to save in the end. I am not sure what to do first and which would be the most cost effective.
I felt empowered and curious to work in and repair the crawl space under the house.
I have shared information from the program with other people and recommended the program.
I learned about changing the heat sources. The auditor introduced me to heat pumps.
I learned information about my carbon footprint, ways to improve comfort, and learned about options to improve quality of lighting.
I learned so much. The report was comprehensive roadmap. As a bonus, I learned about critters getting into crawl space in spite of barrier.
I learned that there was no barrier between my ceiling board and the roof. This is very important. It would actually rot out stuff up there, my house was getting damaged and I was experiencing heat loss. I will be putting in that barrier to save my house and save energy.
I now have a to-do list to improve my home.
I understand my house a lot better. I can now see where there are problems.
Idea of insulating walls to save energy (2)
Infrared camera detecting heat leaks under wood trim
It displays a fairly wide range of action fairly systematically. It categorized things that could be done but it doesn't really provide enough information about what the goal in each category is.
It is not very hard to make our house more energy efficient.
Keeps house quieter
Light bulbs
Light improvement, less energy but more light with LEDs
More energy efficient
More information about comfort and carbon benefits
Protect pipes from freezing, window coverings to keep it cool in summer and warm in winter, different heating systems that might be useful
Relative ranking of the improvements in terms of payback time
That it would probably cost about ten times as much to find out what I want to know. I thought I was getting an expert.
The auditor specified about changing to different light bulbs.
The benefits for the furnace repair
The ducts and some other things in the basement were rated poorly.
The existence of a carbon score.
There were more leaks than I was aware of. It confirmed how to improve a space we were undecided about. The auditor was good at considering aspects of house, lifestyle, and usage.
They replaced incandescent bulbs with fluorescents and we found out that we are a candidate for solar.
We have a crappy furnace.
We looked at windows, ceilings, corners, cellar, and the crawl space. They were specific about advantages of insulating specific areas. Some things I learned were that the bathrooms were vented to the outside. The grates in crawl space were closed so they opened them for ventilation.
Which upgrades were the easiest and the most cost effective

Q17. Overall, can you identify anything that you didn't get out of the program that you hoped to when you first signed up for it?

Response
A direct connection to utilities and their specific programs
A more complete thermal reading and a better understanding of where all the heat leaks are coming from
Better and specific recommendations for energy savings
Get a before and after breakdown of how helpful the audit was, a projected analysis of what I am saving now and how that will look in the long haul
He did not send me my report and that was the main thing.
Help with the follow through
I cannot say that there is anything because I really did not know what to expect.
I could have used more specific recommendations on insulation. I would have like more information and recommendations for people who could do the work. I did not get subsidies for the insulation.
I feel like we learned a fair bit, but it would be nice to know specific resources to get specific, more obscure fixes like a chimney balloon.
I got everything I expected.
I got more than I expected. I was very pleased with everything they pointed out and their professionalism and affability.
I had hoped that I would be able to find out more about how much electricity various appliances use in the house.
I had hoped to get some more specific information about electrical appliance usage.
I had thought that they would mention more about the windows; however they said that they were okay.
I have a kiln, and the contractor didn't really have any advice on energy savings for the kiln.
I have mixed feelings with including the quotes. It feels like a sales push.
I haven't gotten the report yet.
I hoped to get a clearer more detailed report that was easier to act on. The report has all sorts of different sections but you have to jump back and forth to get a picture of what it is. On a page for attic and ceilings, they should have associated actions and notes. It just says "See General Notes" everywhere. It is not a very tight report.
I hoped to get contractor recommendations with a specific certification called BPI. I asked the auditor for recommendations during, before and after the audit, but he didn't respond.
I hoped to get resources or references to online, reputable ways to evaluate weatherization contractors. A follow-up with how to do it yourself, education would have been nice. Had I not been home and able to follow the auditor through the house, I probably wouldn't have gotten much out of the audit, nor would I recommend it to other people. I'm a person who believes in this, so that was disappointing.
I think I was looking for a little bit more feedback. The guy just forgot to get back to me on something but I did see him later and get it figured out.
I think it would be helpful to have more research information regarding house energy efficiency online that I could read about that is provided by a government agency so that way I know it is not biased.
I thought I would get some report that would actually truly reflect correct information and that I would get recommendations for people that could help me.
I thought the report format would be understandable, but I do not understand it.
I thought that electricity would have played a bigger role. A lot of what I have read in the press says that computers and TVs waste a lot of energy, but really it's not much of an energy waste. That's also partly because we live in the Northwest and have access to hydroelectric power.
I thought that I would get answer to questions and learn new things.
I thought that it would be a little more thorough inspection. Like going into places where most people cannot access on their own. Most people have no clue that they have moisture in their attic.
I thought that they would tell me to change the windows to double paned windows and they did not.
I thought the process of having the auditor come in was really good. Largely subsidized by the city, nice guy, good job, learned a lot. We were excited to do things that would make a difference. If the auditor at the end of the process had asked if we wanted some bids, we probably would have done some of the upgrades right away, but instead the follow up from the city was not as good, just the report, and it didn't make clear how to proceed in a timely way.
I thought there would be more things I could do.
I was disappointed by the energy score. I had heard about it, but I found the actual score confusing. I kind of questioned the methodology of it. The guidance was excellent.
I was hoping it would identify specific things that could be changed to improve our electric bill. Instead we got specific things to improve our heating bill, which does not help us at all.
I was hoping that the company that was doing the energy audit could go ahead and do the ceiling insulation. It looked like they were too busy to do that.
I was looking for an order of projects to do and I received that not from the auditor but from a handyman who came after.
I was reasonably impressed with how it all went, it lived up to expectations.
I wish the auditor could have told me how many amps my swimming pool pump was drawing.
I would have been interested in basic information about solar water heating, tankless water heating and other deeper retrofit measures. I also didn't get good information about the relative costs of estimates, rebates, or tax incentives. They did not do a good job of going over costs and financing.
I would have liked more clarity on what types of things would be better to do on your own and which would be better with a contractor.
I would have liked to get three good estimates to do the work and a firm idea of what the costs would be for those estimates.
I would like to have another energy audit after I have made improvements.
It would be good to have contractors, it would be good to know if I am supposed to be forcing insulation into my walls. I do not know where to begin with the holes in my basement or how to seal them. I did not get any possible solutions. I also got a furnace that is now heating my basement, making it too warm.
It would be nice if they listed what you need to fix, the contractor who could do it, and the possible costs. The language was very generally about the price, but dollar figures would be great.
It would have been nice to know the contractors that can do some of this stuff.
More photos of the different walls with the heat gun they provided some examples, and I was looking at the gun as we walked around, but I don't remember which walls were worst. There were only a couple of examples in the final report.
No, because I really did not know what to expect.
No, I got a chance to ask questions and follow-up.
No. I was afraid it would just be some guy spending 15 minutes and leaving, and I was thrilled that it took several hours and the guy was knowledgeable, technical, and believable. I trusted him.
Nothing, I was very happy with it and recommended it to other people. It is such a cost effective way to figure out what needs to be done instead of sitting around talking about it.
Recommendations for some things that are above and beyond like if the house is good for solar thermal.
The amount of improvement with each upgrade was a little unclear, it would have been nice if there was more precise or detailed information about upgrades and the typical recoup on investment time.
The biggest disappointment was that the rebates that make homes more energy efficient have ended. I guess that is up to Congress.
The biggest frustration was that it took much longer than we expected to receive the certificate. It seemed like it took a lot longer for the program to get up and running.
The detailed energy reports were not as easy to understand as I hoped they would be.
The house is colder after the audit was conducted. It may not be his, the auditor's fault. I was hoping to learn tricks to keep the house warmer, but it seems to be the other way.
The mailed report pretty much and more details on how I could have saved money.
The more quantitative assessment of the actual energy use
There was no information about our all-electrical based appliances; no assessment of our dishwasher, washing machine or refrigerator.
There were things that I thought might be wrong with the house that are not. We expected that there would be some specific things we could do and we really didn't get out of it what we thought we would.
They had shown the equipment for infrared cameras and they did not have them available during the audit. Considering that I did not get an adequate job done from the previous contractor, I would have liked to know if he left any holes in the wall.
We have a mysterious air-flow problem due to having a very old house. We told them about that, and they did not put a lot of energy in helping us solve that issue. We did not want just a boiler-plate audit and we wanted something specific to our home. We wanted to know why our energy bills were so high.
We weren't necessarily advised on the best way to seal the air leaks. The auditor told us our relatively new Energy Star water-heater was bad.

Q18. What could be changed or added to the audit, score card, or report that might help you decide about making changes to your home?

Response
A follow-up option with contractors or with the auditor themselves after the changes have been made to meter and track how many were performed and how effective they are
A list of things that would be most effective to do immediately
A lot of the changes that he recommended were pretty nonstandard so having information available that could help us do them either on our own or with help would be very helpful.
A one-page comparison of the energy changes that a person could make; and you could base it on energy saved, money spent, all over a period of time; and they can then figure out what things are worth it and not
Accurate data on the individual energy use
Although the numbers are there, and you can certainly juggle them around and look at them, it would be nice to have them in a grid form where you can compare the initial cost range, payback time, and so on. Also provide a really quick eyeball graph of the top priority recommendations and likely outcomes.
An exact cost for insulation, the percentage of heat that it might save, fireplace inserts, if necessary. I needed a whole lot of answers and there are endless questions.
As mentioned earlier, making a direct connection for how I can improve my house with my specific utility provider.
Better cost information, maybe focus on the tax rebates that are available
Clear breakdowns of the cost benefits for each of the suggestions
Cost estimates would have been helpful, but hard to do in the period of time they were here, or without getting contractors in to do the estimates. That's what we're going to do now.
Detailed quotes
For each item, if they could give me the payback time for the cost of making the change.
For me, it would have been helpful to have an introduction to the process. Looking at the whole energy audit process in a general sense, I jumped into it without knowing what to expect. Having a summary saying these are the benefits you might expect like lower energy bill, comfort, light needs and heat needs.
Give me a realistic cost and a variety of contractors that the actual auditor has experience with good ratings and reports to the city
Have information flow into the specific energy performance score pages so it doesn't just say See General Notes. It just wasn't as refined as it could be to be more helpful.
I am getting a fireplace insert.
I don't think the score card is very helpful to me because I am too detail oriented. I had decided I needed to make changes before the audit, so the scorecard just gave me ideas.
I have been to several places to look into costs. It would be an advantage to have some information sheets with the costs, for instance the cost of insulation materials, and cost of time for installing insulation.
I think he went around and took pictures with the report. There should be pictures included.
I think just making it more easily accessible. I remember that the report was really difficult to find through the website. If it was an e-mail attachment it would be better.
I think that it helps to have some resources in writing, like where to go to test something or contact about completing the work. That would get me in the right direction.
I thought it was excellent. My only question when I got the report was how much might have been biased based on what the company could provide in terms of upgrades. Maybe they could give a second opinion in the report? The company didn't recommend that we replace windows in the basement that I thought were letting in a lot of air and they recommended insulation. I just wondered if that was connected to the fact that the company that did the report did insulation.
I would have appreciated an additional two or three contractor referrals; maybe a total of four.
I would have liked better communication with the people who provided the audit report. I did not hear back from them with the issue of the chimney flue being open after sending one e-mail.
I would have liked more detail about each item that needs to be corrected.
I would have liked to talk to a contractor in the audit process. I would have liked three estimates in the audit and clear information about financing and rebates.
I would have liked to know about the ceiling under the roofing and the flooring and if it is rotting or not.
I would like a better cost/benefit analysis and recommendations for the order of improvements.
I would like information on government incentives for high cost improvements. I would like low interest rates available to make improvements.
I would like to compare the cost and carbon footprint of using oil, gas, and electricity as primary sources of heating my home.
I would like to learn about incentives and tax credits for the home improvements.
I would like to see more helpful information about electricity usage for appliances and other devices.
I would like to see the energy upgrades for the energy company and gas company to have listed together so that the house owner could compare.
If I saw the report, I could probably respond to this question better.
If Seattle City Light was able to give them a three year history of my consumption to use that numerically to better estimate the costs of the potential changes. It would be a nice addition if there was more of a direct financial consequence in the data.
In my case there was nothing really. If my circumstances had been different, I would have liked easier access to home improvement loans and guidance on how to get them.
It is pretty thorough. I don't know who does specific work. If I ask a contractor to come here it will take 8 years to do it because it is such a small project. It might have been nice to have a resource guide.
It was easy to understand.
It would be good if they were ranked in order of importance.
It would be nice if they listed what you need to fix, the contractor who could do it, and the possible costs. The language was very generally about the price, but dollar figures would be great.
It would be nice to know what the carbon footprint is of the new products that would be put in the home in order to make the home more efficient. One of the things I often think about in terms of lowering carbon emissions, is the action someone is proposing I take, by doing it, am I contributing less carbon into the atmosphere? Or am I actually contributing more carbon? What's the carbon footprint of my new sliding-glass door, vs. leaving the old door in place and continuing to heat with a little more gas? Kind of the cradle-to-grave analysis of what I'm doing, accounting for the carbon emissions embedded in the object itself. That would be a very valuable piece of information for any consumer that wants to put in new, energy efficient products.
It would have been nice to be referred to someone who can help you take action on the recommendation. I would also like more information on deeper retrofits like solar hot water.
It would help if they had an amp meter so I wouldn't have to hire an electrician to measure the pool's energy use.
List of qualified contractors in the area
Longer access to the report and score card. I do not know if that is a program issue or of the auditor. After I logged in, my results had expired.
More detail on fireplaces
More information about costs for the upgrades
More information about retrofitting older homes
More information about some of the options, more environmental information about the options
More lists of contractors, seems like that was pretty limited
More specific records and notes about problem air infiltration areas
No, everything was in there. I asked him while he was doing the audit and he said no, it would all be in the report. It's all here, from basement to attic.
Not sure, it was pretty thorough.
One of the problems we have is oil heat and a lot of the audit system is driven by reducing your intake. Oil doesn't compare well to gas or electric and they don't have access to how much oil has been used in the past. That's the one part of information I don't have.
Our house is really leaky and I think the form could allow more detailed information about how to improve the air tightness of the house. The recommended upgrades detail is a good list and what I would refer to for making improvements.
Provide some follow-up, like a call telling me what I should do next. A list of people I could call that could answer some questions.
Rate the actual efficiency of kitchen appliances as well as specific draws on the energy used by the furnace and fan. I would also like to know if the previous installment of insulation was done properly.
Simply the report
Some sort of priority order in which to do upgrades
Some type of appliance questionnaire
Sometimes they have programs where you get a big discount when you do what they say. I was hoping to find a refund on my bills. A list of contractors from the auditor would have been nice.
Source of grant money
Specific recommendations to the electric appliances
Stating what should get done first
The audit assumed that price was the way everything should be measured. Something about the carbon savings as opposed to the money savings, system change-out options like a new furnace
The audit was pretty well designed. We had already done a lot of research, so we didn't get as much additional information as someone new to the process would.
The city should provide better rebates for people that live in the North Seattle area.
The cost of making such changes could be included in the report.
The most effective thing would have been for the auditor himself to come back, because we already have a relationship. Provide some research or recommendations on how to get the next steps done.
The problem is decision-making and I am not sure about what could be added to help that.
There could have been a more complete list of home energy contractors attached to the report and an outline of recommendations prioritized by effectiveness with a very rough cost associated with them.
There wasn't a lot about what various options cost, or potential savings of either energy or cost. After talking to them, they can't give you hard numbers and don't want to commit to something they can't do, so maybe it's not reasonable to include it. But it would be very helpful. They gave general ideas about fixing particular areas, but they couldn't derive specific information as to how effective that would be.
They may have already done this, but the difference in saving on bills
They should provide more information on alternative equipment and projects, specifically on the different heating systems available.
They were very conservative and did not want to tell me about huge investments that I would have to make.
To know the difference between a contractor that provides energy audit services and the auditor who only does energy audits. The latter does not have their interests involved with businesses. We heard from both and it seemed like they were both speaking different languages, because one was not concerned with selling a product but informing us. Everyone should know this difference.

PRI_A. What energy upgrades did you do?

Response
A new front entry door and the windows on the west side of the house.
About half of the windows were replaced, as well as the furnace and water-heater.
All new windows, insulation, new furnace
As part of recent remodels, I had done double-paned windows, some wall insulation, and CFLs.
Attic insulating
Attic insulation, new washer, dryer, and dishwasher, all fluorescent bulbs
Changed about half the light bulbs in the house, put in some insulated windows, new furnace
Changed the windows to thermal pane windows and placed acrylic storm windows.
Door weatherstripping, window striping, replacing a furnace for a more energy efficient model, and placing a damper on the fireplace chimney
Double-pane windows and changed appliances to Energy Star
During a previous remodel, I put in additional insulation and had fixtures upgraded. I also replaced a few bulbs with efficient ones.
Extensive sealing of air leaks
High efficiency furnace, new water heater, refrigerator, washer, dryer, all appliances had been replaced
I bought some energy saving appliances and redid the upstairs windows with double glazed windows.
I have added attic insulation, efficient lighting, and window shades.
I have replaced some windows and filled out gaps in the outside of the home. I have kept the furnace tuned up. I have added wall insulation, crawl space insulation, and a high energy efficient gas furnace.
I have lived in the house for over 20 years and done too many upgrades to list.
I have replaced the furnace.
I have replaced windows with storm windows or double-pane windows. I have added weather-stripping.
I have storm windows, programmable thermostat, I insulated the attic and the garage, I weatherstripped the doors, installed a bunch of compact fluorescent lights and even an LED light for a hallway. I've set all of my computers to sleep when we are not using them and we had blue cellulose insulation in the walls.
I hired a contractor to insulate my walls and ceiling. I had also contracted through Puget Sound Energy to have the basement roof insulated.
I insulated the attic.
I put in more energy efficient lighting.
I put in new thermal windows.
I put in new windows and doors.
I replaced all of the windows to better insulate.
I replaced half my single-pane windows with double pane windows. I got appliances that are more energy efficient. I put in a more energy efficient roof by replacing the dark asphalt roof with white PVC.
I replaced the windows.
I switched to a high efficiency gas furnace and I upgraded several windows.
I upgrade the furnace system 2 years ago, and got a new heat pump system and tankless hot water. It had more to do with use of gas and electricity, than with energy audit motivations. I also started with small things and awareness, like light bulb replacement and cooking style adjustments to use microwave more and stove less.
Installation of four triple-pane windows in two bedrooms
Installed more efficient furnace and on demand water heater, insulated walls that had not been insulated before.
Insulated all the walls, part of the floor and ceiling
Insulated ceiling of garage and storage area, added more insulation in attic
Insulated some electrical outlets
Insulated the attic
Insulated the basement walls and ceiling including the garage, upgraded windows, addressed water intrusion, switched from oil to gas furnace
Insulation around our hot water heater, changed light bulbs, new Energy Star appliances
Insulation in attic and fluorescent light bulbs
Insulation in our attic and basement ceiling, we had put in some new windows
Insulation in the attic and around the window seals
Insulation of attic and walls, high efficiency furnace and hot water heater
Insulation was added to the ceiling of the garage
Insulation, caulking, storm window, solar PV and solar hot water
Insulation, caulking, window replacements, and weather-stripping
Insulation, replacement of windows, built another floor with attention to energy, furnace was changed out to more efficient furnace about 15 years ago
New furnace, heat pump, new appliances, window covering
New heating and cooling system, insulation, weatherstripping, and more energy efficient appliances
New heating system, solar hot water, weatherization work and mostly insulation work
New insulated windows, insulation in the attic
New windows
New windows, new furnace, new ducting
New windows, replaced the oil furnace with a gas furnace, a new refrigerator
Oil heater was installed
Prior to the audit we added wall insulation, upgraded windows, added ceiling insulation, did a limited amount of sealing, and upgraded appliances.
Putting in double-pane windows and adding insulation to the attic
Putting insulators behind outlets and receptacles
Replaced furnace with super-efficient model, replaced water heater with more efficient model, installed insulation in a lot of parts of the house. Eight years ago, I did a complete gut and remodel of my house, so many of these things happened during this process.
Replaced single-pane metal windows, rewired outlets, and switched lights to dimmer switches
Replaced some windows, insulation was blown in the outside walls, replaced appliances with energy efficient ones, switched to fluorescent light bulbs
Replacing incandescent bulbs and replacing our windows with dual pane windows
Sealing doors and windows
Solar tube installed, plugged the chimney, some weatherstripping
Some weather-stripping and replaced some windows
Storm windows and attic insulation
The Airport package includes insulation under the crawl space, in the attic, around the windows and the walls. I also closed-off the porches so that they are now not part of the house, just porches.
Thermal panes put in for the windows and insulation in the attic
Upgraded furnace and the windows
Wall and ceiling insulation in the attic
Wall-to-wall carpeting
We added insulation in the attic and the basement. We bought a more efficient furnace.
We changed our light bulbs and we had put some insulation in, double windows
We did a significant remodel where we insulated the space between the garage and the main part of the house, upgraded the furnace, and upgraded insulation.
We double-paned all of the windows upstairs and replaced four sliding doors. We got a better water heater.
We got a more efficient furnace in 2007.
We have put in windows, insulation up in the attic, and caulking around leaks.
We have put insulation into the walls of two rooms. We have also put in two new windows and a new slider.
We have replaced light bulbs, added water saving faucet devices, and updated our windows. We use an energy efficient washer and dryer and turn down the heat.
We installed a new water heater and put in a new furnace about 4 or 5 years ago.
We installed Energy Star appliances and have started with replacing about half of the windows.
We put in new, more efficient furnaces, a new efficient hot water system, and a new roof.
We re-wired the house, replaced the light-bulbs, and replaced a 60-year old boiler and some insulation work.
We remodeled and put in double paned windows. The ceiling is insulated; walls are 2x6 instead of 2x4's. We put in a heat pump. As things have come along and we have been able to afford them we figured it would be a good investment.
We remodeled our kitchen and added new windows. We are putting in natural gas heat and a new furnace and water heater.
We replaced all of the old windows.
We replaced all of the windows and we put in some insulation. We replaced the garage door.
We replaced the water heater, bought a new fridge, appliances, and made sure they were high efficiency.
We replaced the windows and switched to energy saving lighting.
We shut off the connection between the upstairs and downstairs so that the heater is not running all night trying to keep both floors warm.
We started to put in some insulation in the attic.
We worked on wall and attic insulation. We wrapped the ducts under the house ourselves. We installed three-paned windows in most of the rooms. We also sealed the doors with new thresholds and a new furnace.
Weatherstripping
When we did some of the remodel we insulated and put in double paned windows throughout the house. We also added weather-stripping to the doors and put in what is called a mud-room in the back to prevent drafts. We installed Energy Star appliances; the washer and dryer, the dishwasher and the laundry machine. I am cheap and did not want to spend a lot of money wasting resources.

Q19c. What did you learn from the audit that caused you to change your plans?

Response
Additions to insulation that we will add to our plans
Getting information about what things were the biggest problems and which ones should get addressed first. He pointed out that some things were probably not as cost-effective and doing something else first would be a better idea.
How leaky the house is
I added new plans as a result of knowing more details, about what specifically needs to be done.
I bought an energy efficient blind.
I did not change plans exactly, but did make plans based on audit. I learned what areas that I could improve energy efficiency and where to throw money first. That's what I plan to do. It doesn't tell you whether to improve home energy or take a vacation to Europe, however.
I did not get any answers. The conditions are nothing I want to live with, it is just too cold.
I did not think that it was worth the investment to change my roof and insulation.
I found out that there were higher priorities than the windows. I was quite surprised how much air sealing was needed, especially in the basement. I considered doing a door test when I remodeled and after seeing the results of the audit, I really wish that I had.
I have created a priority list of the recommendations that need to be done.
I know now that I need to do something with my basement and do some duct work. All of the heat is going into the basement and going up into the house.
I learned about an option to work with Seattle Housing Program that could possibly help me do upgrades.
I learned about leaks and gaps in the structure of which I had not been aware.
I learned from the contractors that came to give quotes, that the windows were not needed to be changes in comparison to other things.
I learned that I was losing heat around two of my doors, so I added additional weatherstripping around those doors.
I learned that it would be more cost-effective, simpler, and easier to seal off our basement from the crawl space, than to seal it off from the house. I also learned that I have a lot of small steps to take before I finish off the basement, to make my house tighter, before taking on some big projects.
I learned that the crawlspace insulation and duct work would save the most energy in my home.
I learned that these heat pumps work at a very low temperature and that I would not need to have a backup, and I also learned about the lack of barrier between my ceiling and my roof that would be a huge change in terms of energy and cost savings and preserving my house.
I learned the extent to which simple things can help a lot. Learning how much air I am turning over was a real eye-opener. It has caused me to focus on the basics like weatherstripping.
I learned what the most important projects were and which ones to prioritize.
I learned what was most effective to do first. I may also build a better seal for attic access, because I learned how much heat was escaping.
I now have a greater understanding of the difference between sealing holes and insulating the home. I had not made enough steps toward sealing.
I realized how much the vents are leaking from the furnace and now it feels like the bigger priority.
I was losing energy through basement, rather than house walls.
I will be more likely to focus on attic insulation, rather than just the floors and walls.
Improvement on where to put the insulation, more information about the furnace options helped me decide to purchase the furnace sooner.
It helped me prioritize what needed to be done.
My daughter found a coupon for the discount on the audit. So we did it. We followed the recommendations from the audit. Now we find that the house is warmer, and we are saving on light thanks to the City Lights bulb replacement program.
Prioritize the areas that will give the most bang for our buck with energy improvements
Prioritized specific places to put more insulation
Probably to select doors that are even more efficient than what we originally planned
The amount of air and filtration was more than I had expected.
The audit created the plans.
The audit helped me choose between various upgrades.
The auditor didn't recommend that I do the things I had planned to do in terms of replacement of windows and insulation in the basement.
The big problems or energy wasters were different than I thought.
The cost was not as high as we had thought.
The degree to which our current space is energy inefficient
The draft test that he did where we could really see where things were leaking, it was a very powerful visual for us.
The efficiency is not bad in my home. The changes I would like to make are too expensive. I cannot afford them.
The energy loss in the house; we needed to do some insulating.
The furnace replacement wouldn't make as much of a change as I thought it would.
The insulation is the most important aspect to fix.
The insulation of main floor walls is not particularly effective; taking other steps would provide better heat efficiency.
The insulation underneath the building was insufficient.
The main thing I learned is that our ceiling insulation was way lower than it should have been. I didn't think it was time to replace the furnace, but I learned that it was.
The overall importance of the attic insulation and good seals in the attic
The size of the house
The specifics already mentioned, about air travelling through joists, and the absence of baseboards in the new addition we built is a big source of air getting in.
There is some money savings
There was a lot to do. It would not cost that much money, it would make the house more comfortable, and lower bills.
There were a few health and safety issues that had higher priority, less expense and greater benefit to us to perform first. The larger projects could be spread out over a couple of years.
There were other things to do that were worth doing.
There were things, like the insulation, that I can do myself.
They told me I was leaking heat up the fireplace.
We are going to do the insulation differently than what we had planned on. We couldn't put insulation up there until we redo all of the wiring.
We didn't commit to anything until we got the report from energy auditor. We just bought this house and we wanted to do things in the correct and most cost effective order as recommended by the energy auditor.
We didn't have any plans. They pointed out the windows and validated that we were losing energy.
We had just been laissez-faire, so the audit forced us to stop dilly-dallying. This winter was a bad winter, so the audit sealed the deal.
We had a very low energy score and that we needed to improve it.
We had thought that we were losing energy through our back doors and we found out that we were not. We had started to replace those and we ended up not going through with it.
We knew our house was losing a lot of heat, we just did not realize how bad it was and that the basement was the cause.
We learned about some of the more important areas that need work.
We learned specific places where we were losing energy.
We learned that for our house there were other areas, aside from the windows, that we were planning to replace that would be cheaper. He said that it would be cheaper to add in insulation in the attic and seal the gaps between basement and the main floor. This saved us a lot of money.
We learned the extent of the energy loss. We learned where the problem areas were.
We learned that the air leaks in our house were very significant. They also brought to our attention an issue with the duct work and a gas leak in the water heater.
We thought leaky doors would be a major issue, but apparently not.
We thought that windows were the answer because they feel cold, but do not actually lose as much heat as the roof does. We want to change to another furnace. We will do a tighter seal for the attic. It will be high quality.
What we learned about the furnace means we have to do that first, it is malfunctioning.
Where my home was inefficient
Where things we could afford to fix are and needed attention
Windows did not need to be changed.
Yes, I chose to get my oil heat.

Q20a. What changes have been made?

Response
Changed lighting
Changing the type of light bulbs
Followed up on simple fixes like gaps and leaks in doorways
Have teens take shorter showers.
I started using Energy Saver light bulbs. I try to take shorter showers.
Increased awareness
Installed light bulbs
Less water has been used in irrigation
Modified programmable thermostat from factory energy saver setting to more accurately reflect when I need heat, and at what temperatures.
More mindful about shutting the door and turning off the water
My husband will actually put on a sweater.
New shower heads
Power strips for peripherals like TV and computer
Put timers on some of the lights
Rely on the heat pump instead of the furnace
Turning off wall plugs
We changed the settings on the water heater.
We close doors to block off unused areas of the house.
We close the door to the basement and don't heat the basement by closing the ducts.
We more aware of when we're using energy, like closing doors during the day and opening them again at night to cool things off.
We take shorter showers and are more aware of general energy conservation.
We used to turn outlets off in the house but we found we were only reducing comfort. We also replaced all of the incandescent with compact fluorescent lights.
Wearing warmer clothes and shutting the doors to rooms that are not being heated.

Q20b. What about the home energy audit experience caused these changes? Other (Please specify):

Response
Air circulation
Basically learning how to improve our habits
Behavior changes matter now, there is a more tangible connection between the audit and what I expect will happen when we follow through on changes.
Following the auditor around and seeing these places where air is entering our house.
General information from the report
If I am going to pay a certain amount of money to repair aspects about my home, than it means that I am more committed and aware.
Increase in sensitivity by having more hard facts about where you may be wasting resources or money. It was all in one place and relevant.
Infrared cameras that showed where drafts and leaks were
Just awareness
More information on the upside of doing that and viable alternatives
More knowledge of actual costs of unused energy
Savings
Seeing how wasteful we are.
The auditor gave me the energy saver light bulbs.
The auditor's knowledge was excellent.
The information presented
The information that was given caused the changes.
The knowledge of the energy that we were wasting
The whole audit process caused me to think about my energy usage and expand my environmental awareness.

HEAR. How did you hear about the home energy audit? Other (Please specify):

Response
At work, I work for a utility billing company.
Bowing provides a grant for making a Rain Garden, and through them we learned about Seattle City Light's energy audit.
Chinook book coupon
Community meeting
Earth Day fair in Shoreline, Sustainable Works table
Flea Market fair
Green Home Fair, Phinney Ridge
I called the city after getting hostile letters from them about our energy consumption.
I work for City of Shoreline. We are doing a Sustainable Works Program. Then I found out that a colleague's husband works for Revolution Green Power. He offered to come out and do the audit for me.
Landlord
Local neighborhood blog
My wife was at a community event, there was a booth or kiosk.
Neighborhood blog on computer
Neighborhood meeting
PTA
Puget Sound Solar
Rain Garden
Seattle Tilth, Harvest Festival booth
Sustainable Wallingford
The House that Saved the World video


RACE. Which of the following groups best identifies you? Multi-Racial Other (Please specify)

Response Indian

THEND. Thank you very much for your time. Your responses will be very helpful in making home energy audits more effective. Do you have any questions or comments about the survey?

Response
A credited follow-up audit would be very interesting to see how the recommended upgrades we did actually worked out and impacted our efficiency.
Double thumbs up. It is a great program, a great motivator, reasonable price especially given the normal cost. The discount is a great way to get people hooked. If the special price makes people interested in having the audit done, then once they've spent the hundred dollars, they are probably more likely to do something about it, to make the most of that initial investment.
I think that everyone should be required to do this. It is amazing information.
I wish you would have called at a time closer to the audit so then I would have remembered the information better.
I would have not had the audit done if it was not subsidized by the city. It provided very valuable information.
I would strongly urge those who are doing it to put more attention on the second half of the job. To get people to do things, we are halfway there.
It is a great program, keep it up.
It is very long and I do not think you are going to get very many people through it.
It seems difficult to me to know what to do with the results of the audit. There ends up being about 15 things to do. It is difficult to calculate a cost-benefit ratio with all of the recommendations given.
It was a little long.
My cell phone number is [omitted].
One of the factors that will cut down on the energy efficiency measures that we take is that there aren't enough federal and state incentive plans. It is just way too expensive.
Prior to the audit, we had already done a fair number of improvements including attic insulation, double glazed windows, ceiling air infiltration and a new furnace. These items that I just listed are inexpensive and very effective. The next stage of additional energy efficiency improvements would require taking apart the siding on the building and would be very expensive. To some extent, we have reached the maximum point for payback at the moment. That explains why even though we have had the audit we may not choose to pursue additional measures. The program is great and verified the measures we took.
The auditors were very busy, so it took 2 months to finally get the audit emailed to me. I had to email them 3 or 4 times. The scores were of less value than I thought they would be, but the recommendations were fantastic.
The survey must be very important but they should have conducted it closer to the time of the audit.
This survey was very informative for me in a good way, to review what was done.
We are heavily involved in the process of trying to figure out how to replace our oil burning heater. It's very expensive transitioning to gas. We have to pay for them to dig up the street.
We did everything that the energy auditor told us to do. We were very pleased with the energy audit and were glad that we had completed it.
You did not ask how many square feet the house is.
You should ask how old our house is, it seems like that would affect the responses.



Appendix M: Retrofit Survey Questions and Responses

Closed-Ended Frequencies

This section contains responses to closed-ended survey questions. Questions for which there is no closed-ended data do not appear. Similarly, response options with a frequency of zero do not appear. Open-ended responses appear in the following section. Unless otherwise specified, all questions have 159 responses.
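The tables that follow report a count and a percentage for each response option. For multiple-response questions (such as Q1 below), percentages appear to be calculated against the number of respondents rather than the number of mentions, so a column can total more than 100%. The short Python sketch below is illustrative only; it is not the project's tabulation code, and the record layout, question ID, and option labels are assumed for the example.

    # Hypothetical sketch of how a closed-ended frequency table can be rebuilt
    # from raw survey records. Each record is assumed to map a question ID to
    # the list of options the respondent selected.
    from collections import Counter

    def frequency_table(records, question_id, base_n=None):
        # Keep only respondents who answered this question.
        answered = [r[question_id] for r in records if r.get(question_id)]
        # Percentages use the respondent base, not the number of mentions,
        # so multi-select questions can sum to more than 100%.
        n = base_n if base_n is not None else len(answered)
        counts = Counter(option for selections in answered for option in selections)
        return [(option, count, round(100.0 * count / n)) for option, count in counts.items()]

    # Example with made-up records; "Q1" and the option labels are placeholders.
    records = [
        {"Q1": ["Other household members", "Friends"]},
        {"Q1": ["Friends"]},
        {"Q1": ["Contractors", "Other household members"]},
    ]
    for option, count, pct in frequency_table(records, "Q1"):
        print(option, count, str(pct) + "%")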

Q1. Since the energy audit, have you discussed your energy score or upgrade recommendations with any of the following people...

Count Percent
Other household members 130 82%
Other family who don't live with you 70 44%
Work colleagues 60 38%
Friends 103 65%
Neighbors 53 33%
Contractors 84 53%
Your Energy Auditor 74 47%
Anyone else (Please specify) 6 4%
None of these 6 4%

Q1a. Did you learn anything from them that was new or different than what you learned from the energy audit? (n=153)

Count Percent
No 112 73%
Yes 41 27%

Q1c. In general, when discussing your upgrade recommendations, did people... (n=153)

Count Percent
Encourage you to do upgrades 64 42%
Discourage you from doing upgrades 1 1%
BOTH encourage and discourage you 22 14%
NEITHER encourage nor discourage you 61 40%
Don't Know 5 3%

Q2. Have you done any other follow-up research on your home energy score or upgrade recommendations?

Count Percent
No 87 55%
Yes 72 45%


Q2a. What other research have you done?

Count Percent
Books (in general) 4 6%
Internet (in general) 34 47%
Seattle City Light Website 4 6%
Other (Please specify) 46 64%

Q2b. Did you learn anything from your research that was new or different than what you learned from the energy audit? (n=72)

Count Percent
No 38 53%
Yes 33 46%
Don't Know 1 1%

Q2d. Did you find that this research has generally... (n=72)

Count Percent
Encouraged you to do upgrades 36 50%
Discouraged you from doing upgrades 8 11%
BOTH encouraged and discouraged you 12 17%
NEITHER encouraged nor discouraged you 16 22%

Q3. Have other household members been involved in the home energy audit process?

Count Percent
No 82 52%
Yes 77 48%

Q4. Was there any one person in your household who was particularly interested in your home's energy performance? (n=77)

Count Percent
No 10 13%
Yes 67 87%

Q5. Was any part of the audit process particularly helpful in convincing household members to consider completing energy upgrades? (n=77)

Count Percent
No 24 31%
Yes 52 68%
Don't know 1 1%


Q6pa. When we previously talked to your household, you indicated that you had completed the following upgrades: Since then, have you completed any other recommended upgrades? (n=26)

Count Percent
No 18 69%
Yes 8 31%

Q6p1a – Q6p2a. What is the first other recommendation you have completed? - Previous upgrades mentioned in the last survey were: (n=8)

Count Percent
Enter Description of 1st Recommendation 8 100%
Enter Description of 2nd Recommendation 3 38%

Q6p1b – Q6p2b. Approximately how many months ago was that done? (n=11)

Count Percent
Less than 1 month 2 25%
1 to less than 3 months 4 50%
3 to less than 6 months 4 50%
6 to less than 12 months 1 13%

Q6rt. Have you completed any of the upgrades recommended in your home energy audit report? (n=133)

Count Percent
No 57 43%
Yes 74 56%
Don't know 2 2%

Q6r1b – Q6r5b. Approximately how many months ago was that done? (n=74)

Count Percent
Less than 1 month 9 12%
1 to less than 3 months 16 22%
3 to less than 6 months 46 62%
6 to less than 12 months 76 103%
More than 1 year 1 1%
Don't Know 5 7%

BILLS. Considering all the upgrades you have done since having the home energy audit, how much do you think these upgrades have reduced your energy bills each month? (n=100)


Count Percent
Not much 46 46%
Somewhat 30 30%
A lot 10 10%
Too early to tell 11 11%
Don't Know 3 3%

DIFF. Besides possible cost savings, have you noticed any other differences in your home since completing the upgrade(s)? (n=100)

Count Percent
No 29 29%
Yes 69 69%
Too early to tell 2 2%

DIFF2. Have you changed how you use energy in response to these differences in your home? (n=69)

Count Percent
No 39 57%
Yes 28 41%
Don't Know 2 3%

DIFF3. What changes have been made? (n=28)

Count Percent
Turned off lights 3 11%
Turned off appliances/electronics 1 4%
Turned down heat 19 68%
Turned off heat 1 4%
Changed AC settings 3 11%
Other (Please Specify) 11 39%

BEDO. When thinking about your energy upgrade(s), what are the most significant benefits or downsides that come to mind? (n=100)

Count Percent
Enter Response 95 95%
Nothing 2 2%
Don't know 3 3%

SAT. Overall, how satisfied are you with the upgrade(s) that you have completed? Please use a scale of 1 to 5, with 1 meaning that you are Very dissatisfied and 5 meaning that you are Very satisfied. (n=100)

Count Percent
2 1 1%
3 15 15%
4 32 32%
5 Very satisfied 52 52%

Q7. Are there any upgrade recommendations that you started, but are still working on?

Count Percent
No 120 75%
Yes 38 24%
Don't Know 1 1%

Q7a. Which upgrades have you started, but not completed? (n=38)

Count Percent
Enter Response 37 97%
Don't know 1 3%

Q8. Considering all of the upgrades you have completed, or are in the process of completing… Did you or someone in your household do the upgrade(s), or did you use a contractor? (n=109)

Count Percent
Used a contractor 51 47%
Someone in your household did 40 37%
Some upgrades done yourselves, and some done by a contractor 17 16%
Other (Please Specify) 1 1%

Q8b. How did you or will you pay for these upgrade(s)? (n=109)

Count Percent
On credit 15 14%
Bank loan 6 6%
Cash or savings 91 83%
Some other way (Please specify) 2 2%
Don't know 1 1%
Refused 2 2%

Q8c. What was the approximate total combined cost of all of the upgrades you have completed?

Cost ($)         Count   Percent
.00                  1        1%
3.00                 1        1%
5.00                 2        2%
20.00                1        1%
25.00                1        1%
30.00                1        1%
50.00                5        5%
60.00                1        1%
75.00                1        1%
99.00                1        1%
100.00               6        6%
150.00               5        5%
200.00               3        3%
250.00               1        1%
300.00               3        3%
350.00               1        1%
400.00               1        1%
500.00               4        4%
600.00               1        1%
700.00               1        1%
800.00               3        3%
1000.00              2        2%
1100.00              1        1%
1200.00              3        3%
1400.00              1        1%
1500.00              4        4%
1800.00              1        1%
2000.00              6        6%
2100.00              1        1%
2200.00              2        2%
2500.00              2        2%
3000.00              9        8%
3500.00              1        1%
4000.00              1        1%
5000.00              2        2%
6650.00              1        1%
7000.00              2        2%
7500.00              1        1%
8000.00              1        1%
9000.00              2        2%
10000.00             3        3%
11000.00             1        1%
12000.00             1        1%
13000.00             2        2%
17000.00             1        1%
20000.00             4        4%
22000.00             1        1%
25000.00             1        1%
55000.00             1        1%
90000.00             1        1%
Don't Know           4        4%
Refused              2        2%

Q8d. Why did you decide to do this/these upgrade(s)? (n=109)

                                                                              Count   Percent
Improve home's appearance                                                         5        5%
Increase or preserve home's value                                                10        9%
Improve fresh air flow/access                                                     3        3%
Improve indoor air quality or health issues (mold, radon, etc.)                   2        2%
To be more efficient                                                             27       25%
Save energy and resources                                                        48       44%
Replace older equipment                                                          11       10%
Improve home's comfort                                                           31       28%
Product rebate was available                                                      1        1%
Reduce utility bills                                                             25       23%
Improve safety (fire danger, etc.)                                                4        4%
Easy to do                                                                       19       17%
Because of the audit (PROBE before selecting: Are there any other reasons?)       7        6%
Other (Please Specify)                                                           31       28%

Q9. Are there any recommended upgrades that you have not started yet, but have decided to do in the future?

                         Count   Percent
No                          59       37%
Yes                         95       60%
Don't know                   5        3%

Q9b. Is there any particular reason why you have decided to wait to do these upgrades?

                                            Count   Percent
Need a contractor/someone else to do            7        7%
Too expensive                                  45       47%
Not sure if they're worth it                    4        4%
No time/household schedule                     26       27%
Disruption                                      2        2%
Waiting for current equipment to break          2        2%
Other (Please Specify)                         29       31%
Don't know                                      3        3%
Refused                                         3        3%

Q9c. Are these upgrades something that... (n=95)

                                                                                      Count   Percent
You would hire a contractor to do                                                        50       53%
Someone in your household would do                                                       16       17%
Some of them you would do yourselves, and others you would hire a contractor to do      28       29%
Don't know                                                                                1        1%

Q9d. Approximately, how much do you think this/these upgrade(s) will cost?

Cost ($)         Count   Percent
40.00                1        1%
50.00                2        2%
100.00               2        2%
150.00               1        1%
200.00               2        2%
350.00               1        1%
400.00               1        1%
500.00               4        4%
600.00               1        1%
800.00               3        3%
950.00               1        1%
1000.00              4        4%
1200.00              1        1%
1500.00              3        3%
2000.00             11       12%
2500.00              2        2%
3000.00              8        8%
3500.00              1        1%
4000.00              3        3%
5000.00              7        7%
5200.00              1        1%
5555.00              1        1%
6000.00              2        2%
6500.00              1        1%
6600.00              1        1%
7000.00              3        3%
10000.00             6        6%
11000.00             1        1%
12000.00             1        1%
15000.00             5        5%
20000.00             1        1%
45000.00             1        1%
Don't know          12       13%

Q9e. Why have you decided that you will do this/these upgrade(s)?

                                                                              Count   Percent
Improve home's appearance                                                         3        3%
Increase or preserve home's value                                                 5        5%
Improve fresh air flow/access                                                     3        3%
Improve indoor air quality or health issues (mold, radon, etc.)                   1        1%
To be more efficient                                                             34       36%
Save energy and resources                                                        55       58%
Replace older equipment                                                           8        8%
Improve home's comfort                                                           32       34%
Product rebate was available                                                      1        1%
Reduce utility bills                                                             25       26%
Improve safety (fire danger, etc.)                                                1        1%
Easy to do                                                                        3        3%
Because of the audit (PROBE before selecting: Are there any other reasons?)       4        4%
Other (Please Specify)                                                           20       21%

Q11. Other than the specific upgrades recommended from the energy audit, have you made any other home improvements since completing the energy audit? This might include replacing or adding appliances, as well as remodeling projects.

                         Count   Percent
No                          94       59%
Yes                         65       41%

Q11b. Were these improvements something you were thinking about doing before you had the energy audit? (n=65)

                         Count   Percent
No                          16       25%
Yes                         48       74%
Don't Know                   1        2%

Q11c. Did you do these improvements with the energy upgrades, or separately? (n=50)


                                                        Count   Percent
All done with upgrades                                      8       16%
Some done with upgrades and some done separately            6       12%
All done separately                                        36       72%

Q11d. Did having your home energy audit influence your decision to do these improvements? (n=65)

                         Count   Percent
No                          40       62%
Yes                         25       38%

Q12. Has your household experienced any major changes since your energy audit was completed? Changes could include things like the addition of a new household member, changes in work schedules, or someone staying home more often than usual.

                         Count   Percent
No                         130       82%
Yes                         29       18%

Q13. Since going through the audit, have you or other members of your household changed how you use energy - for example, by changing your use of lights, heat, or cooking?

                         Count   Percent
No                         101       64%
Yes                         58       36%

Q13a. What changes have been made?

                                        Count   Percent
Turned off lights more often               27       47%
Turned off appliances/electronics          11       19%
Turned down heat                           15       26%
Turned off heat                             3        5%
Used space heaters                          2        3%
Changed AC settings                         1        2%
Used different heating equipment            2        3%
Unplugged items                             4        7%
Other (Please specify)                     30       52%


Q13b. Did going through the energy audit cause your household to make these changes? (n=58)

                         Count   Percent
No                          19       33%
Yes                         39       67%

Q13b1. What about the home energy audit experience caused these changes?

                                                                                    Count   Percent
Actions were suggested by the auditor                                                   9       23%
Learning about the problems with/performance of my house                              15       38%
Seeing the 'Low cost or no cost strategies' listed in the energy analysis report        1        3%
Other (Please Specify)                                                                 18       46%

Q14. Is there anything about the home energy audit or the process of doing energy upgrades that you would like to see improved?

                         Count   Percent
Enter Response              81       51%
Nothing                     77       48%
Refused                      1        1%

SKILL. Of the people who regularly live in your home, do any…

                                        Count   Percent
Have home remodeling skills                37       37%
Fix objects and appliances                 46       46%
Have time for home maintenance             67       66%
None of the above                          26       26%

RES. How many years have you lived in this house?

Years            Count   Percent
1.00                 1        1%
2.00                 7        7%
3.00                 5        5%
4.00                 5        5%
5.00                 4        4%
6.00                10       10%
7.00                 2        2%
8.00                 5        5%
9.00                 5        5%
11.00                5        5%
12.00                9        9%
13.00                4        4%
14.00                1        1%
15.00                3        3%
16.00                1        1%
17.00                2        2%
18.00                1        1%
19.00                1        1%
20.00                1        1%
21.00                2        2%
22.00                3        3%
23.00                1        1%
24.00                1        1%
25.00                3        3%
26.00                1        1%
27.00                1        1%
29.00                1        1%
30.00                2        2%
31.00                1        1%
32.00                1        1%
33.00                3        3%
34.00                1        1%
35.00                3        3%
36.00                1        1%
40.00                1        1%
47.00                1        1%
49.00                1        1%
65.00                1        1%

STAY. Based on current plans, approximately how many years do you plan to stay in this house?

Years                       Count   Percent
1.00                            2        2%
2.00                            2        2%
3.00                            2        2%
4.00                            2        2%
5.00                            9        9%
6.00                            2        2%
7.00                            1        1%
8.00                            3        3%
10.00                          13       13%
12.00                           1        1%
15.00                           7        7%
20.00                          15       15%
25.00                           3        3%
30.00                           5        5%
40.00                           1        1%
50.00                           2        2%
100.00                          1        1%
For the rest of my life        25       25%
Don't know                      5        5%

HHMS. Including yourself, how many people currently live in your household? (n=101)

                            Count   Percent
One (R lives alone)            18       18%
2                              39       39%
3                              22       22%
4                              16       16%
5                               5        5%
6                               1        1%

Q23a. Including yourself, please tell me how many people in your household are in the following age categories: (n=83)

                        0                 1                 2                 3                 4
                    Count  Percent    Count  Percent    Count  Percent    Count  Percent    Count  Percent
4 yrs or younger       65      78%       12      14%        6       7%        0       0%        0       0%
5-17                   58      70%       16      19%        7       8%        2       2%        0       0%
18-24                  77      93%        5       6%        1       1%        0       0%        0       0%
25-34                  71      86%        7       8%        4       5%        0       0%        1       1%
35-44                  55      66%        8      10%       20      24%        0       0%        0       0%
45-54                  57      69%       12      14%       14      17%        0       0%        0       0%
55-64                  53      64%       14      17%       16      19%        0       0%        0       0%
65-74                  72      87%        8      10%        3       4%        0       0%        0       0%
75 & older             82      99%        0       0%        1       1%        0       0%        0       0%

R_AGE. Which of the following age groups are you in? (n=101)


                            Count   Percent
25 to 34                        5        5%
35 to 44                       27       27%
45 to 54                       19       19%
55 to 64                       31       31%
65 to 74                       16       16%
75 years and older              3        3%

INCOM. Please stop me when I reach the category that best describes your yearly total household income before taxes in 2010. - IF NEEDED: Your best estimate is fine.

                                        Count   Percent
$15,000 to less than $25,000                2        2%
$25,000 to less than $35,000                1        1%
$35,000 to less than $50,000                7        7%
$50,000 to less than $75,000               13       13%
$75,000 to less than $100,000              21       21%
$100,000 to less than $150,000             26       26%
$150,000 or more                           17       17%
Don't Know                                  1        1%
Refused                                    13       13%

RACE. Which of the following groups best identifies you? (n=101)

                                        Count   Percent
White or Caucasian                         89       88%
Asian or Asian-American                     6        6%
Spanish, Hispanic, or Latino                1        1%
Multi-Racial Refuse to Specify              1        1%
Refused                                     4        4%

Open-Ended Responses

This section contains responses to open-ended survey questions. Where multiple respondents made the same comment, the number of respondents is indicated in parentheses following the comment.

Q1. Since the energy audit, have you discussed your energy score or upgrade recommendations with any of the following people... Anyone else (please specify):

Response
Classmates at school
Clients from real estate
Gave a talk about it in Haines Alaska
Other energy auditors, utilities and teachers of energy auditing
Realtor (2)


Q1b. What did you learn?

Response A contractor told me the technical aspects of the situation. From the contractor I learned about outside walls and R-values and the best solutions. I got more ideas on work that needed to happen. I got references to some good contractors and learned that there is a wide range of costs. I learned about my ceiling, specifically, that I need insulation. When I do it, the guy who did the audit said he would come and watch the guys do it and make sure I am not ripped off. I learned about the challenges of insulating walls. Not only that, I also worked with a friend insulating the walls at an apartment complex he owns. So I understand the process now. I learned about using foam sealer, the process of installing insulation, and that I would need to approach it from the inside rather than the outside. I also learned a bit about the kind of insulation and how they block the joist pockets with a bag. I learned different things I could do on my own. Simple things I could do as a home project as opposed to the big things done by a professional. I learned examples of how I could do the improvements. I learned more about the construction of doors. I learned more details about what could be done and the costs. I learned more in-depth options on the upgrade possibilities. I learned that my home actually came in pretty good. I learned that the cost of things might be a little bit different than what they projected. I learned that there was some dry rot in some of the sidings that was originally the fault of the builder. I learned that there were some points of leakage of hot air out of the house. I learned the relative importance of some of the suggestions and why insulating the attic may be more important than insulating the walls. I thought that the house was fairly tight, but at is turns out there are many areas that need additional insulation. Mostly about the attic zone, the effect of whether to install a fan for example and the necessity of having spaces sealed up. My house is not very energy efficient. I also learned about how air travels through the house and how this affects my energy rating. Our house is losing a huge amount of energy through the basement. Some friends said that there are good deals for insulation being offered from Puget Sound Energy. The auditor did not conduct a draft analysis so I spoke with someone who does those audits. He told me to check with the auditor and the auditor told me that the temperature differences in the house did not call for the test at that time. The contractor gave me more ideas than the auditor. For example, air sealing in the garage and storm windows. The contractors told me that they were unsure if I needed to add drywall to the inside wall that I am insulating. The costs and payback time for PB systems. The energy audit identified problems but did not tell me how to solve them. I learned about the source of air leakage and how to seal the house. I am learning that people assume that you are going to spend a lot of money fixing these problems because they want to charge a lot of money to fix them. I found out that the Seattle City Light’s response to this problem does not really have good support for providing good assistance or recommendations for contractors or information to implement these solutions. There can be moisture problems in older homes because they do not have moisture barriers, so the blown-in insulation can get wet and build mold. So I did not want to chance that. There is a variety of valid approaches to the problem.


Response There were some unusual air infiltration areas. Also, we had completed some things on our own before the audit and for the most part, they seemed to be pretty effective. For example, insulation in the basement area. They all argued with each other. The auditor suggested some things and the contractors said something different. It was frustrating and cast doubt on whether any of them knew what they were talking about. We had a contractor look at the insulation in the attic and they measured the amounts differently. We had an insulation contractor come out and he had different opinions on some of the things that should be done than the auditor. We knew that older homes needed attention for energy retrofit and that the less expensive changes can have some benefit, but the zero energy solution is not practical. We learned about different heater options. We learned more possibilities for dealing with moisture issues. For instance, we installed a fan in the basement to suck up moisture. We learned that our neighbors and friends were not necessarily concerned with the energy score. We learned that warm air was escaping from the chimney and from the unsealed area in the basement. We spoke to a contractor and got a quote for all of the recommended repairs that came out of the energy audit. What I learned from my school was that the tankless water heater which was recommended in the audit is a good step to take because I have natural gas for water heating. I learned how to distinguish between different types of water heating and how they apply to different houses. I also learned that the air exchanger will allow for controlled air-flow in our house because it is right on the border of being too tight. We will keep on tightening up the house and come up with a mechanical type of air-exchange instead of opening-up windows or expecting cracks to breathe for us.

Q2A. What other research have you done? Other (please specify):

Response CM50 score Equipment research and cost research I am just trying to figure out which ones to do first. I called contractors to get estimates. I conducted a rate of return analysis and looked at current level of usage. I contacted a number of companies that could do the sealing and insulating. I contacted companies to learn more about the prices. I did comparisons with other utility bills, plug-load comparisons, and priced out different insulation options. I got bids from contractors. I had a contractor come over and talk to me about more and better attic insulation. I had contractors do some bids for insulation, sealing, and replacing a heating system. I have been in contact with contractors. I have looked into financing options. I have went out to home improvement stores. I investigated heat pumps. I looked at windows and went to the home show. I looked into some of the other insulation options and went to the store to price it out. I also followed up with the furnace folks on potential carbon monoxide issues. I looked into where leaks are in homes, the cost-effectiveness of adding insulation to brick houses with wallpaper inside, what the payback for that is, and compared that with furnace upgrades.


Response I looked up general pricing. I looked up information on different heating systems. I looked up insulating under the house, if I needed a new roof, if they add insulation to some of the panels, and how to insulate the walls. I looked up the kinds of insulation that can be applied and which were the greenest and the most effective. I researched pricing issues and what it takes to get the materials and do it through conversations with contractors. I researched through implementing. I shopped online for specific items that were suggested. I spoke with contractors on making some of the changes. I spoke with other contractors about the prices of the recommendations. I talked to a contractor at a home show about dual water heater problems. I talked to quite a few people active in the LEED program. I talked to some expert friends. I talked with contractors to receive bid information. I talked with people. I went to Home Depot, looked at products, and talked to my father-in-law who does a lot of carpentry stuff. I went to the gas company website to check annual usage and looked up the insulation value of concrete. Insulation, climates, and weather Phone calls Talked with different contractors The Home Handyman Magazine The HomeWise website through Seattle City Light Through articles in magazines and different things We did research on improvements and options. We had an insulation specialist visit the house. We have talked to contractors. We learned about the life of our furnace. We looked into how to improve the energy efficiency. We looked for alternative solutions to the upgrades the auditor said. We need to put in new insulation, so we learned how to do that.

Q2c. What did you learn?

Response I did an economic difference. I looked at the cost of performing the energy upgrades against the cost savings of having an efficient house and it just did not pencil out. The cost of performing the upgrades would not save me enough money to be worthwhile. I leaned that the opinions about the heating recommendations the auditor said were very different from what everybody else said. I learned about air sealing, duct sealing, and how to put in new thresholds in my doors. I also learned that some contractors are pretty flaky and that I had some work done that was done incorrectly. I learned about different methods for insulating the crawlspace. I learned about how effective other appliances may be in comparison and that it may not be cost effective to get a new water heater. If we were in our twenties and had a lifetime to recoup the differences, then the water heater would be worth the initial cost.


Response I learned about hybrid electric water heaters. I learned about specific options for types of insulation, different options for replacing our windows, and different levels of efficiency I learned about the different types of light bulbs and other items. I learned about the issue of having to remove all insulation as opposed to supplementing it. I learned about the process of blowing in insulation in the attic and how that works. I learned about types of materials for insulation and some product specifics. I learned I could replace one of the water tanks with a much smaller one. I learned more about the ductless heat pump. I learned more details about pricing and options. I learned more details than the surface review found on the energy audit. I learned more details, but nothing different in general. I learned more in depth information. I learned more specific information about brands and equipment. I learned more specifics about how heat pumps operate. I learned of the different types of insulation and options available. I learned some costs. I looked at the ductless heating systems to try to get some information for how they would fit into the house and how they would perform. I learned that I am not saving as much with my new furnace as the auditor claimed I would. Additionally, very few people had good ideas on how to insulate a 1940s house with brick walls and wallpaper on the inside that doesn't cost $10,000, because you can't drill holes through either the wall or the wallpaper. I learned that our older Energy Star refrigerator is outdated. I also learned that if I were to buy a new one the next level of Energy Star would be significantly better, but I cannot tell if it would make sense to upgrade at this time. I learned that perhaps I needed to have more venting in the attic that was not part of the audit. I also learned that things are really expensive. I learned there are different ways to accomplish the same things as what the energy audit recommended. I translated the leakiness of my house into the square footage of the hole responsible for that leakage. It showed us where to focus. The energy audit did not mention anything about tankless water heaters, so I wanted to know what I could do with one. The audit also did not cover renewable power or electric vehicle outlets so I researched those as well. Also, I checked if I could get smart meters in Seattle. There are a couple new products, foil-backed stuff, that you can put in that eases the cooling in the summer. I also looked into the types of attic ventilation you can get, even though that was not recommended. We are learning that some of the upgrades are really expensive and difficult to implement and that others are not. We decided that our primary energy loss was through outside air infiltration, mostly in the basement. We learned more complete information than what they have given to us.

Q5a. What part of the process was helpful?

Response All of the information was interesting and convinced us to do what we could. Comparing effectiveness of the basic upgrade options. Determining that we did not have any insulation in certain walls in the house. Discussing the things that could be fixed with the least amount of costs. Discussing with the auditor what the upgrades would be and what his recommendations were.


Response Finding out how little insulation we have. Going over it verbally with the auditor. Having the person here to talk to and ask questions of. In my house, having the process of the audit completed created awareness. It defined the potential energy upgrades to be done and identified the best returns on investments. It was all helpful. Knowing upgrades that we could make that would really increase energy efficiency and save us money. Pointing out various options and prioritizing them in terms of effectiveness and in terms of energy savings and cost. Prioritizing the best investments to make. Putting the fan on the front door to create negative pressure that revealed the drafts or energy leaks. Seeing the numbers on the amount of energy we are losing. Testing and being able to see the results from the testing. The air infiltration testing The auditor gave a variety of options that we could do. Some were easier to do than others. The best part that came up was the draftiness in the house and what to do about it. Also, the air leaks around the dishwasher, electrical outlets, and doors. The blower door test was the most informative. The blower test (4) The blower test showed a lot of places where we are experiencing leaks through the floor, windows, and skylights. The cost-benefit and the detailed description of all of the changes and improvement. Also, the estimated cost per foot for each changed item. The discussion after the audit with the auditor, the infrared photos that he took, and the graphs that showed potential energy savings. The end result and the amount of money that was needed to complete the work. Also, the benefits related to the costs. The final report and the blow test The imaging convinced my wife that we had to do some insulation work. The infrared pictures The infrared test on our walls that showed us how much more heat was leaking through than the non-insulated part of the walls. The part about little hidden air leaks was interesting. The part where they test the leakage of the home The report The report told us what our score was and with it were recommendations of what to do. The reporting and showing numbers quantified values for what is going on in our house helped both my wife and I. The results of the blower door test and the thermographic photos were helpful. The testing for air infiltrations. The time it took for air to exchange was probably the most useful piece of information. The time spent with the auditor and the overall discussion of the various aspects of home energy use. The whole process The whole results were very helpful. The whole thing was pretty persuasive.


Response The window test The written report with specific suggestions was really helpful. Walking through the house with the auditor We knew exactly what we needed to do. When the energy auditor pointed out drafting issues by creating a more effective ceiling. When they mentioned that the crawlspace should be insulated. We did not know that could cause the energy heat leaking. When they showed us where the drafts are from.

Q6p1a. What is the first other recommendation you have completed?

Response
Caulking to stop air leaks
Changed a few other light bulbs
Changed the storm door
Insulated the attic
Insulated the outside walls
Insulation was inserted into the floor joints for the basement
Placed winter covering around doors
Sealed and insulated two furnace ducts

Q6p2a. What is the second other recommendation you have completed?

Response
Another solar tube
Removed dog door that had a lot of air leakage
Weatherized doors

Q6r1a. Next, I'm going to ask you about the recommendations you have completed and how long ago they were done. What is the first recommendation you have completed?

Response Added insulation in the wall stud space Added insulation to the crawlspace Added insulation to the house Added weatherstripping around the front door Additional insulation Additional insulation in the walls and the attic Air sealing Air sealing and installed a ventilation time clock in the basement Air sealing in the crawlspace and attic Caulked around fireplace


Response Closed some of the venting holes in the kitchen Control the heat pump Eliminated air infiltration in the basement Finished the downstairs basement Fixed air leaks, added insulation between the framing of the windows, between the eves and the house, around the molding in the attic, and caulked the recesses around the lights Had the bathroom exhaust fan vented into the attic without actually being vented to the outside Inserted an inflatable device in the fireplace to retain heat Installed a more rubberized roof over the entire house Installed ductless air pumps Installed external on-demand hot water heater Installed insulation in the basement and exposed places Installed insulators in the electrical outlets Installed new furnace Installed new heat pump Insulate the crawlspace Insulated an exposed return air duct Insulated and weatherized two doors into the attic, filled in insulation that was missing in the attic, and put foam around places where pipes were coming into the attic Insulated around wiring in attic Insulated exterior walls Insulated garage Insulated some of the walls Insulated the attic (3) Insulated the basement Insulated the crawlspace Insulation Insulation between the ceiling and attached garage Insulation in first floor walls and basement Insulation in the attic Insulation in the walls, crawlspace, and attic Insulation of exterior walls and attic Insulation was put in the knee wall and part of the attic Lock on the heat pump (2) New furnace New water heater New windows on the south and west side Patched some leaks in the basement with wood and siding Purchased a new energy efficient refrigerator Purchased a new gas furnace Put some high-efficiency light bulbs in Repaired dysfunctional bathroom vent Repaired large 48 by 52, 100-year-old wooden window


Response Replaced furnace with a 95% high efficiency gas furnace Replaced gas furnace Replaced incandescent bulbs in the kitchen with LED bulbs Replaced most of the light-bulbs with CFLs Replaced the bay window Replaced the ceiling fan Replacing incandescent light bulbs Seal on the furnace door Sealed gaps around the fireplace Sealed gaps in the envelope on the second floor Sealed off the fireplace Siding with insulation Some enhancement of insulation The pony-wall or basement joist was reinsulated Turned down hot water temperature Upgraded the insulation Weatherstrip the access hatch to the attic Weatherstripping Weatherstripping around some doors Worked on the windows

Q6r2a. What is the second recommendation you have completed?

Response Added insulation in the hole to the sink Added insulation into the walls and the crawlspace Added weatherstripping Added weatherstripping around the front and back doors Added weatherstripping to attic door Additional caulking primarily around the doors Attic hatch air infiltration Blocked air leakage in laundry room Caulked around windows, floor boards, and around the doors Caulked windows Closed off part of the crawlspace Increased the number of CFLs Installed a balloon for the fireplace Installed CFLs Installed CFLs, insulated electrical wall outlets, checked water heater temperature, and wrapped it with extra one- sided thermal wrap Installed energy efficient lights


Response Installed high-efficiency furnace Installed independent air supply to the furnace room Installed solar attic fan Insulated around the front door Insulated the ceiling in the laundry room Insulated the crawlspace, caulked air leaks, and caulking the registers Insulated the hot water heater Insulation and new plastic on the ground in the crawlspace Insulation in the basement Insulation in the ceiling of the basement New furnace New heat system Purchased a new washer and dryer Redid the duct work for the dryer Replaced bathroom fan Replaced incandescent lights with CFLs Replaced light bulbs Replaced the electrical panel Replaced windows (2) Sealed areas where electrical wires go through the floor Sealed holes in the siding that leaked air into the bedroom Sealed off the crawlspace Sealed the wall joists in the attic Storm door Switched from oil heating to gas furnace Updated furnace Upgraded the furnace Upgraded water heater We had a professional insulator come, but then later decided to stop after finding some prohibited things in the walls. There is netting with rock wall inside the walls and it prevents someone from poking a blower in for the insulation.

Q6r3a. What is the third recommendation you have completed?

Response Added insulation in roof Added insulation in the attic and to some areas in the crawlspace Added insulation to the attic Added more attic insulation and a new roof with a different venting system Changed the dryer vent Heat ducts were re-wrapped and sealed Installed a new water heater Installed door sweeps


Installed gaskets in outlets Installed humidistat in two bathrooms Installed tankless water heater Insulated light switches and outlets Insulated supply ducts in mechanical room Insulated the water heater with a wrap Insulated two exterior doors New roof Replaced a light bulb with a LED light bulb Replaced light bulbs with LED dimmable light bulbs Replaced some windows Replaced thermostats for baseboard heaters Resealed basement door Sealed gaps in upstairs doors to un-insulated closet space Sealed the can lighting in the kitchen Weatherstripping Wrapped a hot pipe with an insulation wrapper

Q6r4a. What is the fourth recommendation you have completed?

Response
Added insulation in the floor
Duct sealing
Fixed the vent for the stove and bathroom
Purchased a new clothes washer
Replaced new windows
Sealed the window

Q6r5a. What is the fifth recommendation you have completed?

Response
Purchased energy efficient hot water tank
Resealed all the ductwork in the garage

Diff1. What differences have you noticed?

Response A little more consistent temperature A lot less dust and less drafty Aesthetically it is nicer to look at. Cleaner air and more even temperature Comfort Comfort and more even temperature throughout the house Comfort, air flow, and temperature


Response Comfort, lack of drafts, warmer, there are no cold pockets, and more convenient lighting During the winter the house is a lot more comfortable. Hot water heater is no longer leaking. House had a funny smell and now there is no more smell. I do not have to run the dryer as long since the washer works better. Also, my bathroom is a little warmer. In the winter I do not have a lot of air drafts. It feels warmer and the heat does not come on as much. The exterior walls feel warmer. It has actually felt cooler. It is more comfortable and warmer. It is not as hot as it used to be. It is warmer in the basement now. It might be a little bit more comfortable. It takes a long time for the light bulbs to light-up. It takes longer to get hot water. Less drafty (3) Less drafty and furnace not working as hard Less uncomfortable, less air leakage Might be less drafty Modest draft differences More comfortable (3) More comfortable and less drafts More comfortable in the kitchen More comfortable temperature-wise and less drafty Much warmer, uniform warmth, and less drafty My home is warmer. One of the rooms is actually warmer because of the crawlspace insulation. One of the rooms retains its temperature much better. Our home in general is more comfortable. It is an older home and had a lot of dampness issues that seem to have improved. Our house is less drafty and a little quieter Our second floor seems less hot. Substantial comfort levels in the upstairs area and reduced the drafts downstairs The air quality is much better and in general warmer. The bathroom vents vent much better. The comfort level is better. The comfort level is much better and even. There are not extreme hot and cold places. Before the audit, I was also having problems with rodents. These places have now been sealed and there is no longer a problem. The furnace does not turn on as often and the home is more comfortable. The furnace is more pleasant, and there is less noise from the street. The heat is much more efficient. The house heats better with the window replacements. The air is better in the house because we have a better filter. The heat is steadier. The house holds heat better and is warmer.


Response The house is a lot warmer. The house is more comfortable and we do not need to turn the heat on, like in the past. The house is much warmer and colder. The house is not as drafty as it was before. The house is not as drafty. The house is warmer in the winter. The house remains warm during the in-between seasons. The house retains heat better. The kitchen is warmer and there are fewer drafts by the doors. The new light bulbs are inadequate in the canned lighting. The temperature is much more even, it is warmer and more comfortable all around and there are not any drafts. The walls around the attic hatches are less cold. The windows open unlike the old ones, which is better in terms of glare and fading of the carpets. The house is also less drafty. Using a lot less fossil fuels Warmer (3) We are more aware and alert to our use of electricity and heating.

Diff3. What changes have been made? Other (please specify):

Response
Changed lights to energy saving lights
Conserving water and electricity
I leave the lights on in the basement a little, but more because they have all been changed to CFLs.
It is easier to program furnace usage throughout the week.
More appliances and lights are on timers
The thermostat is automated, so the heating is a bit stricter.
Use wood stove
We are more aware about recycling and using energy in the house.
We are now on the look-out for heat leaks in the house.
We are saving water in a 50 gallon tank outside of the house.
We switched to CFL in most light fixtures.

BEDO. When thinking about your energy upgrade(s), what are the most significant benefits or downsides that come to mind?

Response A drop in heat consumption A feeling that we have accomplished something on the list A significant benefit is that the house is less drafty and the basement is warmer. Benefits are the cost savings and utilizing less energy. Better information about the house Comfort and cost (2)


Response Comfort is the benefit and the expenses are the downside. Comfort, cost savings, and upfront cost Cost Cost and disruption Cost is a downside and benefits are warmth, house value, and saving energy in general. Cost reduction and the environment. Initial investment costs Cost savings and long-term value of the home comfort levels Cost savings of the long run Cost savings, environmental footprint, and long return on investment Cost savings, more even temperature, lukewarm or cold feeling air coming from vents instead of warm air. Cost, energy efficiency, and comfort Cost, energy savings, and greenhouse gas emission savings Cost, keeps the house warmer, and more environmentally friendly Eliminating draft Energy savings Environmental impact and long-term cost savings Eventually everything will pay for itself. The best benefit is that the cost savings have been tremendous, even though it was initially expensive. Expense (2) For the downsides, the expenses were very high and that I am still producing a carbon footprint, as I went from an oil furnace to a natural gas furnace. The benefit is the comfort that has come along with these changes. Having a warmer house, saving energy, and saving costs Having to hire somebody to do them, needing to have someone walk us through the prioritization of recommendations, and having the information so that we know what next steps to take. Heat retention Hopefully we're losing less energy and our energy bills will be less than they would be otherwise. I had to repaint parts of the outside of the house. I have not had to put any more oil in my oil furnace since last fall. All my bills are down. I would say that there is more efficient heating in the house. The downside is that the water takes longer to warm. Improving home comfort, less energy, less cost, and less carbon Increased comfort and some cost savings Increased comfort, energy efficiency, peace of mind. The cost and time involved. Initial startup costs and saving the planet It is warmer and less drafty, resulting in lower energy bills. The Downside is the cost uncertainty and losing storage space. It took me a long time to find an LED bulb that was reasonably affordable, dimmable, and had only one light. There were very few options. It will be a selling point that we did energy upgrades. Our heater does not have to work as hard. Keep the house warmer Knowing that I had done the right thing after the audit my work was verified. Learning how poor quality our home is in today's market. It has very poor windows and insulation with very costly repairs. Less fossil fuel use saved us money. Lower energy and higher comfort; I cannot think of a downside other than you have to pay for it.


Response Lower heating bill, less drafts, better temperature control, and lower lighting bill Lowering the price of our bills Many of them are expensive and time consuming if you want to do them yourselves. The benefits are that they keep your house warmer. No downsides No downsides, I appreciate doing my part to reduce energy waste and appreciate that the house is more comfortable. Over time, I anticipate a significant reduction in my energy bills. Quality of space in terms of being less drafty and maybe less dusty Reduced energy consumption Reduced energy use, financial benefits Reducing utility bills and teaching our kids how important it is to conserve energy. Saving a lot of money Saving energy and comfort Saving energy and fewer drafts Saving energy, money, and feeling more comfortable. Cost Saving energy, upfront cost Saving money is the best benefit and increased home comfort. Also, having the sense of rodent control because things are sealed up a bit better. Saving money on heat Saving money, increased comfort in the house, a generalized sense of good for the Earth. Since my drafts are gone, I expect to reduce my heating in the future. Slight reduction of electrical use, and comfort The benefit is that the house is a little bit more comfortable. With the furnace, it has made the heat flow more evenly. Honestly, I was quite disappointed with the auditor. He did not think that these improvements would help change the costs of bills but would make the house feel more comfortable. I realized that afterwards, that the largest leakage problems are where the mail comes from, under the door, and in the basement windows. However, that was never mentioned in the report. Later, the auditor did not respond to my concerns. The benefit is that we live a more efficient life, and the downside is that it is tougher to close the doors. The benefit is that we now enjoy the house more. As for the downside, it is not real cut and dry. For example, if you do this one thing, than it will fix this other thing. There are particularities of each house and need to constantly track these things that are not quite perfect. The benefit is the reduced energy costs and increased warmth. The downside is that they had to drill on the outside of the house and now we must paint over the holes to cover it up. The benefits are cost savings, CO2 reductions, and comfort. In my opinion, there are no downsides. The benefits are energy and cost savings. The downsides include the cost barrier. The benefits are reducing our energy footprint because it affects the world, and more comfortable living. The benefits are the comfort level and the efficiency of saving energy, both in terms of cost and for the environment. The downsides were the costs that were accompanied. We were also very unhappy with the contractor that we worked with. The crew did not have a supervisor. They broke our trellis and doorbell, and always left the house in a mess. The biggest benefit is that we are doing our part to help the planet. The cost for doing something major like high efficiency washer and dryer, but the upside being that I feel like a greener and more responsible citizen. The cost savings, but downsides were cost to complete. The rebates were not really significant. The downside was that they had to go through my entire wall and that required much more cleanup. It does not


Response take as much to heat the house as it used to. The expenses and getting them completed. The fluorescent lights in some applications do not perform adequately. The house is less drafty. The house is more comfortable. The house is more comfortable. It cost us more money. The insulation was very expensive. I can only hope it is going to help. The long term cost energy efficiency for the benefits. The downsides are the expensive prices. The most significant benefit is reduced infiltration so we are more comfortable in our home. The sense that we are not being wasteful and added value to the home. The time it took to do it. The upside is saving energy and money. The downside is that there is now a lot more insulation in the attic so it is harder to do anything up there. There are no downsides, only caulking for about 5 feet of the wall. There are no downsides. The benefits are the comfort in the home. Using less energy and the cost to get things done. We are using less energy and the house is more comfortable. We are using the furnace less. We found that minor window leaks that have not been repaired are now major sources of infiltration. We are using an electric portable heater fan in the bathroom now. We have less reliance on energy.

Q7a. Which upgrades have you started, but not completed?

Response Air infiltration leakages, replacing attic doors, fixing the thresholds, and adding insulation in various places Blowing in insulation in the walls Blowing in insulation into the walls Changing all the light bulbs Hot water tank insulation and insulation in the attic I got a quote for the replacement of the furnace. I put window film on some of the windows, but not all of them. Improving insulation in the attic Improving the ventilation and try to come up with a way to further insulate the recessed lights Insulating Insulating the attic Insulating the attic and reinforcing the ducts in the furnace to be more efficient Insulation for sealing the rim joist in the basement and sealing the wall between the crawlspace and basement Insulation in the attic Insulation of the floors in the crawlspace and the attic and sealing the holes where pipes come through Isolate the airflow in the basement, appliance replacement, and pipe insulation


Response More insulation in the attic More insulation needs to be added in the knee wall. New heating system Putting insulation in our exterior walls Replacing our oil heater with a gas heater and insulating our doors Replacing the furnace for the heat pump Sealing gaps in the basement Sealing leaks in the ceiling, basement, garage, and attic Sealing the ducts Still working on insulation and air sealing. We are going to be getting solar thermal voltaics and solar hot water. The caulking around the windows. The vast amount of heat goes out the unfinished walls in the basement. We need to replace the plumbing and the wiring and then replace the walls. There is insulation in the crawl space that I still need to do. Ventilation of the attic Wall and ceiling insulating Wall insulation We are working on filling up the air leaks throughout the house. We bought materials for insulating the basement. We converted most of our light bulbs, but not all. We started insulation in the basement and fixed a hole in the wall. We need to restore two-thirds of the old windows in the house. We will do half of the total recommendations. The other half we will phase in throughout the next two years. Such as switching the venting from collapsible aluminum vents to hard vents, seal the floor and sealing the air intake.

Q8. Considering all of the upgrades you have completed, or are in the process of completing… Did you or someone in your household do the upgrade(s), or did you use a contractor? Other (please specify):

Response I have done a lot of planning with contractors and research on my own.

Q8a1. Which upgrades did you do on your own?

Response
Caulking windows
Extra insulation, checking the hot water pipes, the prep work, and the follow-up work on the insulation.
Insulating
Insulating the attic door
Insulation and sealing gaps
Insulation work in the attic
Insulation, weather-stripping, and insulation on the storm door
Knee wall insulation
Light bulbs, sealed the windows, and insulated the crawlspace
Reinsulated around the door
Repaired the leaks in the siding and basement


Response
Sealed off the crawl space
Some air sealing
The attic hatches and adding insulation to the crawlspace
The caulking and the insulation
The dog door
Weatherstripping and caulking

Q8b. How did you or will you pay for these upgrade(s)?

Response
It was free. There was no money paid by the upgrade because it was a deficiency of a previous job.
Initially we did pay with a bank loan.

Q8d. Why did you decide to do this/these upgrade(s)? Other (please specify):

Response As part of gaining livable space in the basement Believe in it philosophically Cheap to do Desire to partially remodel the house I am getting close to retirement and my house is all paid off. I am going to lease the house out and I needed to make it more comfortable. In addition to insulating the crawl space, I had to get rid of a rodent infestation, so I did them at the same time. In comparison with energy bills of the neighbors, which are sent out by the City of Seattle, the house always had very high energy costs. Although there are less people in the house, the costs of energy remained very high. It is good for the environment. Low cost, high benefit Most convenient and would make better use of the heat pump that we already purchased. Most return for the money Quality of living The auditor pointed out what was wrong and I agreed to it. The energy auditor suggested it. The hot water heater was leaking. They were affordable. They were not very expensive. They were recommended and because I could see that the energy was being lost when I was with the auditor during the energy audit. They were the best to do without investing a lot. They were the only ones that were remotely cost effective. To follow through with the audit. To make the house more quiet. To preserve the environment as much as possible. Two different contractors disagreed with the audit's results and so we did not replace the ducts.


Response Wanted to have the windows capable of being opened We paid for the audit, so we might as well fix what needs to be fixed. We were doing a remodel. We were in the process of doing another project and wanted to make sure everything was air sealed because we were going to be covering up the work. We were renovating the house. We were told our crawl space needed it.

Q9a. Which upgrades have you decided to do?

Response Add a fireplace insert Add ceiling and insulation to attic space Add insulation to a part of my house and have some work done on two exterior doors Add more insulation in the attic Add weatherstripping and insulating a built-in cabinet Adding insulation Adding insulation and trim along windows Adding insulation to both attic floor and basement ceiling and replacing the other windows Air sealing in the attic, adding insulation to the attics, potentially get insulation blown into the walls, and replace our old washing machine with a new Energy Star front loader Attic insulation (2) Attic insulation, heating system, window caulking, and window and door insulation Better insulation Blocking the second floor joist below the pony walls, repairing a hole in the fireplace chimney that needs to be filled, weather-stripping an exterior door, and installing a new exterior door to the basement Blow in additional insulation in the attic space and seal around our ceiling lights Blow insulation into the walls and complete the perimeter insulation of the attic with better membrane Crawlspace sealing, putting in fans in the attic, attic insulation work, and more weatherstripping Duct sealing and attic sealing Fix some siding, air infiltration areas, and replace a door Fixing insulation under the house and put insulation in under the new roof if purchased General sealing and insulation of the beams Get a heat pump water heater Get an on demand water heater, might do the ceiling insulation in the garage Get the fireplace damper fixed, add insulation in the garage ceiling area or below the floor in the kitchen and bedroom area, and replace the front door Improve attic insulation, use foam insulation on basements pipes, and improve taping of ducts on the heater Increasing insulation in attic Insert chimney balloons Insert for the fireplace Insulate better in the attic Insulate some of the corners of the basement Insulate the attic, caulk and weather-strip around windows and doors, and replace roof


Response Insulate the exterior wall Insulate the flat roof, do some wiring upgrades, and replace the bathroom and kitchen Insulate the garage ceiling Insulate the walls Insulate the walls and attic, replace windows, and convert to an energy saving water heater once the current one goes out and anything else that needs to be replaced in the future will be replaced with an energy efficient appliance Insulating attic and side walls Insulating the walls and changing out windows Insulating the walls and finish the basement Insulating the walls, replacing the heater, something with the fireplace, and increasing attic insulation Insulation and furnace Insulation and weatherstripping (2) Insulation blown into the walls Insulation in exterior walls, sealing of doors Insulation in the attic and in the walls Insulation in the attic and replacing an entryway door Insulation in the ceiling Insulation in walls and ceiling, air leakage improvements, upgrade the furnace and refrigerator, and transition from incandescent to CFLs to LEDs Insulation of the attic and between the floorboards Insulation of the crawlspace in the basement Insulation of the walls, more air sealing, reinsulating or fixing the insulation in the attic, insulating the garage ceiling, and air sealing the inside of the recessed lights Insulation to close up drafts in attic and crawlspace Insulation upgrades in the attic and insulation of the crawlspaces Insulation, heat system, hood vents, bathroom vents, insulation under attic walls, and additional moisture barrier Insulation, replace the hot water heater, and convert the wood-burning fireplace to a gas fireplace to use for local heating instead of general heating More insulation More insulation in the attic and insulating all of the interior walls. Also, we might put in some kind of air circulating system that pulls in air from outside. New insulation under the house, sealing up the heating ducts, and insulation in the attic area New windows and adding insulation in certain areas Put more insulation in the ceiling, replace kitchen door with a better-sealed one, replace the ceiling in the basement, and replace the refrigerator Reinforce insulation in the crawlspace and plug air leaks the kitchen area Replace appliances with Energy Star appliances Replace insulation in the attic, put vents in the roof, and some rewiring Replace kitchen door, add storm windows on main floor, and replace the refrigerator Replace some windows, roof insulation, and replace some appliances Replace the furnace (3) Replace the furnace, put more insulation in the attic, replace certain windows in the basement, and put more insulation in the basement


Response Replace the hot water heater Replace the refrigerator and plug-off the attic opening that will be involved with another remodel that we will do all together. Replace the windows (2) Replacing furnace and appliances Replacing the water heater and the furnace Replacing windows on the other side of the house Seal or reduce the air lost on the second story Seal some leaks around outlets and various places that they found Sealing attic spaces Sealing gaps in the basement and insulation in the walls of the house Solar hot water heater and solar panels Solar thermals Some further caulking or tightening up around doors Take out some molding and insulate behind it They recommended new appliances, but I will replace current appliances with energy efficient ones when needed. They wanted us to check to see if the roof was insulated in the future and change that if needed. They also suggested us to completely take out our window and replace it with something more convertible. Upgrade additional windows, insulate hot water pipes, add weather-stripping, fill-in cracks, and new heating system Upgrade light fixtures We added insulation to our attic prior to the energy audit, and the auditor said that it would make the house more efficient to add a few more inches. We installed a tankless water heater, a timer for the fan in the bathroom, and a mechanical air circulator. One of the recommendations was solar. Since we have the electric car, we are also very serious about solar to create clean energy for the car that we will be using. We may invest in the Seattle Community Solar project or install on our own rooftop. Weatherize things that I can do myself and which have a good return on investment. This includes caulking and sealing. Weatherstripping Weatherstripping, upgrading to ductless heating system, and install additional insulation

Q9b. Is there any particular reason why you have decided to wait to do these upgrades? Other (please specify):

Response Adding insulation to the pull down stair would interfere with useable space which is needed at the moment. Budget and priorities Deciding on the right method and option Difficult to get to Had trouble getting a bid from the contractor. I chose not to do it in the winter because the whole garage has to be cleared out and I was dealing with a water leak that I had to fix first. Also, there was an illness in the family. I could not find what I needed at the store because I want to find a way so that it will look good. I did not receive a callback from the auditor and the improvements were forgotten during the wait for the callback. Inconvenient It does not matter right now. It matters in the fall, so I have got several months to deal with it. It is a little too confusing, there are a lot of options to explore, and it is hard to make a decision. Also, I am wondering if the technology is advancing too quickly. I do not want to replace something only to find out there was something more efficient I could have used. My mother died. Not as important Not in the house right now, so we are not around to get quotes or work on the house. Other things have taken precedent. The appliances are not currently in need of repair. The house is very low in energy use and there does not seem to be any urgent reason to make these changes. They are impractical Waiting for better weather (4) Waiting for the winter months to begin Waiting for things go on sale in the fall Waiting to do them all at once with a general home remodel Waiting to go through the process Waiting to tie everything in to happen all at once with the remodeling projects We are doing these improvements in conjunction with the earthquake retrofit project and are still waiting on some information for that. We are going to do it at the same time as when we get a new roof.

Q9e. Why have you decided that you will do this/these upgrade(s)? Other (please specify):

Response Cost savings Heat proficiency I am a true believer that we can make a difference to obtain the 2030 goals and to save the planet. I must walk the walk if I am going to talk the talk. I have not decided for sure. I think it is another place where drafts get in. If we could find the insulation and the machine on sale, than we would do it right away. In part because we need a new roof It is a good return on investment. It is going to give us our biggest bang for the buck. It is the right thing to do. It makes sense. It was suggested by the auditor. It would really improve our energy score. Long term house investment and long term savings The auditor gave me a prioritized list and these things were at the bottom of the list, both for cost and efficiency reasons. The demonstration of the draft was pretty dramatic.

Response There are health benefits that we would have, like reducing the growth of mildew. They will provide an immediate return on the investment. To have better durability in the systems used in the house. We will have to change the roof anyway because of the style of the house. The windows will be more convenient to use once they are replaced.

Q10a. Which upgrades have you decided not to do?

Response Add insulation in the attic to more current standards and insulation in the walls Adding energy saving light bulbs and closets Adding insulation to one of the rooms and getting a new roof Air source heat pump Appliance replacement Attic insulation Buy more appliances Changes to the heating system Changing the furnace Changing to a hybrid water heater Fireplace issues Floor or crawl space insulation processes Furnace replacement Improve the hot water heater Increase insulation in the attic Installing a heat pump type water heater. Insulate the floor of the basement Insulate under the ceramic tile in basement Insulating lights in the basement, insulating floor that is exposed to air underneath, and insulating ducts in the work room Insulation in the walls (8) Insulation in the walls and quilts in the colder bedroom It is not feasible to insulate the exterior walls because they have aluminum siding. Lower the temperature on the water heater and get rid of the second refrigerator Most of them were pretty straight forward, such as switching to solar panels. These things were mostly my interest in them and not directly recommended. New ducting New heat pump system because of the expense Not buy the new furnace, because we had a furnace guy come out say that we did not need a new furnace. He also said that our efficiency is much better than the audit says. We will also wait for our water heater to break until we replace it. Outside wall insulation Put in an energy exchanger Replace small refrigerator and the furnace Replace the electric furnace and put insulation around the window casings

Response Replace the furnace with heat sources in each room Replace the halogen lights with CFLs Replace the heating system Replace the remaining windows that are still single pane Replace the water heater Replace the water heater and furnace Replace the windows Replacing a fire door in the basement since it would be difficult to replace Replacing the hot water heater and the furnace Sealing of walls to floors, insulation of walls, and insulation of ceiling Some of the leaks in the brick, fans and mail slots which are difficult and expensive to fix Take siding off the wall and add hard foam insulation barrier Upgrade to double-paned windows, upgrade the heating system, and install a solar water heater Use a heat pump instead of the furnace We thought there was a water intrusion problem but now we think that is not actually the case. Will not insulate under the roof because it is not possible.

Q10b. Why have you decided to not do these upgrades? Other (please specify):

Response I already replaced it with a standard model. I did not have any place to put it. I disagree with the auditor's analysis on the efficiency of halogen versus CFLs. I do not want to know what they may find inside the house walls. It had a very low criteria and worth in terms of CO2 reduction and comfort, on the energy audit report. It is not possible. It is too difficult. It is too much work. It would take thirty years to pay off the differences of the new furnace. Low long term return My age No cost benefit Plaster is way different than sheetrock and so it is just not possible and the aluminum siding is impenetrable. The appliances are still functional. The house is made of brick and most of the older walls would need to be torn down. The impact on our lifestyle did seem in proportion to our energy savings. The payback period is too long. They were way beyond the scope of what I prepared to do. Unsure of wiring complications We do not think it is actually going to fix anything. We think installing the ventilation time clock will actually work better than what they decide Would lose too much headroom

Q11a. What home improvements have you completed?

Response Added a bathroom fan and repaired the deli board Added a ceiling fan to the other room, had some decorative trim put in our living room and painting Added a new range and exhaust fan, currently working on fixing the carport Added energy efficient dishwasher and seismically retrofitted house Added solar electric panels Additional bathroom, energy efficient refrigerator and a stove Bought a new clothes washer Bought a new refrigerator Bought a new screen door, repaired some framing around a door and trim, put up some new trim inside, and buying surge protectors CFL to LED upgrades for the lighting in the house and an electric vehicle charging station was set up in the house Converted to gas heat and got a new range Created dwelling space in the unfinished basement Doing a lot of work for the water leak in the basement Dryer duct replacement Electrical upgrades Exterior painting I got an earthquake valve on my gas meter. I have installed a domestic solar and hot water heating system. I have installed a domestic hot water recirculation system. I replaced a broken laundry machine. I replaced our hot water tank. Installed low-flow toilets Interior remodel of floors and basement area, work in the walls Kitchen remodel Kitchen remodeling project involved replacing floors and walls Landscaping, remodeling of the patio New decks, remodeled the kitchen, and replaced the roof New furnace and new water heater New gas range Outdoor insulation Patio construction Plan to replace some single pane windows Remodeled the doors in the hall, replaced bathroom toilet, replaced the fence Remodeling the basement Remodeling the kitchen Replaced a refrigerator Replaced a shower head Replaced a toasted oven Replaced refrigerator Replaced refrigerator and washing machine Replaced the furnace Replaced the water heater

Response Replaced the window Replaced water heater independently of energy audit, also replaced furnace around the time of the audit Replacing energy light bulbs Replacing some interior doors and landscaping and water system upgrades and exterior painting Seismic retrofit Small room was enclosed in the garage Some electrical upgrades for a stove that needed replacing Some major landscaping Taking out lawns and putting in raised organic vegetable beds Took out leaking water tank, replaced some pipes, earthquake straps around foundation Updated a bathroom We added insulation to rooms above the garage. We are completing rebuilding a backyard patio. We got a heater, a fuel pump type thing. We had one bathroom remodeled. We had to get a deck fixed that was rotting. We have added a much more efficient dishwasher. We have gotten new kitchen flooring, a minor kitchen remodel, a new energy-efficient refrigerator and one new energy-efficient window. We have replaced our refrigerator with a more energy-efficient refrigerator. We refinished the floors and installed cabinets. We replaced a toilet with a more water efficient toilet. We replaced our washer and dryer, installed a dual-plane glass door on our studio and hydronic heat. We upgraded all of the toilets to the water efficient ones. We're remodeling our kitchen.

Q11d1. In what ways did the energy audit influence your decision?

Response Gave me the evidence to go ahead with the project. The data told me how little insulation I had and what I needed. Giving me better information about the house Having the energy audit uncovered the problem. He explained the loss of efficiency and the extra energy that the old refrigerator expended. I decided to do the insulation but realized I wanted to do the seismic retrofit and it would be foolish to do the insulation before getting it pulled out for the retrofit and redoing the insulation. I didn't know I had a problem. I'm now looking at the energy usage of appliances when I need to replace older ones. It confirmed for us that it really needed to be done if we wanted our house to be a more livable space. Also, there wasn't anything else that could have a significant impact. It just gave scientific results of the house performance that are measurable rather than opinions. It made me think more about all the resources we are using in the house. It points out the problems and actually writes it down so you can refer to it at a later time and gives recommendations and possible people to deal with.

Response It was recommended by the auditor in a later conversation. It was useful in prioritizing things. Just to improve the efficiency of the house Made me aware of the lack of air quality More awareness of water use for the drip system Putting in the wall insulation required the removal of exterior shingles, so for that reason we decided to get the exterior painting. Reinforcing the benefits The audit reinforced what I had already investigated. The auditor recognized that it was a potential safety issue. There will be energy savings. They told us we needed to do it when we wouldn't have thought to do it otherwise. It really accelerated our timeline on getting all of this stuff done. We did the audit so that we could do the improvement in the most efficient way possible. We had to tie down before we could cover it up with insulation When we were talking to the auditor we wanted to do something that would take a bite out of our carbon footprint so it seemed logical.

Q12a. Could you briefly describe those changes?

Response A housemate moved out. Both my husband and I are working more than we were at the time. Building a backyard cottage which is a garage with a living space on top Changes in the work schedule Changes in work schedule (3) I am home more often because of my Parkinson's. I am retired. I got cancer and I am now home. I had a baby three months ago and we had a couple of roommates move out. I have been taking more time off. I have started working away from home more than I was before. I just retired and now I will have more time to complete these projects. I stay home more often than usual. I was laid off and I started my retirement. My daughter has returned home after traveling. My girlfriend stays here three or four days a week now. My husband is working out of town, commuting more, and not home as often. My niece and her friend are staying in the house. My son moved back in because of unemployment.

Response New member of the family and I have stopped working One of our kids moved out. There have been more people in the house. There were two additional people living with us, a change in work schedule, and I was laid off for a while. We had a baby and someone stays home more often. We had a baby two weeks after the energy audit. We had an extra person staying at the house for a while. We had another child.

Q13a. What changes have been made? Other (please specify):

Response Changed lights to fluorescent Changing to florescent Conserving water and changing light bulbs Got rid of cable box and put computers on power strips so that everything gets shut off when we are not using them Hot tub blanket I learned that by turning off the fluorescent lights you actually use more energy than leaving them on. I use more fluorescent bulbs now. Increased sensitivity to time and quantity of use Installed more CFLs Keeping doors closed where the drafts were and put up a wool blanket where one of the other drafts was More compact fluorescent and some LEDs as well Put appliances on power strips, use soaking hoses in the garden, and keeping water in the sink for rinsing off dishes Putting lids on all the pots when we cook and heating does not turn on as often. Replaced light bulbs (3) Replaced lights with CFLs and asking other members to be more aware of energy usage Replaced more light bulbs with CFLs Replacing traditional light bulbs more Started using the light bulbs I do not like Trying to be more conservative about our energy consumption Turn off the fan in the bathroom faster Using florescent bulbs Using more energy-saving light bulbs Water efficient taps We cook more. We got the rest of our lights changed out. We have high energy-efficient light bulbs. We turned the heat up and had lights on during the night because we had to keep getting up. Wear sweaters in the winter

Q13b1. What about the home energy audit experience caused these changes? Other (please specify):

Response Awareness and education, because they were pointed out during the audit Awareness of home energy use Becoming more aware of how to save energy without struggle. The energy auditor explained a lot to us. Being better educated about the efficiency of my house Going through the process of thinking about it It is good. Raised our consciousness as to how much energy we were using Recommendations from the auditor Saving energy Seeing the general loss of energy and reducing where we could The auditor brought the light bulbs and put them in. The auditor had some light bulbs and he put them in for us. The auditor helped to raise awareness. The auditor tested my energy strip and showed me that it drained a lot of energy to just keep appliances plugged in. The awareness of seeing the energy that we use They provided a number of them so I was able to try them and find out that they were okay. We started thinking about our behavior and how we can make changes. What the auditor had told me and he actually brought some bulbs to be replaced

Q14. Is there anything about the home energy audit or the process of doing energy upgrades that you would like to see improved?

Response A follow-up from the auditor about a year later A little more timely, it took a while to get the audit and even know if we were on the audit list after sign up. A tax subsidy for insulation A written reported handed to you, rather than presented online After you implement one of the recommendations, it would be nice if they did a follow-up audit to see how much you have improved the energy efficiency. Between signing up for the audit and being able to schedule the audit took close to a year and that seemed like far too long. Connect the audit directly to contractors who would do the work and give you three bids on each of the recommendations at the time. Financing for more people, recommended contractors, and case workers Follow-up Getting the results from the energy audit took about three months. It would be nicer if they could speed it up. Have more government assistance, such as working with companies that could give rebates for energy efficiency. I am hoping the company that does the energy audit can separate the improvement proposal. They propose more expensive solutions and the process makes it sound very severe just because they want you to choose them to do that for the money they could charge. I initially heard about the home energy audit in April of 2009. I was number 150, and was very high up on the list. I kept on calling the city and receiving letters, but it was more than a year later after they rolled out the audit. By

Response that point we had already made changes and were very busy. Then we got a letter that said that if we did not do the audit in six weeks, than it would not be available at all. I had to find our own auditor, instead of the city locating a legitimate, certified auditor. I felt like the city dropped the ball on this one. I like that there is the follow-up, but it might better serve the auditor if they do the follow up. The report I received from the auditor was not as valuable as the actual feedback I received at the time. I need somebody to help me interpret it and prioritize it and get some of these things done. I also need somebody that is going to point me toward contractors that will help me get it done. I never got the report, so it is hard to say what needs to be improved. If I had received the report it would have been more useful for me. I really liked that they had an infrared camera to show the structures in the wall. That was really impressive that they could show graphically where cold spots were. I then had more confidence in what needed to be done. I think that the fact that the city is subsidizing the audits is not sustainable. The reason is because electricity is not very expensive and it will reduce the amount of people who want an audit done. The costs seem to jump from the price of a compact fluorescent to an entire retrofit. So the project seems to have huge price jumps. I am not sure if people even had audits if they had a way to reasonably finance these larger improvements that will really make a difference. I think this survey is a great piece of follow up. Making sure there is some follow up and support for people. We are dedicated to it and I see how the regular homeowner might not have the time. I wanted someone who also had contractor knowledge. I was very impressed at how well the audit turned out, we were very happy with it. If you wanted to make it free that would be even better, but it is still pretty darn reasonable. I wish that the auditor did find these other areas of leaky air flow. I would be interested in knowing if the audit would include the complete carbon footprint of any improvements or new products that are introduced to the house. I would have been nicer to have been able to go from the energy audit to getting the work done. It actually was harder to do that because the person who did the energy audit dropped the ball and never got back to us and another contractor did not have the time to do it. I would have liked to have been able to recheck the chimney after fixing the rusted flue. I would have liked to see a paper copy mailed to me instead of receiving an email. The email had a link to the webpage that had our results on it and there was not particular information on how to contact the auditor. I would have liked to have contacted the auditor, because we realized the next day that the fireplace was open and that all of the numbers were inaccurate because of that. I replied to the email but no one ever got back. I would like a little more research on the auditor’s part. For example, what utilities I am connected to and what kind of rebates those agencies can offer. I would like a recommendation for contractors. Also, the auditor should have been on time and should have followed up when I had questions. I would like it to be cheaper. I would like it to continue for others. I would like specific recommendations and what might be a more efficient model or brand or criteria. I want to learn about how much heat the windows are losing. 
No percentages were given for anything. I would like to be able to measure the electrical usage of individual items in the house like the hot water heater or the furnace. It would be nice to do an infrared camera analysis. I would like to have it make recommendations to cut down our electrical use, maybe appliances, it could be anything. I would like to have more choices of auditors. You have no way of knowing who is better or what their qualifications are. I would like to see a lot more support for implementing the changes. I would like the energy audit to be more

Response specific about what exactly needs to be done. I would like to see better follow through from the auditor in terms of how best to take advantage of the recommendations. Opportunity lost for both sides. Ideally I would have liked to have the audit earlier when it was first announced. If the survey could come sooner after the audit so that the event would be fresher on my mind. When they do the infrared camera shots, they only had a few examples, so if they could have done more shots where the air was coming in on the final report, it would have been more helpful. It could be more expedited. It took over six months to be on the wait list before they got things processed. It could be more widely publicized. I have been telling friends about it and I do not know how much they have looked into it. I have been telling them how good it is and that it is subsidized. It is not available any longer. It is very expensive, very time consuming, and very confusing. It is overwhelming. It might be good to have follow-up at some point. I noticed that our electric bill has not really changed and I guess I would be interested to know why. It needs much better screening for the auditors. I contacted two people, one was amateur and the other was far more sales orientated. It seems like it takes a very long time to complete the entire process. It is not like you have the energy audit, and then two weeks later it is all done. There is a lot of back and forth communication between everyone involved, but I do not think that can really be changed. Sometimes it would be nice to know, if we replaced the heat-pump, if I could hear the unit working instead of just receiving it. Although they are energy efficient, they are so loud to me. It also takes like an hour to wash a load instead of twenty minutes. And these things you do not really notice until you have them in your house. They need to give a better idea of what to expect out of these appliances. It was not as good for prioritizing as I had hoped. I knew just about as much as they told me and in some ways I think I knew a little bit more. They did not know that much about sealing the tops of the walls from the attic or about insulating awkward spaces. It would be great to have concrete information about the differences that the changes would get me. I would love to know the differences that this would make in my electricity usage. I realize that this is hard to know and it really depends on the weather or how big the project is. It would be nice if the initial cost was not so high to encourage more people to do the audit. It needs to be more widely advertised as well. It would be nice to have a cleaner connection with who could do the work and if there is funding available. I know that information is out there, it just takes time to pick through it and find out what is relevant. It would be nice to have a follow-up. It would be nice to learn about how much electricity the appliances use in the house. Maybe by using a meter that can show how much it uses when it is turned off or on. It would have been helpful if they had given some ideas of contractors to do specific work. I enjoyed working with the auditor and I trusted him, but he said it wasn't part of his job. It would have been nice to be able to have utility data available from Seattle City Light and patterns that could be incorporated into the audit and suggestions. More contractor references would be useful and a way to evaluate their work, a rating system or something. 
More follow through More follow-up and better referrals: they recommended certain people to work with us and those people just did not follow-up. More help after the audit sorting through the myriad of options and technology My impression is that the auditor was experienced and professional and I appreciated the report. It would be nice to receive positive feedback in an updated report after certain changes have been made.

Response Return phone calls from contractors Subcontractor feedback would be nice. We did not use one of the contractors on the list because of feedback from someone else who used one. The actual audit report was somewhat complicated and was difficult to understand. The addition of a resource sheet for a homeowner who wants to do the upgrades on their own. The amount of time that I had to wait to get the voucher The audit criteria used to rate the energy performance changed between the baseline and the retest. So it makes it difficult to conclude if it was a good investment or not. The auditor list was random so I had no sense how to select an auditor, even though I am happy with the experience The auditor took a huge amount of time and was very chatty, could have been more efficient. The auditor would write a blurb about each part, but I wish that he wrote more detailed ratings. I was glad that I took note of what he said during the audit. He was also wrong about what could have been savings for us. The biggest thing missing was the lack of connection between the auditor and resources on how to get it done or who could get it done. The energy audit was not what I expected. It was too general and provided little guidance in cost benefits. The information that I got for the audit was not overly informative for me. I am already pretty well versed in energy consumption as an architect, so the report did not tell me anymore than what I know. The infrared sensor, he did not have it charged, so he could not use it. He was not going to set up the blower door because he said the house was too drafty and would be a waste of time. Most of his time was spent entering data into his laptop computer. I paid $100 and the service was worth probably $100, but it was not worth the total amount of money that he got compensated for. My impression is he wanted to get all the work done to produce the energy audit that he sent to me. He wanted to maximize his profit and minimize the amount of work he did. The initial energy audit report was sent to my junk mail, instead of using the personal contact of the auditor. This was difficult and resulted in a delay of when I had expected to receive the results. The lists of contractors could have been extended and maybe have a way to know which contractor is better to hire. The person who did the audit for us came from an insulation company so she focused a lot on that. If I were to do it again I would use an independent auditor. The process was great but the follow-up needs to be improved. We received a long, complex document and it would have been very helpful for the energy audit company to come back and explain the upgrades with options. They did not provide cost effectiveness payback period or comparison of which improvements would save the most energy. Timeliness of response after the request for the audit We had a hard time picking someone to do it, so we would like to see that process improved. We had a really good experience. The auditor was very thorough and professional. I was very pleased with his expertise and the time that he committed to the project. We never received a follow up. It would be nice to have a resource to find upgrade supplies and who we can contact for more information or find more information on utility benefits of the upgrades. What I hope would happen versus what did happen was different. 
There were several projects where I cannot do the work and would like a better connection between those who do the audit and those who can do the work, as well as the cost needed to make those change.


Appendix N: Details of Modeling Data Processing and Preparation

To complete the home energy modeling tool comparison runs and analyses described in Chapter 6 of the report and in Appendices O through T, a variety of data collection, preparation, and conversion steps were required. An overview of these steps is presented in Figure 19.

Figure 19: Major data collection, processing, and preparation activities for model comparison

[Figure 19 summarizes four stages: data collection (EPS Auditor inputs and results from audits and post-retrofit assessments; energy use behavior survey responses; weather data from Weather Analytics and the University of Oregon Solar Radiation Monitoring Laboratory; and pre-audit utility data from Seattle City Light (electric) and Puget Sound Energy (gas)); data processing and preparation (the Converter mapping EPS Auditor inputs to Home Energy Scoring Tool and HESPro inputs, mapping of behavior survey responses to HESPro inputs, weather data processing and synchronization, and utility data annualization and synchronization); model runs (Home Energy Scoring Tool, HESPro, and the Emulators); and model comparisons (EPS vs. Home Energy Scoring Tool asset run results, recommended upgrades, and asset, asset+weather, and house+weather+behavior run results including utility usage).]

Each step required certain assumptions with potential to alter the final results, but this processing is necessary to any analysis of this type. These steps are therefore documented below in some detail. Additionally, it should be noted that home energy modeling tools evolve, and therefore the modeling assumptions and results represent a snapshot in time.

House and Behavior Data Collection

Data collection steps are described in the main report, Chapters 2 and 6.

Conversion and Mapping of Auditor Measurements to Inputs for the Home Energy Scoring Tool and HESPro

To convert the data collected in the audits for use in the Home Energy Scoring Tool and HESPro, a Converter was developed in a Microsoft Excel spreadsheet. This Converter takes the house configuration data that auditors input into the EPS Auditor tool, and maps it to a set of inputs for the online Home Energy Scoring Tool and for HESPro. At the same time, the Converter generates inputs to the Home Energy Scoring Tool Emulator and HESPro Emulator.
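To make the Converter's role concrete, the following is a minimal Python sketch of the kind of input mapping it performs. The actual Converter is an Excel workbook; the field names, insulation categories, and default values shown here are hypothetical illustrations rather than the project's real schema.

# Minimal sketch (Python) of an EPS-to-Home Energy Scoring Tool input mapping.
# The actual Converter is an Excel workbook; the field names, categories, and
# defaults below are hypothetical illustrations, not the project's real schema.

def map_eps_to_scoring_tool(eps_record: dict) -> tuple:
    """Map an EPS Auditor house record to Scoring-Tool-style inputs,
    recording every assumption made for missing or ambiguous data."""
    assumptions = []
    inputs = {
        # Some fields carry over directly (hypothetical names).
        "year_built": eps_record["year_built"],
        "conditioned_floor_area": eps_record["floor_area_sqft"],
        "num_stories": eps_record["stories"],
    }

    # Ambiguous mapping: treat a daylight basement as if it were not present.
    if eps_record.get("daylight_basement"):
        inputs["foundation_type"] = "basement"
        assumptions.append("daylight basement mapped as an ordinary basement")
    else:
        inputs["foundation_type"] = eps_record.get("foundation_type", "crawlspace")

    # Missing data: default to an unconditioned attic and no skylights.
    if not eps_record.get("attic_type"):
        assumptions.append("attic assumed unconditioned")
    inputs["attic_type"] = eps_record.get("attic_type") or "unconditioned"
    inputs["skylight_area"] = eps_record.get("skylight_area", 0.0)

    # Different categorization: translate EPS condition grades to an R-value.
    grade_to_r = {"very poor": 0, "poor": 5, "average": 11, "good": 19, "excellent": 21}
    grade = eps_record.get("wall_insulation_grade", "average")
    inputs["wall_insulation_r"] = grade_to_r.get(grade, 11)

    return inputs, assumptions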

This Converter requires a set of assumptions to handle uncertainties in mapping due to missing input data, conflicting input data, and input data with different ranges and/or categorizations. These uncertainties are inherent to any conversion of inputs from one model into another, and they limit the reliability of comparisons that rely on such conversions. Certain EPS inputs exactly matched the inputs to the other tools, and several other measurements were collected by auditors specifically for use in this model comparison, so not all inputs required conversion. However, the majority of inputs did require some level of conversion or reliance on assumptions. These limitations can be avoided only by "natively" completing the original audit/assessment on each house with each tool to be compared. This was not an option for this project, but it is recommended for future comparison work. Even when measured natively for each tool, model inputs would still be subject to other sources of uncertainty, including auditor error and variability in auditor measurement and interpretation. Overall, the need to convert many of the EPS inputs into Home Energy Scoring Tool inputs is a source of error in this analysis and will necessarily degrade the results for the Home Energy Scoring Tool. We have therefore avoided directly comparing EPS and Home Energy Scoring Tool results in a competitive manner (i.e., asking whether one tool does "better" than the other). Instead, the focus of this analysis is on similarities and shared issues.

The energy use behavior survey responses were also fed into the Converter and used to generate operational inputs for the HESPro (online) and HESPro Emulator model runs. Again, the form of the behavioral survey responses does not directly match the HESPro inputs; assumptions were therefore necessary in generating the model inputs, and certain portions of the responses were insufficient to adequately specify inputs to HESPro. For example, lighting usage was underspecified and was not included in the mapping; the default lighting assumptions specified in the Home Energy Saver online documentation (Home Energy Saver: Engineering Documentation 2012) were used instead in those cases. The following inputs, converted from the energy use behavior survey responses, were used to reflect operational characteristics in the model runs (a sketch of this mapping follows below):
• Occupancy
• Heating and cooling thermostat settings, with different heating settings allowed for morning, daytime, evening, and overnight periods
• Dishwasher, clothes washer, and clothes dryer loads per week
• Clothes washer "typical" cycle temperature setting
• Number of additional refrigerators and freezers
• Whether someone is typically home during the day
• Use of certain common electronics (televisions and set-top boxes, computers)

Other energy use behavior information was gathered but could not be translated into HESPro model inputs.
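As a companion to the list above, here is a hypothetical Python sketch of how categorical survey answers might be translated into HESPro-style operational inputs. The survey field names, HESPro input names, and fallback values are assumptions for illustration; where the survey is silent (as with lighting), the value is simply left unset so the tool's documented defaults apply.

# Hypothetical sketch of turning behavior-survey answers into HESPro-style
# operational inputs; the field names and fallback values are illustrative.

THERMOSTAT_PERIODS = ("morning", "daytime", "evening", "overnight")

def survey_to_operational_inputs(survey: dict) -> dict:
    ops = {
        "occupants": survey.get("num_occupants", 2),
        "someone_home_weekdays": survey.get("home_during_day") == "yes",
        "dishwasher_loads_per_week": survey.get("dishwasher_loads", 4),
        "clothes_washer_loads_per_week": survey.get("washer_loads", 5),
        "clothes_dryer_loads_per_week": survey.get("dryer_loads", 5),
        "washer_cycle_temperature": survey.get("washer_temp", "warm"),
        "extra_refrigerators": survey.get("extra_fridges", 0),
        "extra_freezers": survey.get("extra_freezers", 0),
        "televisions": survey.get("televisions", 1),
        "computers": survey.get("computers", 1),
    }
    # Different heating setpoints are allowed for each period of the day.
    for period in THERMOSTAT_PERIODS:
        ops["heat_setpoint_" + period] = survey.get("heat_setpoint_" + period, 68)
    ops["cool_setpoint"] = survey.get("cool_setpoint", 76)
    # Lighting usage was underspecified in the survey, so it is deliberately
    # left out here and the tool's default lighting assumptions apply instead.
    return ops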

In addition, a set of EPS inputs and data gathered during the EPS audit are not accepted among the limited set of Home Energy Scoring Tool inputs. These inputs include:
• Shading of the house
• Water heater location
• Water heater temperature setting
• Refrigerator and clothes washer ENERGY STAR status
• Stove and oven fuel

These inputs were utilized, in addition to weather and behavioral inputs, in the “House+Weather+Behavior” model runs so as not to unnecessarily hamstring these runs by withholding readily available house and equipment information.

Treatment of Missing and Ambiguous Home Energy Scoring Tool Inputs

Data to inform several Home Energy Scoring Tool inputs was not available from the EPS audits. It was therefore necessary to assume or default these particular inputs to generate scores. For a number of other inputs, some information was available but the correct mapping was still ambiguous; in these cases, assumptions were made to define a consistent mapping. The major missing or ambiguous information required for the Home Energy Scoring Tool is listed in Table 45. A variety of other inputs not listed here had minor ambiguities.

Table 45: Missing or ambiguous data informing Home Energy Scoring Tool inputs

• Stories and Foundation Type. Missing or ambiguous: ambiguous mapping of homes with a daylight basement. Assumption or default used: map as if the daylight basement is not present.
• Conditioned Unfinished Area. Missing or ambiguous: ambiguous mapping of conditioned but unfinished floor area. Assumption or default used: map as if no conditioned but unfinished floor area is present; modeling could underestimate energy use if conditioned unfinished areas are present in the home.
• Attic. Missing or ambiguous: missing. Assumption or default used: assume an unconditioned attic.
• Skylights. Missing or ambiguous: missing. Assumption or default used: assume no skylights are present.
• Walls and windows. Missing or ambiguous: missing whether walls and windows are uniform across the sides of the house. Assumption or default used: assume walls and windows are uniform.
• Roof construction, exterior finish, and insulation level. Missing or ambiguous: missing. Assumption or default used: assume a standard roof, composition shingles, and no insulation.
• Room air conditioning. Missing or ambiguous: missing. Assumption or default used: assume room air conditioning is not present.

The Home Energy Scoring Tool inputs used in the modeling are clearly degraded by this missing and ambiguous data, which contributed uncertainty to the modeling estimates. The resulting effect is quantified in Appendix R, which found that in most cases the missing information had only a small effect on the model estimates.

Treatment of Missing and Ambiguous HESPro Inputs

HESPro includes all of the Home Energy Scoring Tool inputs plus many more. Our input data set for HESPro was also larger, as it included additional information from the EPS audits as well as the behavioral survey responses. Still, we did not have adequate information for a number of HESPro inputs. Missing inputs that overlapped with the Home Energy Scoring Tool were treated with the defaults or assumptions listed in Table 45 above. For the remaining inputs that did not overlap with Home Energy Scoring Tool inputs, the HESPro default values were used. Again, this process necessarily contributed uncertainty to the modeling estimates.

Emulator Development and Verification

We developed emulators of the Home Energy Scoring Tool and of Home Energy Saver Pro for three primary purposes:
• To allow Home Energy Scoring Tool and HESPro model runs under actual (modeled) weather conditions recorded in Seattle, enabling comparison of model estimates to utility usage while avoiding weather normalization procedures.
• To generate detailed end-use and gas and electric results for Home Energy Scoring Tool runs, which were not available from the online tool during development. These detailed results were necessary for various aspects of the model comparison.
• To allow bulk processing of model runs beyond the single-run capability of the online tools.

These Emulators consist of two basic components: a Microsoft Excel® front-end and the DOE2.1E simulation engine. The Excel front-end accepts model inputs from the Converter in the same format as inputs fed into the online tool. These inputs are then used for two purposes:
1. To calculate appliance, behavior, and domestic hot water energy usage
2. To generate a new set of inputs for the DOE2.1E hourly simulation engine (including inputs reflecting aggregate lighting, appliance, and hot water usage from the first step)

The front end was developed to generate three sets of outputs so as to separately emulate the Home Energy Scoring Tool, the HESPro online tool, and the HESPro tool with certain errors corrected; each is discussed below.

The DOE2.1E v124 hourly simulation engine was used as the back end of both Emulators; it replicates the simulator used in the Home Energy Scoring Tool and in HESPro. Detailed technical setup files were provided by LBNL staff for this use.
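As a rough illustration of the Emulator flow described above, the sketch below strings the two steps together in Python: compute the appliance and hot-water loads outside the simulation, then fold them into a DOE2.1E input deck and run the hourly engine. The load calculations, deck format, executable name, and output file shown here are all assumptions standing in for the actual Excel front-end and LBNL-supplied setup files.

# Illustrative sketch of the Emulator pipeline, not the project's Excel workbook.
# The load calculations, deck template, executable name, and output file are
# assumptions; the real front-end follows the Home Energy Saver documentation.
import subprocess
from pathlib import Path

def appliance_and_dhw_energy(inputs: dict) -> dict:
    """Placeholder stand-ins for the documented appliance and hot-water algorithms."""
    appliance_kwh = 450 + 600 * inputs.get("extra_refrigerators", 0)
    dhw_therms = 18 * inputs.get("occupants", 2)
    return {"appliance_kwh": appliance_kwh, "dhw_therms": dhw_therms}

def run_emulator(inputs: dict, weather_file: Path, workdir: Path) -> str:
    loads = appliance_and_dhw_energy(inputs)
    deck = workdir / "house.inp"
    deck.write_text(
        "$ internal gains derived from appliance loads: "
        f"{loads['appliance_kwh']} kWh/yr\n"
        f"$ weather file: {weather_file.name}\n"
        "... remainder of the DOE2.1E input deck ...\n"
    )
    # Hypothetical command line; the actual DOE2.1E invocation differs by install.
    subprocess.run(["doe21e", str(deck), str(weather_file)], check=True, cwd=workdir)
    return (workdir / "house.sim").read_text()  # raw hourly simulation report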

Home Energy Scoring Tool Emulator

The Home Energy Scoring Tool Emulator was developed based on online documentation and configured to accept the Home Energy Scoring Tool inputs so as to emulate asset inputs only. Assumptions were matched to the Home Energy Scoring Tool assumptions. The Emulator's performance was verified against the online Home Energy Scoring Tool estimates using the version of the online tool available in January 2012. The results of this comparison are presented in Figure 20.

Figure 20: Percent difference in estimated total source energy use, between Home Energy Scoring Tool Emulator and online Tool. N=31.

For the 31 cases used in this verification, all Emulator runs were within +/-2.2% of the online results, except for one case in which an error in the model inputs was found and corrected for the Emulator run but not for the online run; this input difference caused the discrepancy in that case. The mean percent difference across these runs was 0.7%, with the Emulator estimating slightly lower total source energy use on average than the online tool.
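The verification statistic quoted above is simple to reproduce; the sketch below shows the arithmetic with made-up energy values (the project's actual per-house results are in Figure 20).

# Sketch of the verification arithmetic; the energy values are invented.
emulator_mbtu = [210.5, 148.2, 305.0]   # Emulator total source energy, MBtu
online_mbtu = [212.0, 147.5, 303.1]     # online tool total source energy, MBtu

pct_diff = [100.0 * (e - o) / o for e, o in zip(emulator_mbtu, online_mbtu)]
mean_diff = sum(pct_diff) / len(pct_diff)
print([round(d, 2) for d in pct_diff], round(mean_diff, 2))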

HESPro Emulator

The HESPro Emulator was developed based on online documentation and configured to accept a subset of the HESPro inputs so as to emulate house inputs as well as a subset of energy use behavior and appliance inputs. During development of the Emulator, small errors were found in the online HESPro tool, where the tool's results do not match what would be expected from the documentation given the house description. These errors slightly affected appliance and energy use behavior modeling88, but did not affect the asset aspects of HESPro or the Home Energy Scoring Tool. To allow verification against the online HESPro, a version of the Emulator was developed that replicates the errors found in the online tool; for the model comparison runs, these errors were corrected in the Emulator following the online HESPro documentation. The results of this verification test are presented in Figure 21.

88 Specifically, these errors related to: incorrect DOE2 roughness values used for certain siding and roofing types; the “adult at home during the day” HESPro input reverting to the default for Seattle (“yes”) in all runs; and “0” valued inputs reverting to defaults for clothes washing loads at each cycle temperature setting.

Figure 21: Percent difference in estimated total source energy use, between the HESPro Emulator (accounting for known errors in HESPro) and online HESPro. N=28.

For the 28 cases used in this verification, all Emulator runs were within +/-2% of the online results. The mean percent difference across these runs was 0.4%, with the Emulator estimating slightly higher total source energy use on average than the online tool.

A second version of the HESPro Emulator was developed with the errors corrected so that the model ran per the online documentation. These corrections have only a modest impact on energy use estimates, but they prevented direct verification of this version against the online tool. This corrected HESPro Emulator is what was used for all model comparison runs.

Weather Data Processing and Annualization

The model runs used in the comparisons utilize a combination of standard TMY2 weather files89 (Marion and Urban 1995) and actual weather data from Seattle synthesized into "actual meteorological year" (AMY) files following the TMY2 format. The actual Seattle weather files (for the period from January 2007 through October 2011) were generated using data from two sources: Weather Analytics Inc. and the University of Oregon Solar Radiation Monitoring Laboratory. The data from these two sources were combined and synthesized into AMY weather files. Only a limited set of weather variables was available, describing temperatures, humidity, wind, and solar radiation; other weather characteristics were not available and were not included in the AMY data files. Additionally, solar radiation data availability was split between the two sources and required manipulation and extrapolation to generate a self-consistent data stream over the period of study. Some approximation was necessary in this manipulation.

89 Second-generation typical meteorological year weather files (TMY2) consist of hourly measurements of a wide variety of weather factors, and were developed for a variety of locations in the United States by NREL as a compilation of typical months of weather representing the period from 1961 to 1990. These files are utilized by the DOE2.1E hourly simulation engine that is used in both the Home Energy Scoring Tool and HESPro. The weather data used in EPS Auditor is also derived from TMY2 files.

To enable full-year model runs that could be compared to the available periods of utility-reported usage data, full-year AMY weather files were generated that started on the 16th of each month90 and continued for a full year. Yearly weather files were generated for each month from 2007 through the most recent data available. Because TMY2-formatted files must start on the first day of January, the weather data for each AMY file was wrapped around so that the data from the first of January was always at the beginning of the file, even if the period covered was from July 16, 2010 to July 15, 2011. While this approximation could create minor discrepancies in the modeling, no significant issues were noted when evaluating this approach.

Utility Data Processing and Annualization

Natural gas usage and billing data were provided by Puget Sound Energy, and electricity usage and billing data were provided by Seattle City Light, for a subset of audited homes. We received complete utility data for at least one year prior to the date of the audit for 251 homes: 121 all-electric homes and 130 using both gas and electricity. This data ranged up to 3 full years prior to the audit (for all-electric homes), depending on the initial date of service and other limits to data availability.

Additionally, a comparison group data set with electricity usage and billing data was provided by Seattle City Light for more than 1,900 homes. This data was anonymous and was not screened for audit involvement; audited homes were not excluded, so a small number of audited homes may have been included in this comparison group by chance.

Each utility data set was processed to remove incomplete or discontinuous data and to aggregate the usage and billing data into year-long periods, for comparison to the annual energy usage estimates provided by the modeling tools and to match the annual periods in the synthesized weather files. This required some calculation, interpretation, and interpolation to generate yearly usage periods synchronized across gas and electric utility data (each with different billing periods). As with the AMY weather data files described above, the 16th of the month was used as an arbitrary starting point for yearly utility usage calculations. In this case, the yearly utility usage period ended on the 15th of the month prior to the month in which the audit was completed (starting on the 16th one year earlier). Seattle City Light billing periods are typically two months, while Puget Sound Energy tends to bill on a monthly basis; these periods significantly limit the resolution of the analysis. Because higher-resolution data are not available for comparison, it is not known how much error the annualization processing contributes to the yearly usage calculations. However, only the first and last billing periods in each yearly period require interpolation, minimizing the potential impact of these calculations. Linear interpolation was used at each end of the yearly period to estimate how much gas or electricity was used during partial billing periods.
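A hedged sketch of this annualization step follows: whole billing periods that fall inside the year are summed, and the first and last (partial) periods are prorated linearly by the share of their days that fall inside the year. The billing dates and usage values are invented for illustration.

# Hedged sketch of annualizing billed usage: whole billing periods inside the
# year are summed, and partial periods at each end are prorated linearly by
# the fraction of their days that fall inside the year. Data are illustrative.
from datetime import date

def annualize(bills, year_start: date, year_end: date) -> float:
    """bills: list of (period_start, period_end, usage) tuples."""
    total = 0.0
    for start, end, usage in bills:
        period_days = (end - start).days
        overlap_start = max(start, year_start)
        overlap_end = min(end, year_end)
        overlap_days = (overlap_end - overlap_start).days
        if overlap_days <= 0 or period_days <= 0:
            continue
        total += usage * overlap_days / period_days  # linear proration
    return total

# Example: a year ending the 15th of the month before a hypothetical 2011-08 audit.
usage_kwh = annualize(
    [(date(2010, 6, 20), date(2010, 8, 19), 900.0),
     (date(2010, 8, 19), date(2010, 10, 18), 850.0),
     (date(2011, 6, 17), date(2011, 8, 16), 910.0)],
    year_start=date(2010, 7, 16),
    year_end=date(2011, 7, 15),
)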

90 The choice to start the periods at this point in the month does not significantly affect results and was an artifact of the analyses.

Appendix O: EPS Auditor, Home Energy Scoring Tool, and Home Energy Saver-Pro Feature Comparison

Introduction and Purpose

Home energy performance modeling tools such as EPS Auditor, the Home Energy Scoring Tool, and Home Energy Saver-Pro are not trivial to compare in a fair and meaningful way. Recognizing the substantial differences between these tools, we started by comparing them based on their features and use, so as to provide context for a comparison of modeled energy use estimates and recommendations.

Approach

We reviewed the tools themselves and their output reports, as well as technical documentation and marketing literature, to identify differences and similarities between the tools. In particular we focused on eight aspects of the tools:
• Intended use or users
• Depth or level of detail of the assessment
• Whether the modeling is asset or operational
• Inputs accepted and required
• Assumptions and defaults used in the models
• How upgrade recommendations are made
• Role and influence of the auditor
• Outputs that are generated

Results

Table 46 provides a summary of the key similarities and differences found between the tools.

Table 46: Similarities and Differences between EPS, Home Energy Scoring Tool, and HESPro

Aspect: Intended use or users
EPS Auditor: Existing homeowners; real estate transactions.
Home Energy Scoring Tool: Existing homeowners; real estate transactions.
HESPro: Homeowners or auditors looking to add operational considerations to the Home Energy Scoring Tool; homeowners looking to do a self-assessment; auditors looking to do assessments.
Implications: EPS and Home Energy Scoring Tool are similar in their intended use and audiences, and can be reasonably compared; each also has application to existing homeowners.

Aspect: Depth/Level of Detail
EPS Auditor: Modest set of model inputs. Diagnostic audit: includes blower door and combustion safety testing (where appropriate), encourages use of an IR camera, and provides a detailed report allowing auditor customization; a typical audit may take 3-4 hours on-site.
Home Energy Scoring Tool: Modest set of model inputs; not billed as diagnostic. Can utilize blower door results in modeling, but these are not required by the tool (may be subject to implementation). Report is not customized by the auditor. Assessment duration is indicated at under 1 hour (Home Energy Saver: Engineering Documentation 2012).
HESPro: Does not require a specific level of detail. The tool is capable of incorporating some diagnostics, and can incorporate house and operational inputs not included in the other models.
Implications: The depth or level of detail of the EPS audit and the Home Energy Score assessment are distinct, with EPS auditors likely spending longer assessing the home and compiling a customized report. The Home Energy Scoring Tool may or may not require diagnostics in the typical assessment.

Aspect: Asset vs operational
EPS Auditor: EPS modeling is asset-based, though some white goods and other large energy uses are considered, potentially outside the home (hot tub, pool pump, etc.). The auditor may customize the report with operationally-focused guidance or include non-asset considerations.
Home Energy Scoring Tool: An asset tool. Scope is limited to the house; white goods are not considered.
HESPro: An operational modeling tool, but it can be used to generate results representing the asset performance of a home if defaults are utilized for all operations-related inputs.
Implications: EPS and Home Energy Scoring Tool are asset tools, although EPS includes non-asset aspects; Home Energy Scoring Tool and HESPro can be used together to treat the spectrum from asset to operational.

Aspect: Inputs
EPS Auditor: The following inputs are included in EPS that are not included in the Home Energy Scoring Tool: misc energy uses (on an aggregate points system); clothes washer efficient?; secondary heating system; secondary water heating system; combustion safety test results; duct leakage test (not required and rarely completed).
Home Energy Scoring Tool: The following inputs are included in the Home Energy Scoring Tool that are not included in EPS: roof/attic configuration; roof insulation (where this is the primary insulation vs. attic insulation); skylights; house orientation; different window or wall performance on the 4 sides of the house.
HESPro: Subsumes the Home Energy Scoring Tool inputs, along with a variety of more detailed technical inputs, as well as operational inputs.
Implications: Different scope, granularity, and range of model inputs.

Aspect: Assumptions
EPS Auditor: Generally no defaults are allowed: all inputs must be entered. However, a variety of assumptions are made "under the hood" for model parameters that are not configurable. These include house configuration, operation and behavior, and weather. Many of these assumptions are not currently publicly available.
Home Energy Scoring Tool: No defaults allowed; all inputs must be entered. However, a variety of assumptions are made "under the hood" for model parameters that are not configurable. These include house configuration, operation and behavior, and weather. These assumptions are included in the Home Energy Scoring Tool and Home Energy Saver technical documentation (Home Energy Saver: Engineering Documentation 2012).
HESPro: Uses a set of defaults for all inputs (given a zip code) when these inputs are left blank; these defaults are intended to represent typical conditions. Additionally, a variety of assumptions are made "under the hood" for inputs that are not configurable, or in determining or interpreting the inputs. These assumptions are consistent with those used in the Home Energy Scoring Tool except in a few cases where the Home Energy Scoring Tool diverges (occupancy, lighting usage, and "other appliance" usage). Many of the assumptions are included in the Home Energy Saver technical documentation (Home Energy Saver: Engineering Documentation 2012).
Implications: Assumptions, including those related to standard occupant energy use behavior, are pretty similar.

Aspect: Selection of Recommended Energy Upgrades
EPS Auditor: Recommendations within the EPS report (except those at the discretion of the auditor in the general notes section) are generated based on 1) the condition of the house determined by the auditor for each of 10 energy-performance-related elements of the home; the auditor then manually selects which of these elements to make recommendations for, based on a defined protocol which prioritizes the condition of that house element (very poor, poor, average, good, excellent) over payback estimates. Auditors have room to provide additional recommendation detail, photographs, etc.
Home Energy Scoring Tool: Recommendations are generated from model inputs based on 1) the current state of the home and opportunity for improvement (for example, current level of attic insulation vs. possible levels), 2) whether those upgrades are replacements of existing equipment with a defined lifespan, such as a furnace, water heater, or windows [upgrade when it is time to replace], or whether they are improvements that do not involve replacement of equipment with a defined life span, such as air sealing, duct sealing, and insulation [repair now], and 3) estimated payback of those improvements, calculated differently for the two categories: "repair now" (based on full cost) vs. "replace later" (based on incremental cost of high-efficiency equipment vs. standard equipment). A maximum 10-year payback criterion is applied to "repair now" recommendations; no payback criterion is currently applied to "replace later" recommendations.
HESPro: Recommendations are generated from model inputs based on the same three considerations as the Home Energy Scoring Tool: current state of the home and opportunity for improvement, whether the upgrade is a replacement of equipment with a defined lifespan [replace later] or an improvement such as air sealing, duct sealing, or insulation [repair now], and estimated payback calculated differently for the two categories. HESPro does not clearly distinguish "repair now" and "replace later" recommendations. Recommendations can be removed or customized to different levels of improvement and user-specified cost numbers.
Implications: There appear to be substantial differences in how recommended upgrades are selected in these tools and in the influence of the auditor/assessor over these selections.

Aspect: Outputs
EPS Auditor: Energy Performance Score label, including: an Energy Score, calculated based off of the total site energy use of the home; a Carbon Score; benchmark scores and estimated scores if all upgrades are completed; and energy use and carbon estimates by fuel (natural gas and electric). A detailed report with energy use estimates for heating, cooling, water heating, and all else. Savings estimated over a 1-year period. Detailed conditions of the home (including relative evaluation: very poor, poor, average, good, excellent). Recommendations with an estimated range of cost and savings by upgrade, and General Notes which are customized by the auditor and may include prioritized recommendations, information on rebates and incentives, and non-energy issues.
Home Energy Scoring Tool: Home Energy Score calculated based on the total source energy use of the home; recommendations to do now and when replacing equipment, with $ savings per year. Total $ savings estimated over a 10-year period. Also includes the conditions of the home (the tool inputs), model-estimated energy use (total source MBTU and by fuel), and a calculation of how much the recommended upgrades will reduce the home's carbon footprint (%). The Beta version of the Home Energy Scoring Tool also provided the upgrade cost estimates used in the tool analyses.
HESPro: Energy use and CO2 emissions estimates by end use; recommendations including estimated savings, costs, and payback.
Implications: Presentation of energy scores differs to the point where it is difficult to compare the scores generated by the two tools. The Home Energy Scoring Tool does not generate a carbon or CO2 score. Recommendations are generated somewhat differently. The EPS report provides substantially more information on the current condition of the house and its major elements, and additionally allows the auditor to customize the report with additional details and pictures documenting the condition of the home.

Behavioral Perspectives on Home Energy Audits Page 332

Role of auditor/assessor
- EPS Auditor: The auditor measures the home and inputs these measurements into EPS. Recommendations are selected from a list based on the "condition of home" in each of 13 categories. The auditor can add photographs and text to the report. The auditor cannot alter scores or recommendations, though many auditors generate a customized set of recommendations in the "general notes" section of the report.
- Home Energy Scoring Tool: The assessor measures the home and inputs these measurements into the Home Energy Scoring Tool. Scores and recommendations are generated by the tool with no customization. It has been suggested, and may come to pass depending on how the Home Energy Scoring Tool is released, that assessors/auditors can generate their own list of recommendations in place of the Home Energy Scoring Tool recommendations, although the score must remain.
- HESPro: The role of the auditor/assessor with HESPro depends greatly on how the tool is used. The tool can be used for a self-assessment, or can be used by a professional auditor. The tool provides the capability to customize recommendations by altering estimated costs or selecting the desired level of performance after the upgrade. The tool generates the set of recommendations.
- Implications: The auditor has much more latitude with EPS than with the Home Energy Scoring Tool, including the flexibility to include issues not directly related to energy; but the auditor also must spend more time customizing the report. Some customization appears to be possible using HESPro.

Implications

From this analysis, the key takeaways for this comparison are:
- EPS and the Home Energy Scoring Tool are similar in their intended use and audiences, and can be reasonably compared; each also has application to existing homeowners, so considering model performance for operational applications is justified. The tools differ in the level of detail of the assessment (comprehensive diagnostic EPS audit vs. Home Energy Scoring rapid assessment), but to the degree the tool outputs overlap, they can be reasonably compared.
- The procedure by which recommendations are selected, and the influence of the auditor/assessor over these selections, differ substantially and could cause the tools to generate diverging recommendations.
- The auditor/assessor role, latitude, and required customization, and the ability to include non-energy concerns and recommendations, are distinct between the tools, with the auditor having much more input into the EPS report; this also enables the EPS report to be much more customized and detailed. This particularly affects the amount of information that can be provided to the customer regarding the existing condition of the house, which can include pictures, infrared camera images, and detailed comments in the EPS report.91
- Presentation of the energy score differs, but can be converted to common units. In practice, however, a layperson will not be able to compare the asset scores of homes scored in EPS Auditor and the Home Energy Scoring Tool if they remain as currently scored. And the differences in scope, granularity, and range of inputs, as well as the underlying assumptions used in the tool calculations, mean that perfect fidelity between the results generated by the two asset scoring tools as currently configured is not possible.

91 The Home Energy Scoring Tool is also planned for release as an API (application program interface), allowing it to be embedded in third-party software tools. These third-party tools could presumably enable additional customization of reports.

Behavioral Perspectives on Home Energy Audits Page 333

Appendix P: Comparison of Asset-Based Energy Use Estimates: EPS Auditor and the Home Energy Scoring Tool

Introduction and Purpose
Both EPS Auditor and the Home Energy Scoring Tool utilize an asset modeling methodology to score the energy performance of houses under "standard" occupancy and operational assumptions. In this analysis, a set of houses was modeled in both tools and the results were compared, to help address the following questions:
- To what degree are asset model estimates similar between these tools, and are differences arbitrary or rational?
- Can potential improvements to either (or both) tools be identified?
- Is the methodology utilized to convert EPS inputs to HES inputs valid for use in the follow-up analysis comparing model estimates to utility-reported usage?

Approach
Of the 1,355 homes for which EPS energy audits were conducted, the research team obtained complete utility records for 251 homes for at least one year prior to the date the audit was conducted. While this model comparison does not compare the model estimates to utility usage, these homes were selected so that the same set of homes and model runs could be leveraged for this comparison as well as for the comparison of model estimates to utility usage (Appendix Q). Of these, 14 homes could not be successfully processed through the input converter and modeled in the Home Energy Scoring Tool Emulator, due to either input errors that could not be resolved or emulator limitations.92 This left 237 homes with full utility data and emulator results—the sample for this analysis. Of these homes, 114 (48%) were all-electric, and the remaining 123 (52%) used natural gas in addition to electricity. Of the 1,355 homes receiving SCL Home Energy Audits, 63% used natural gas for heating, 17% used electricity, and 17% used oil; for all Seattle single-family housing (Tachibana 2010), 58% used natural gas for heating, 22% used electricity, and 17% used oil. Our sample of 237 homes for this analysis therefore substantially over-represented electrically heated homes, somewhat under-represented natural gas-heated homes, and did not represent oil-heated homes at all, due to lack of billing data.

For each home, the data collected during the EPS audit was processed through the Converter utility (as described in Appendix N) and run through the Home Energy Scoring Tool Emulator to generate Home Energy Scoring Tool estimates of energy usage. EPS model results from the original audit were referenced and were not re-created.

For this analysis, we compared EPS and Home Energy Scoring Tool model estimates of the following:
- Total energy use (site kWheq/yr)
- Total energy use (source MBTU/yr)93
- Natural gas use (therms/yr)
- Electricity use (kWh/yr)

92 Specifically, the emulator was unable to model integrated boiler/water heater systems (4 homes), a second heating system used as the primary heat source (3 homes), concrete block walls (2 homes), and various missing, out-of-range, or erroneous inputs (5 homes).
93 The Home Energy Scoring Tool uses "MBTU" to represent 10⁶ BTU. We follow the same (SI) convention here. Others have used the equivalent "MMBTU", because MBTU is sometimes interpreted as 10³ BTU.

Behavioral Perspectives on Home Energy Audits Page 334

- Heating end use (site kWheq/yr)
- Water heating end use (site kWheq/yr)
- Other uses (appliances, electronics, and lighting; site kWheq/yr)
- Total energy use (site kWheq/yr) and electricity use (kWh/yr) after accounting for other large energy uses

Cooling is a minor end use in Seattle and thus was not treated separately in this analysis. However, cooling is aggregated into the total energy use and electricity use comparisons.

Setting up the Comparison
The EPS Auditor and the Home Energy Scoring Tool do not generate directly comparable results. We therefore took steps to translate the outputs from each tool and to develop metrics for comparing model outputs.

At the time this analysis was completed (January 2012), the Home Energy Scoring Tool outputs did not provide energy usage estimates disaggregated by fuel94 or by end use. To generate the data for the end-use and fuel-specific comparisons, the Home Energy Scoring Tool Emulator was utilized.

EPS tool Energy Performance Scores are based on the site kWheq/yr calculated for the house, while Home Energy Scores are based on the source MBTU/yr calculated for the house. Site and source total energy use estimates differ substantially. In particular, total site energy is more influenced by natural gas use, while source energy is more influenced by electricity use, at least when using the national average source factors currently utilized by the Home Energy Scoring Tool95 (Deru and Torcellini 2007). We present analyses in terms of both site and source energy, and maintain the native units for EPS (site kWheq/yr) and the Home Energy Scoring Tool (source MBTU/yr) for ease of reference.
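To make the unit relationship concrete, the short sketch below converts a home's annual electricity and natural gas use into the two score bases, using standard heat-content conversions (1 kWh = 3,412 BTU; 1 therm = 100,000 BTU) and the national-average source factors cited in footnote 95. It is a minimal illustration, not the tools' internal code, and it assumes that site kWh-equivalents convert gas on a simple heat-content basis.

```python
# Minimal sketch (assumptions labeled; not the tools' internal code) of how a
# home's annual fuel use maps onto the two score bases discussed above.
BTU_PER_KWH = 3412            # heat content of one kWh of electricity
BTU_PER_THERM = 100_000       # heat content of one therm of natural gas
SOURCE_FACTOR_ELEC = 3.365    # national-average source factor (footnote 95)
SOURCE_FACTOR_GAS = 1.092     # national-average source factor (footnote 95)

def site_kwheq(elec_kwh: float, gas_therms: float) -> float:
    """Total site energy in kWh-equivalents (the EPS score basis), assuming
    gas is converted on a simple heat-content basis."""
    return elec_kwh + gas_therms * BTU_PER_THERM / BTU_PER_KWH

def source_mbtu(elec_kwh: float, gas_therms: float) -> float:
    """Total source energy in MBTU (10^6 BTU; the Home Energy Score basis)."""
    source_btu = (elec_kwh * BTU_PER_KWH * SOURCE_FACTOR_ELEC
                  + gas_therms * BTU_PER_THERM * SOURCE_FACTOR_GAS)
    return source_btu / 1e6

# Example: a hypothetical home using 10,000 kWh and 600 therms per year.
print(round(site_kwheq(10_000, 600)))      # ~27585 site kWheq/yr
print(round(source_mbtu(10_000, 600), 1))  # ~180.3 source MBTU/yr
```

Because the source factor for electricity is roughly three times that for gas, a gas-heavy home looks relatively larger in site terms and an electricity-heavy home looks relatively larger in source terms, which is the asymmetry described above.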

Additionally, because the most appropriate metric for comparing tool results depends on the purpose of the comparison, we provide a variety of metrics for each comparison case. Correlation results provide a useful measure of the co-variation of the tool outputs, while best-fit lines indicate the slope of the linear relationship96 between the two models. Mean results from each tool are presented as an indicator of how closely the tools run on an overall average for the set of houses compared. Mean absolute difference and mean absolute percent difference illustrate, on average, how far apart estimates tend to be on an absolute basis. Finally, the percentages of homes where model runs are within +/-25%, more than 25% higher, more than 25% lower, more than 50% higher, or more than 50% lower provide more specific measures of the level of agreement between model estimates for each home.
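The sketch below illustrates how these comparison metrics could be computed for paired per-home estimates. It is illustrative only; the function name and exact computations (for example, the confidence intervals reported alongside the best-fit lines) are not taken from the project's analysis code.

```python
# Illustrative sketch (not the project's analysis code) of the comparison
# metrics described above, for paired per-home estimates from the two tools.
import numpy as np

def comparison_metrics(hes: np.ndarray, eps: np.ndarray) -> dict:
    """hes and eps are arrays of per-home estimates, aligned by home."""
    pct_diff = (hes - eps) / eps                  # per-home relative difference
    slope, intercept = np.polyfit(eps, hes, 1)    # best-fit line, hes vs. eps
    return {
        "correlation_r": np.corrcoef(eps, hes)[0, 1],
        "best_fit_slope": slope,
        "best_fit_intercept": intercept,
        "mean_hes": hes.mean(),
        "mean_eps": eps.mean(),
        "pct_difference_in_means": (hes.mean() - eps.mean()) / eps.mean(),
        "mean_absolute_difference": np.abs(hes - eps).mean(),
        "mean_absolute_pct_difference": np.abs(pct_diff).mean(),
        "share_within_25pct": np.mean(np.abs(pct_diff) <= 0.25),
        "share_more_than_25pct_below": np.mean(pct_diff < -0.25),
        "share_more_than_25pct_above": np.mean(pct_diff > 0.25),
        "share_more_than_50pct_below": np.mean(pct_diff < -0.50),
        "share_more_than_50pct_above": np.mean(pct_diff > 0.50),
    }
```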

94 An update to the Home Energy Scoring Tool report as of February 2012 added energy usage estimates by fuel. 95 For electricity, we used a source factor = 3.365; for natural gas, we used a source factor = 1.092. 96 Assuming the relationship is linear in form—other forms of relationship were not checked and may be present. In all cases presented, we found a significant relationship between estimates from Home Energy Scoring Tool and EPS Auditor, based on linear regression analysis.

Behavioral Perspectives on Home Energy Audits Page 335

Results

Total Energy Use Estimates
Model-estimated total energy use was compared utilizing three different runs. In each case, model estimates were quite consistent between the two tools. Total energy use estimates from EPS Auditor and the Home Energy Scoring Tool were first compared using site kWh-equivalents/year and next using source MBTU/year. Results of the site energy comparison are presented in Case 1, while the source comparison is presented in Case 2 (see Table 47, Figure 22, and Figure 23).

For each of these cases, the Pearson correlation coefficient (r=.87 site and r=.84 source) indicates good agreement between model estimates. On average, the Home Energy Scoring Tool estimates of total energy use were somewhat higher (5% site, 7% source) than EPS estimates. And, most runs were reasonably close—86% of Home Energy Scoring Tool estimates for total site energy and 87% for total source energy were within +/- 25% of the EPS estimate for the same home.

One difference between the required inputs to EPS Auditor and the Home Energy Scoring Tool is that EPS audits include information on certain large energy end uses that are not included within the scope of the Home Energy Scoring Tool, likely a reflection of different interpretations of what should be considered within "asset" assessment and scoring of homes. In other words, while many non-"asset" appliances and behaviors are included in the asset model energy usage estimates based on "standard occupant behaviors" and average appliance holdings, the presence of hot tubs, pool or well pumps, and other similar items is specifically excluded from the Home Energy Scoring Tool estimates and score. These "other large energy uses" are, on the other hand, roughly modeled in EPS Auditor, affecting its estimates and scores. We would therefore expect that, for homes with "other large energy uses", EPS Auditor estimates of usage would be higher than Home Energy Scoring Tool estimates, all else being equal. To assess and adjust for this difference, we completed a comparison after removing these "other large energy uses" from the EPS results (Case 3; see Table 47, Figure 22, and Figure 23). This had very little effect on the overall correlation and other comparison metrics, except for the mean EPS estimated site energy use (26,055 kWheq/yr), which dropped slightly because these other energy uses are no longer included. Based on these minor changes, it is clear that this difference in the scope of inputs is a relatively minor contributor to the observed overall pattern of differences in energy use estimates.
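A minimal sketch of the Case 3 adjustment is shown below, assuming the EPS-modeled "other large energy use" estimates are available per home; the data structure and column names are hypothetical and are not taken from the project's files.

```python
# Hypothetical sketch of the Case 3 adjustment: the modeled "other large
# energy uses" (hot tubs, pool or well pumps, etc.) are subtracted from each
# home's EPS total before re-running the comparison. Column names are invented
# for illustration; the Home Energy Scoring Tool estimates are left unchanged
# because these uses were never part of its inputs.
import pandas as pd

def eps_total_excluding_other_large_uses(homes: pd.DataFrame) -> pd.Series:
    """homes has one row per home, with hypothetical columns
    'eps_total_site_kwheq' and 'eps_other_large_uses_kwheq'."""
    other = homes["eps_other_large_uses_kwheq"].fillna(0.0)
    return homes["eps_total_site_kwheq"] - other
```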

Electricity and Natural Gas Use Estimates
The tools were next compared in terms of their estimates of electricity use (all homes in this sample) and natural gas use (for only those 123 homes using gas). These two comparisons are presented in Cases 4 and 5 (see Table 47, Figure 24, and Figure 25). By most metrics, model estimates for electricity were found to be more consistent than gas.

While the Home Energy Scoring Tool and the EPS tool show a high correlation between estimates of electricity use (r=.94, with the Home Energy Scoring Tool approximately 8% higher than EPS on average, and 73% of Home Energy Scoring Tool estimates within +/-25% of the EPS estimate), the natural gas estimates show greater variability (r=.76, with the Home Energy Scoring Tool approximately 1.5% higher than EPS on average, and 81% of Home Energy Scoring Tool estimates within +/-25% of the EPS estimate). Accounting for other large energy uses (as above for total site energy) may further improve the regression result for estimated electricity use; however, this analysis was not completed.

Space Heating, Water Heating, and Appliances, Electronics, and Lighting End Use Estimates
Finally, model end use estimates were compared. Water heating use estimates were most consistent, while the "other end uses" estimate (including appliances, electronics, and lighting) was least consistent. The three end use categories are compared using site energy (kWheq). These comparisons are presented for Cases 6, 7, and 8 (see Table 47, Figure 26, and Figure 27).

The two tools are reasonably consistent in estimated space heating energy use, compared in Case 6 (r=.84, Home Energy Scoring Tool approximately 1.4% higher than EPS on average, and 71% of Home Energy Scoring Tool estimates within +/-25% of the EPS estimate). For water heating energy use (Case 7), the estimates show good consistency (r=.89, Home Energy Scoring Tool approximately 8% lower than EPS on average, and 88% of Home Energy Scoring Tool estimates within +/-25% of the EPS estimate). As can be seen in the scatter plot for Case 7, the Home Energy Scoring Tool water heating energy usage estimates fall into a series of discrete states. This is understandable given the limited, mostly discrete inputs that drive the water heating calculation in the Home Energy Scoring Tool: the number of bedrooms in the house (which in turn determines occupancy); the water heater fuel (either gas or electric in these homes); and the water heater efficiency, which is not strictly discrete but is limited by the data available from the original audits and by the conversion of these data into Scoring Tool inputs.

For appliances, electronics, and lighting end uses, excluding other large energy uses from the EPS estimates (Case 8), the models show less consistency than in the other comparisons (r=.80, Home Energy Scoring Tool estimates approximately 27% higher than EPS on average, and only 50% of Home Energy Scoring Tool estimates within +/-25% of the EPS estimate). EPS-estimated other large energy uses (such as from pools, hot tubs, well pumps, and the like) were removed from this calculation, as they are excluded from the scope of the Home Energy Scoring Tool. Of interest, the difference in modeled lighting, electronics, and appliance energy use appears to contribute the majority of the 5% discrepancy in average total site energy between the two tools.

Behavioral Perspectives on Home Energy Audits Page 337

Table 47: Summary of all EPS and Home Energy Scoring Tool Comparison Runs

Case 1 – Estimated Total Site Energy Use (kWheq/yr). Sample N = 237. Correlation (r) = .87. Best-fit line slope (b1) = .90 [.84 to .97]; intercept (b0) = 3894 [2050 to 5738]. Mean Home Energy Scoring Tool estimate = 27720 kWheq/yr; mean EPS estimate = 26389 kWheq/yr (on average, Home Energy Scoring Tool estimates are 5.0% higher than EPS). Mean absolute difference = 3638 kWheq/yr; mean absolute % difference = 14.1%. Home Energy Scoring Tool is more than 50% below EPS for 0% of homes; more than 25% below for 2%; within +/-25% for 86%; more than 25% above for 12%; more than 50% above for 2%.

Case 2 – Estimated Total Source Energy Use (MBTU/yr). Sample N = 237. Correlation (r) = .84. Best-fit line slope (b1) = .77 [.70 to .83]; intercept (b0) = 61.3 [47.9 to 74.6]. Mean Home Energy Scoring Tool estimate = 217.4 MBTU/yr; mean EPS estimate = 203.6 MBTU/yr (on average, Home Energy Scoring Tool estimates are 6.8% higher than EPS). Mean absolute difference = 27 MBTU/yr; mean absolute % difference = 13.8%. Home Energy Scoring Tool is more than 50% below EPS for 0% of homes; more than 25% below for 1%; within +/-25% for 87%; more than 25% above for 12%; more than 50% above for 2%.

Case 3 – Estimated Total Site Energy Use Excluding Other Large Energy Uses (kWheq/yr). Sample N = 237. Correlation (r) = .88. Best-fit line slope (b1) = .92 [.86 to .99]; intercept (b0) = 3651 [1837 to 5465]. Mean Home Energy Scoring Tool estimate = 27720 kWheq/yr (unchanged from Case 1; see note below); mean EPS estimate = 26055 kWheq/yr (on average, Home Energy Scoring Tool estimates are 6.4% higher than EPS). Mean absolute difference = 3574 kWheq/yr; mean absolute % difference = 14.1%. Home Energy Scoring Tool is more than 50% below EPS for 0% of homes; more than 25% below for 1%; within +/-25% for 86%; more than 25% above for 13%; more than 50% above for 2%.

Case 4 – Estimated Electricity Use (kWh/yr). Sample N = 237. Correlation (r) = .94. Best-fit line slope (b1) = .88 [.84 to .92]; intercept (b0) = 2829 [2201 to 3458]. Mean Home Energy Scoring Tool estimate = 14711 kWh/yr; mean EPS estimate = 13577 kWh/yr (on average, Home Energy Scoring Tool estimates are 8.4% higher than EPS). Mean absolute difference = 2152 kWh/yr; mean absolute % difference = 19.2%. Home Energy Scoring Tool is more than 50% below EPS for 0% of homes; more than 25% below for 2%; within +/-25% for 73%; more than 25% above for 25%; more than 50% above for 6%.

Case 5 – Estimated Natural Gas Use (therms/yr). Sample N = 123 (homes with any natural gas use). Correlation (r) = .76. Best-fit line slope (b1) = .78 [.66 to .90]; intercept (b0) = 195 [88 to 302]. Mean Home Energy Scoring Tool estimate = 855 therms/yr; mean EPS estimate = 842 therms/yr (on average, Home Energy Scoring Tool estimates are 1.5% higher than EPS). Mean absolute difference = 146 therms/yr; mean absolute % difference = 19.4%. Home Energy Scoring Tool is more than 50% below EPS for 0% of homes; more than 25% below for 6%; within +/-25% for 81%; more than 25% above for 13%; more than 50% above for 5%.

Case 6 – Estimated Space Heating End Use (site kWheq/yr). Sample N = 237. Correlation (r) = .84. Best-fit line slope (b1) = .86 [.79 to .94]; intercept (b0) = 2444 [1137 to 3751]. Mean Home Energy Scoring Tool estimate = 16340 kWheq/yr; mean EPS estimate = 16115 kWheq/yr (on average, Home Energy Scoring Tool estimates are 1.4% higher than EPS). Mean absolute difference = 3305 kWheq/yr; mean absolute % difference = 22.2%. Home Energy Scoring Tool is more than 50% below EPS for 0% of homes; more than 25% below for 10%; within +/-25% for 71%; more than 25% above for 19%; more than 50% above for 6%.

Case 7 – Estimated Water Heating End Use (site kWheq/yr). Sample N = 237. Correlation (r) = .89. Best-fit line slope (b1) = .88 [.83 to .94]; intercept (b0) = 132 [-98 to 361]. Mean Home Energy Scoring Tool estimate = 3477 kWheq/yr; mean EPS estimate = 3785 kWheq/yr (on average, Home Energy Scoring Tool estimates are 8% lower than EPS). Mean absolute difference = 474 kWheq/yr; mean absolute % difference = 13.1%. Home Energy Scoring Tool is more than 50% below EPS for 0% of homes; more than 25% below for 10%; within +/-25% for 87%; more than 25% above for 3%; more than 50% above for 2%.

Case 8 – Estimated Other End Uses (Appliances, Lighting, etc.), Excluding Other Large Energy Uses (site kWheq/yr). Sample N = 237. Correlation (r) = .80. Best-fit line slope (b1) = 1.32 [1.19 to 1.45]; intercept (b0) = -310 [-1101 to 480]. Mean Home Energy Scoring Tool estimate = 7800 kWheq/yr; mean EPS estimate = 6140 kWheq/yr (on average, Home Energy Scoring Tool estimates are 27% higher than EPS). Mean absolute difference = 1675 kWheq/yr; mean absolute % difference = 27.2%. Home Energy Scoring Tool is more than 50% below EPS for 0% of homes; more than 25% below for 0%; within +/-25% for 50%; more than 25% above for 50%; more than 50% above for 6%.

Note: Other large energy uses are not included in the Home Energy Scoring Tool inputs. Therefore, the Case 3 Home Energy Scoring Tool estimates are unchanged from Case 1; only the EPS estimates were affected.

Behavioral Perspectives on Home Energy Audits Page 338

Figure 22: Scatter Plots Comparing EPS and Home Energy Scoring Tool Total Energy Use Estimates. Note: Solid line represents perfect agreement; dashed line represents best fit

Case 1 – Estimated Total Site Energy Use in kWheq/yr Case 2 – Estimated Total Source Energy Use in MBTU/yr

Case 3 – Estimated Total Site Energy Use Excluding Other Large Energy Uses in kWheq/yr

Behavioral Perspectives on Home Energy Audits Page 339

Figure 23: Distribution of the Percent Difference Between EPS and Home Energy Scoring Tool Total Energy Use Estimates

Case 1 – Estimated Total Site Energy Use in kWheq/yr Case 2 – Estimated Total Source Energy Use in MBTU/yr

Case 3 – Estimated Total Site Energy Use Excluding Other Large Energy Uses in kWheq/yr

Behavioral Perspectives on Home Energy Audits Page 340

Figure 24: Scatter Plots Comparing EPS and Home Energy Scoring Tool Total Electricity and Natural Gas Use Estimates. Note: Solid line represents perfect agreement; dashed line represents best fit

Case 4 – Estimated Electricity Use in kWh/yr Case 5 – Estimated Natural Gas Use in therms/yr

Figure 25: Distribution of the Percent Difference Between EPS and Home Energy Scoring Tool Electricity and Natural Gas Use Estimates

Case 4 – Estimated Electricity Use in kWh/yr Case 5 – Estimated Natural Gas Use in therms/yr

Behavioral Perspectives on Home Energy Audits Page 341

Figure 26: Scatter Plots Comparing EPS and Home Energy Scoring Tool Space Heating, Water Heating, and Other (Lights and Appliances) Energy Use Estimates. Note: Solid line represents perfect agreement; dashed line represents best fit

Case 6 – Estimated Space Heating End Use in kWheq/yr Case 7 – Estimated Water Heating End Use in kWheq/yr

Case 8 – Estimated Other End Uses (Appliances, Lighting, etc) in kWheq/yr, Excluding Other Large Energy Uses

Behavioral Perspectives on Home Energy Audits Page 342

Figure 27: Distribution of the Percent Difference Between EPS and Home Energy Scoring Tool Space Heating, Water Heating, and Other (Lights and Appliances) Energy Use Estimates

Case 6 – Estimated Space Heating End Use in kWheq/yr Case 7 – Estimated Water Heating End Use in kWheq/yr

Case 8 – Estimated Other End Uses (Appliances, Lighting, etc) in kWheq/yr, Excluding Other Large Energy Uses

Implications
- Overall, there is quite good agreement between EPS and the Home Energy Scoring Tool for this set of Seattle homes. The basic comparability of the two tools is slightly confounded by "other large energy uses", which are included in EPS estimates but not in Home Energy Scoring Tool estimates (due to a different scope of tool inputs); when these are accounted for, the overall comparison improves slightly. Though Home Energy Scoring Tool estimates for electricity usage were 8% higher than EPS estimates on average, they were otherwise more consistent with EPS estimates than the natural gas estimates, which were only 2% higher on average than EPS estimates.
- Correlations between model estimates of heating, water heating, and appliances, electronics, and lighting are generally high. However, Home Energy Scoring Tool water heating estimates tend to be slightly lower (by 308 kWheq/yr) than EPS estimates, while Home Energy Scoring Tool estimates of appliances, electronics, and lighting are quite a bit higher (by 1660 kWheq/yr) than EPS estimates. This difference likely results from the assumptions necessary to characterize the "standard" household use of appliances, lighting, and electronics for asset model calculations. These assumptions vary between the tools (Appendix T).

The key takeaways from the comparison of these two tools are that:
1. The tools are reasonably consistent, in aggregate and in various components, considering the input conversion and the incomplete set of inputs available for the Home Energy Scoring Tool runs. The differences we observed, particularly in the lighting, appliances, and electronics category, are likely partially the result of differing asset "standard occupant behavior" assumptions.
2. Because the model estimates are quite consistent, the input conversion method (upon which the Home Energy Scoring Tool model estimates depend) appears to not degrade these results to an unacceptable degree. This supports the use of the input conversion and resulting model estimates for the next step (Appendix Q): comparing both tools, along with operational considerations (weather, occupancy, and behavior), to utility-reported household gas and electricity usage.
3. It may be useful to dig further into some areas to understand why the comparison shows greater variability (e.g., natural gas use and space heating end use) or why there is a substantial offset in average use (e.g., "other end uses" and water heating end use).

Behavioral Perspectives on Home Energy Audits Page 344

Appendix Q: Comparison of Asset and Operational Model Estimates to Utility-Reported Energy Usage

Introduction and Purpose
Understanding that EPS Auditor and the Home Energy Scoring Tool provide estimates of a home's asset energy use, we do not expect that these modeled energy use estimates should correspond directly to actual energy use within a particular household, because of their reliance upon "standard" occupancy, operational, and weather assumptions. However, if asset model estimates are used to recommend upgrades to homeowners who care mostly about how their home actually uses energy, as seems to be the case for most of the Seattle survey respondents (see Chapter 4), it is important that energy use estimates are close enough to provide good upgrade guidance on this basis.

Operational modeling is an alternative approach that, by modeling occupant energy use behaviors,97 can avoid many of these assumptions and presumably can better represent the household’s actual energy usage, and perhaps even provide improved upgrade guidance to the homeowner.

But, how well do asset and operational model results actually represent a household's energy usage? In this analysis, a set of houses was modeled in both asset and operational tools and the results were compared to utility-reported natural gas and electricity usage, to help address the following questions:
- To what degree are asset model estimates similar to actual utility usage?
- To what degree does including operational considerations in the modeling improve model estimates of actual utility usage?
- Are these asset and/or operational model estimates close enough to actual energy use to be useful in informing savings potential from upgrade recommendations?
- Can potential improvements to these modeling tools be identified?

To address these questions, model-generated energy usage estimates were compared to the home's total energy use, electricity use, and natural gas use as reported by their utilities. Estimates generated by four separate models were compared to utility-reported usage:98
- EPS Auditor (asset): original results from the audits
- Home Energy Scoring Tool (asset): generated by the Emulator utilizing inputs converted from the EPS audit data
- Home Energy Scoring Tool, adjusted for weather (asset+weather): generated by the Emulator running the Home Energy Scoring Tool asset inputs with historical weather data for Seattle over the period included in the utility-reported usage

97 “Operational” energy audits can also use an alternative approach, referencing utility billing data in the modeling process or when making recommendations. We do not consider this approach in this appendix. 98 Utility reported usage data required significant processing to calculate annual usage from asynchronous billing periods combining both gas and electric data. While utility-reported usage is assumed to be representative of “actual” household usage, no validation of this data source was possible within the scope of this project. Further details of this processing are described in Appendix N.

Behavioral Perspectives on Home Energy Audits Page 345

- HESPro, adjusted for weather and including a limited set of appliance, occupancy, and energy use behavior inputs (house+weather+behavior): generated by the Emulator running the home's asset inputs along with appliance and house inputs99 and occupancy/behavior inputs translated from the energy use behavior survey responses,100 using historical weather data101 for Seattle over the period included in the utility-reported usage.

Because end use data for heating, water heating, and other uses was not available from the basic electricity and natural gas data, comparison of model results at the end use level is not reported.

Approach
Of the 1,355 homes receiving an EPS home energy audit, the research team obtained complete utility records for 251 homes for at least one year prior to the date the audit was conducted. Of these, 14 homes could not be successfully processed through the input converter and modeled in the Home Energy Scoring Tool Emulator, due to either input errors which could not be resolved or emulator limitations.102 Of the remaining 237 homes with utility data and full model input data, 101 homes completed an energy use behavior survey, with 33 (33%) all-electric homes and the remaining 68 (67%) using both natural gas and electricity. Only these 101 homes were included in the analysis below. Utility data for homes using oil for heating was not generally available, so these homes were excluded. Of the 1,355 homes receiving SCL Home Energy Audits, 63% used natural gas for heating, while 17% used electricity for heating and 17% used oil. This distribution is similar to that for Seattle single-family housing overall, with 58% using natural gas for heating, 22% using electricity for main heating, and 17% using oil for heating (Tachibana 2010).
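The billing-data preparation itself is documented in Appendix N (see also footnote 98); the sketch below shows one simplified way the annualization of asynchronous billing periods could be done, pro-rating each bill by the share of its days that fall within the year before the audit. It is an assumption-laden illustration, not the procedure actually used.

```python
# Simplified sketch (not the Appendix N procedure itself) of annualizing
# asynchronous billing periods: each bill's usage is pro-rated by the share of
# its days that fall within the 365 days preceding the audit date.
from datetime import date, timedelta

def annual_usage(bills, audit_date: date) -> float:
    """bills: iterable of (period_start, period_end, usage) tuples, where usage
    is in kWh or therms and the dates are datetime.date objects."""
    window_start = audit_date - timedelta(days=365)
    total = 0.0
    for start, end, usage in bills:
        period_days = (end - start).days
        if period_days <= 0:
            continue
        overlap_days = (min(end, audit_date) - max(start, window_start)).days
        if overlap_days > 0:
            total += usage * overlap_days / period_days  # pro-rate the bill
    return total
```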

Generating these model estimates (except for the EPS results) necessarily relied on the converter utility and the Emulator described in Appendix N. Based on the generally high correlation between model estimates from EPS Auditor and the Home Energy Scoring Tool (see Appendix P), we conclude that the conversion process was acceptable for the purposes of this analysis;

99 A set of house and appliance factors was added as inputs to the HESPro tool for these runs. Some of this information was drawn from EPS inputs that do not map to Home Energy Scoring Tool inputs; other information was collected with the audits but is not used for EPS calculations. These factors were: shading of the house; water heater location; measured hot water temperature; refrigerator and clothes washer energy star status; and stove and oven fuel. These inputs were included in the “House+Weather+Behavior” runs so as not to unnecessarily hamstring model estimates by withholding readily available house and equipment information. 100 The following inputs, gathered from the energy use behavior survey responses, were utilized to reflect behavior in the model runs: occupancy; heating and cooling thermostat settings, with different heating settings allowed for morning, daytime, evening, and overnight periods; dishwasher, clothes washer, and clothes dryer loads per week; clothes washer “typical” cycle temperature setting; number of additional refrigerators and freezers; whether someone is typically home during the day; use of certain common electronics (televisions and set-top boxes, computers). Other energy use behavior information was gathered but could not be reliably translated into model inputs. See Appendix N for further details. 101 Weather file processing is described in Appendix N. 102 Specifically, the emulator was unable to model integrated boiler/water heater systems (4 houses), second heating system used as primary heat source (3 cases), concrete block walls (2 cases), and various missing or out-of- range or erroneous inputs (5 houses)

while we expect the Home Energy Scoring Tool comparisons to utility usage to have been somewhat degraded by this conversion, the transmission losses appear to not be so great as to obscure the results of the analysis completed here.

Because the most appropriate metric for comparing tool results to utility usage depends on the purpose of the comparison, we provide a variety of metrics for each comparison case. Correlation results provide a useful measure of the co-variation of the tool outputs with utility-reported usage, while best-fit lines indicate the slope of the linear relationship103 between the model estimates and utility-reported usage. Mean results from each tool are presented as an indicator of how closely the tools run on an overall average for the set of houses compared. Mean absolute difference and mean absolute percent difference illustrate, on average, how far apart model estimates and utility-reported usage tend to be on an absolute basis. Finally, the percentages of homes where model runs are within +/-25%, more than 25% higher, more than 25% lower, more than 50% higher, or more than 50% lower provide more specific measures of the level of agreement between the model estimate and utility-reported usage for each home.

Results of Comparison of Model Estimates to Utility-Reported Usage

Total Site Energy Use—Model Estimates vs. Utility-Reported
Total site energy use (kWheq/yr) estimates from four model runs—EPS Auditor (asset), the Home Energy Scoring Tool (asset), the Home Energy Scoring Tool accounting for weather (asset+weather), and HESPro with weather and behavior (house+weather+behavior)—were compared to observed energy use for the year prior to each home's energy audit. The results of this comparison are presented for Cases 9, 11, 17, and 21 (see Table 48, Figure 28, and Figure 29).

103 Assuming the relationship is linear in form—other forms of relationship were not checked and may be present. In all cases presented, we found a significant relationship between the model estimates and utility-reported usage, based on linear regression analysis.

Behavioral Perspectives on Home Energy Audits Page 347

Table 48: Total Site Energy (kWheq/yr)—Comparison of Asset, Asset+Weather, and House+Weather+Behavior Model Estimates to Utility Reported Usage

Case 9 – EPS (ASSET). Sample N = 101. Correlation (r) = .66. Correlation line slope (b1) = .70 [.54 to .87]; intercept (b0) = 5469 [643 to 10294]. Mean utility-reported usage = 25335 kWheq/yr; mean model-estimated usage = 28259 kWheq/yr (the EPS run average is 11.5% higher than the utility bill average). Mean absolute difference = 6958 kWheq/yr; mean absolute % difference = 33.6%. Model underestimates utility use by more than 50% for 0% of homes and by more than 25% for 8%; is within +/-25% for 55%; overestimates by more than 25% for 37% and by more than 50% for 20%.

Case 11 – Home Energy Scoring Tool (ASSET). Sample N = 101. Correlation (r) = .58. Correlation line slope (b1) = .60 [.43 to .77]; intercept (b0) = 7270 [1936 to 12603]. Mean utility-reported usage = 25335 kWheq/yr; mean model-estimated usage = 30224 kWheq/yr (the Home Energy Scoring Tool run average is 19.3% higher than the utility bill average). Mean absolute difference = 8286 kWheq/yr; mean absolute % difference = 42.2%. Model underestimates utility use by more than 50% for 0% of homes and by more than 25% for 8%; is within +/-25% for 48%; overestimates by more than 25% for 44% and by more than 50% for 27%.

Case 17 – Home Energy Scoring Tool + Weather (ASSET+WEATHER). Sample N = 101. Correlation (r) = .57. Correlation line slope (b1) = .62 [.44 to .79]; intercept (b0) = 7142 [1645 to 12639]. Mean utility-reported usage = 25335 kWheq/yr; mean model-estimated usage = 29480 kWheq/yr (the Home Energy Scoring Tool + Weather run average is 16.4% higher than the utility bill average). Mean absolute difference = 8064 kWheq/yr; mean absolute % difference = 40.9%. Model underestimates utility use by more than 50% for 0% of homes and by more than 25% for 9%; is within +/-25% for 51%; overestimates by more than 25% for 40% and by more than 50% for 27%.

Case 21 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR). Sample N = 101. Correlation (r) = .68. Correlation line slope (b1) = .85 [.67 to 1.02]; intercept (b0) = 3674 [-1174 to 8522]. Mean utility-reported usage = 25335 kWheq/yr; mean model-estimated usage = 25645 kWheq/yr (the HESPro run average is 1.2% higher than the utility bill average). Mean absolute difference = 5785 kWheq/yr; mean absolute % difference = 27.0%. Model underestimates utility use by more than 50% for 0% of homes and by more than 25% for 11%; is within +/-25% for 61%; overestimates by more than 25% for 28% and by more than 50% for 13%.

Behavioral Perspectives on Home Energy Audits Page 348

Figure 28: Scatter Plots of Model Estimated Total Site Energy vs Utility Reported Usage—Asset, Asset+Weather, and House+Weather+Behavior Models. Note: Solid lines represent perfect agreement; dashed line represents best fit

Case 9 – EPS (ASSET); Case 11 – Home Energy Scoring Tool (ASSET)

Case 17 – Home Energy Scoring Tool + Weather (ASSET+WEATHER); Case 21 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR)

Behavioral Perspectives on Home Energy Audits Page 349

Figure 29: Distribution of the Percent Difference Between Model Estimated Total Site Energy and Utility- Reported Usage—Asset, Asset+Weather, and House+Weather+Behavior Models

Case 9 – EPS (ASSET); Case 11 – Home Energy Scoring Tool (ASSET)

Case 17 – Home Energy Scoring Tool + Weather (ASSET+WEATHER); Case 21 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR)

These cases show modest agreement between model results and utility data, with EPS asset estimates showing a somewhat better fit (Case 9) than the Home Energy Scoring Tool asset results (Case 11), which is to be expected given the transmission losses inherent to the input mapping process. Surprisingly, accounting for weather (Case 17) does not appear to improve the Home Energy Scoring Tool results except to shift the mean modeled usage closer to the mean utility reported usage. However,

Behavioral Perspectives on Home Energy Audits Page 350 the weather in Seattle in this period104 was pretty “typical”105 on a full-year basis and therefore the weather adjustment would be expected to have only a minor impact.

The addition of appliance, occupancy, and behavior information to the HESPro model (Case 21) does improve the fit in various respects. The correlation is higher (r=0.68) than for Home Energy Scoring Tool+Weather (Case 17, r=0.57), the means are much closer (a 1.2% difference compared to a 16.4% difference for Home Energy Scoring Tool+Weather), and a lower percentage of model results are more than 25% or 50% different from the utility-reported usage for that home.

“Percent difference in means” expresses the difference between the group’s mean utility-reported usage and the group’s model-estimated usage (the previous two lines in the table). “Mean absolute percent difference,” in contrast, summarizes how well the modeled estimates match the utility-reported consumption for each household. For example, the case 21 (HESPro+weather+behavior) “percent difference in means” result indicates that the mean model estimate for this set of homes is slightly (1.2%) higher than the mean utility-reported total site energy usage for these homes. The HESPro+weather+behavior model estimate for an individual home, however, is on average 27% different (either high or low) from the utility-reported energy use for that home. This second number (“mean absolute percent difference”) is the more pertinent metric from the perspective of considering the applicability of the energy guidance provided to individual homeowners.
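A toy numeric example (hypothetical values, not drawn from the study data) illustrates how the two metrics can diverge: over- and under-estimates cancel in the "percent difference in means" but not in the "mean absolute percent difference".

```python
# Toy illustration (hypothetical numbers, not study data) of why the two
# metrics differ: errors of opposite sign cancel in the group means but not
# in the per-home absolute percent differences.
import numpy as np

utility = np.array([20_000, 25_000, 30_000])   # utility-reported kWheq/yr
model = np.array([26_000, 25_500, 24_000])     # modeled kWheq/yr

pct_difference_in_means = (model.mean() - utility.mean()) / utility.mean()
mean_absolute_pct_difference = np.mean(np.abs(model - utility) / utility)

print(f"{pct_difference_in_means:.1%}")        # 0.7%: group means nearly agree
print(f"{mean_absolute_pct_difference:.1%}")   # 17.3%: individual homes still miss widely
```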

Total Source Energy Use—Model Estimates vs. Utility-Reported
Results for the comparison of total source energy use (MBTU/yr) to utility-reported usage were similar to the results of the site comparison. The results of this comparison are presented for Cases 10, 12, 18, and 22 (see Table 49, Figure 30, and Figure 31).

104 This weather period turns out to be from May 2009 through October 2011, which includes a full year prior to the earliest audit included in the analysis. 105 Close to the TMY2 “typical meteorological year” weather strip

Behavioral Perspectives on Home Energy Audits Page 351

Table 49: Total Source Energy (MBTU/yr)—Comparison of Asset, Asset+Weather, and House+Weather+Behavior Model Estimates to Utility Reported Usage

Case 10 – EPS (ASSET). Sample N = 101. Correlation (r) = .66. Correlation line slope (b1) = .87 [.67 to 1.07]; intercept (b0) = 16 [-25 to 56]. Mean utility-reported usage = 186.6 MBTU/yr; mean model-estimated usage = 196.5 MBTU/yr (the EPS run average is 5.3% higher than the utility bill average). Mean absolute difference = 47.3 MBTU/yr; mean absolute % difference = 31.5%. Model underestimates utility use by more than 50% for 0% of homes and by more than 25% for 14%; is within +/-25% for 53%; overestimates by more than 25% for 33% and by more than 50% for 19%.

Case 12 – Home Energy Scoring Tool (ASSET). Sample N = 101. Correlation (r) = .55. Correlation line slope (b1) = .72 [.50 to .94]; intercept (b0) = 32 [-17 to 80]. Mean utility-reported usage = 186.6 MBTU/yr; mean model-estimated usage = 214.9 MBTU/yr (the Home Energy Scoring Tool run average is 15.2% higher than the utility bill average). Mean absolute difference = 57.8 MBTU/yr; mean absolute % difference = 41.3%. Model underestimates utility use by more than 50% for 0% of homes and by more than 25% for 8%; is within +/-25% for 49%; overestimates by more than 25% for 43% and by more than 50% for 25%.

Case 18 – Home Energy Scoring Tool + Weather (ASSET+WEATHER). Sample N = 101. Correlation (r) = .54. Correlation line slope (b1) = .71 [.49 to .92]; intercept (b0) = 37 [-11 to 85]. Mean utility-reported usage = 186.6 MBTU/yr; mean model-estimated usage = 211.8 MBTU/yr (the Home Energy Scoring Tool + Weather run average is 13.5% higher than the utility bill average). Mean absolute difference = 57.3 MBTU/yr; mean absolute % difference = 40.5%. Model underestimates utility use by more than 50% for 1% of homes and by more than 25% for 8%; is within +/-25% for 50%; overestimates by more than 25% for 42% and by more than 50% for 25%.

Case 22 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR). Sample N = 101. Correlation (r) = .67. Correlation line slope (b1) = .82 [.64 to 1.01]; intercept (b0) = 37 [1 to 72]. Mean utility-reported usage = 186.6 MBTU/yr; mean model-estimated usage = 182.0 MBTU/yr (the HESPro run average is 2.5% lower than the utility bill average). Mean absolute difference = 44.5 MBTU/yr; mean absolute % difference = 26.7%. Model underestimates utility use by more than 50% for 0% of homes and by more than 25% for 19%; is within +/-25% for 59%; overestimates by more than 25% for 22% and by more than 50% for 10%.

Behavioral Perspectives on Home Energy Audits Page 352

Figure 30: Scatter Plots of Model Estimated Total Source Energy vs. Utility Reported Usage—Asset, Asset+Weather, and House+Weather+Behavior Models. Note: Solid lines represent perfect agreement; dashed line represents best fit

Case 10 – EPS (ASSET); Case 12 – Home Energy Scoring Tool (ASSET)

Case 18 – Home Energy Scoring Tool + Weather (ASSET+WEATHER); Case 22 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR)

Behavioral Perspectives on Home Energy Audits Page 353

Figure 31: Distribution of the Percent Difference Between Model Estimated Total Source Energy and Utility- Reported Usage—Asset, Asset+Weather, and House+Weather+Behavior Models

Case 10 – EPS (ASSET); Case 12 – Home Energy Scoring Tool (ASSET)

Case 18 – Home Energy Scoring Tool + Weather (ASSET+WEATHER); Case 22 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR)

These cases again show modest agreement between model results and utility data, with EPS asset estimates showing a somewhat better fit (Case 10) than the Home Energy Scoring Tool asset results (Case 12). Again, accounting for weather (Case 18) does not appear to improve the Home Energy Scoring Tool agreement with utility bills except to shift the mean modeled usage closer to the mean utility reported usage.

The addition of appliance, occupancy, and behavior information to the HES model (Case 22) does improve the fit in various respects. The correlation is higher (r=0.67) than for Home Energy Scoring

Behavioral Perspectives on Home Energy Audits Page 354

Tool+Weather (Case 18, r=.55), means are much closer (2.5% difference compared to a 13.5% difference for Home Energy Scoring Tool+Weather), and a lower percentage of model results are greater than 25% and 50% different from the utility-reported usage for that home.

Electricity Use—Model Estimates vs. Utility-Reported
Results for the comparison of electricity use (kWh/yr) to utility-reported usage are presented for Cases 13, 15, 19, and 23 (see Table 50, Figure 32, and Figure 33).

Table 50: Electricity Usage (kWh/yr)—Comparison of Asset, Asset+Weather, and House+Weather+Behavior Model Estimates to Utility Reported Usage

Case 13 – EPS (ASSET). Sample N = 101. Correlation (r) = .77. Correlation line slope (b1) = .77 [.65 to .90]; intercept (b0) = 2781 [996 to 4566]. Mean utility-reported usage = 11884 kWh/yr; mean model-estimated usage = 11763 kWh/yr (the EPS run average is 1% lower than the electricity bill average). Mean absolute difference = 3703 kWh/yr; mean absolute % difference = 38.1%. Model underestimates utility use by more than 50% for 7% of homes and by more than 25% for 22%; is within +/-25% for 41%; overestimates by more than 25% for 37% and by more than 50% for 19%.

Case 15 – Home Energy Scoring Tool (ASSET). Sample N = 101. Correlation (r) = .74. Correlation line slope (b1) = .75 [.61 to .89]; intercept (b0) = 1983 [-85 to 4053]. Mean utility-reported usage = 11884 kWh/yr; mean model-estimated usage = 13202 kWh/yr (the Home Energy Scoring Tool run average is 11.1% higher than the electricity bill average). Mean absolute difference = 4217 kWh/yr; mean absolute % difference = 50.1%. Model underestimates utility use by more than 50% for 4% of homes and by more than 25% for 15%; is within +/-25% for 40%; overestimates by more than 25% for 45% and by more than 50% for 30%.

Case 19 – Home Energy Scoring Tool + Weather (ASSET+WEATHER). Sample N = 101. Correlation (r) = .73. Correlation line slope (b1) = .75 [.61 to .89]; intercept (b0) = 2044 [-39 to 4127]. Mean utility-reported usage = 11884 kWh/yr; mean model-estimated usage = 13144 kWh/yr (the Home Energy Scoring Tool + Weather run average is 10.6% higher than the electricity bill average). Mean absolute difference = 4225 kWh/yr; mean absolute % difference = 50.1%. Model underestimates utility use by more than 50% for 4% of homes and by more than 25% for 15%; is within +/-25% for 40%; overestimates by more than 25% for 45% and by more than 50% for 30%.

Case 23 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR). Sample N = 101. Correlation (r) = .80. Correlation line slope (b1) = .80 [.68 to .92]; intercept (b0) = 2961 [1350 to 4571]. Mean utility-reported usage = 11884 kWh/yr; mean model-estimated usage = 11140 kWh/yr (the HESPro run average is 6.3% lower than the electricity bill average). Mean absolute difference = 3444 kWh/yr; mean absolute % difference = 32.1%. Model underestimates utility use by more than 50% for 8% of homes and by more than 25% for 26%; is within +/-25% for 51%; overestimates by more than 25% for 23% and by more than 50% for 11%.

Behavioral Perspectives on Home Energy Audits Page 355

Figure 32: Scatter Plots of Model Estimated Electricity Usage vs. Utility Reported Usage—Asset, Asset+Weather, and House+Weather+Behavior Models. Note: Solid lines represent perfect agreement; dashed line represents best fit

Case 13 – EPS (ASSET); Case 15 – Home Energy Scoring Tool (ASSET)

Case 19 – Home Energy Scoring Tool + Weather (ASSET+WEATHER); Case 23 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR)

Behavioral Perspectives on Home Energy Audits Page 356

Figure 33: Distribution of the Percent Difference Between Model Estimated Electricity Usage and Utility- Reported Usage—Asset, Asset+Weather, and House+Weather+Behavior Models

Case 13 – EPS (ASSET); Case 15 – Home Energy Scoring Tool (ASSET)

Case 19 – Home Energy Scoring Tool + Weather (ASSET+WEATHER); Case 23 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR)

These cases show better agreement between model results and utility data than the total energy use comparisons, with EPS asset estimates showing a somewhat better fit (Case 13) than the Home Energy Scoring Tool asset results (Case 15). Again, accounting for weather (Case 19) does not appear to improve the Home Energy Scoring Tool results except to shift the mean modeled usage closer to the mean utility reported usage.

The addition of appliance, occupancy, and behavior information to the HES model (Case 23) does improve the fit in various respects. The correlation is higher (r=0.80) than for Home Energy Scoring

Behavioral Perspectives on Home Energy Audits Page 357

Tool+Weather (Case 19, r=0.73), means are somewhat closer (model results are 6.3% lower than utility usage compared to 10.6% higher for Home Energy Scoring Tool+Weather), and a lower percentage of model results are greater than 25% and 50% different from the utility-reported usage for that home.

Natural Gas Use—Model Estimates vs. Utility-Reported
Results for the comparison of natural gas use (therms/yr) to utility-reported usage are presented for Cases 14, 16, 20, and 24 (see Table 51, Figure 34, and Figure 35).

Behavioral Perspectives on Home Energy Audits Page 358

Table 51: Natural Gas Usage (therms/yr)—Comparison of Asset, Asset+Weather, and House+Weather+Behavior Model Estimates to Utility Reported Usage

Case 14 – EPS (ASSET). Sample N = 68. Correlation (r) = .40. Correlation line slope (b1) = .32 [.13 to .50]; intercept (b0) = 416 [255 to 578]. Mean utility-reported usage = 682 therms/yr; mean model-estimated usage = 836 therms/yr (the EPS run average is 22.6% higher than the gas bill average). Mean absolute difference = 277 therms/yr; mean absolute % difference = 49.6%. Model underestimates utility use by more than 50% for 3% of homes and by more than 25% for 11%; is within +/-25% for 42%; overestimates by more than 25% for 47% and by more than 50% for 31%.

Case 16 – Home Energy Scoring Tool (ASSET). Sample N = 68. Correlation (r) = .37. Correlation line slope (b1) = .31 [.12 to .49]; intercept (b0) = 418 [246 to 589]. Mean utility-reported usage = 682 therms/yr; mean model-estimated usage = 863 therms/yr (the Home Energy Scoring Tool run average is 26.5% higher than the gas bill average). Mean absolute difference = 284 therms/yr; mean absolute % difference = 53.1%. Model underestimates utility use by more than 50% for 3% of homes and by more than 25% for 8%; is within +/-25% for 46%; overestimates by more than 25% for 46% and by more than 50% for 32%.

Case 20 – Home Energy Scoring Tool + Weather (ASSET+WEATHER). Sample N = 68. Correlation (r) = .36. Correlation line slope (b1) = .31 [.11 to .50]; intercept (b0) = 423 [251 to 599]. Mean utility-reported usage = 682 therms/yr; mean model-estimated usage = 828 therms/yr (the Home Energy Scoring Tool + Weather run average is 21.4% higher than the gas bill average). Mean absolute difference = 272 therms/yr; mean absolute % difference = 50.5%. Model underestimates utility use by more than 50% for 3% of homes and by more than 25% for 11%; is within +/-25% for 43%; overestimates by more than 25% for 46% and by more than 50% for 31%.

Case 24 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR). Sample N = 68. Correlation (r) = .52. Correlation line slope (b1) = .50 [.30 to .71]; intercept (b0) = 308 [148 to 468]. Mean utility-reported usage = 682 therms/yr; mean model-estimated usage = 735 therms/yr (the HESPro run average is 7.8% higher than the gas bill average). Mean absolute difference = 202.7 therms/yr; mean absolute % difference = 35.3%. Model underestimates utility use by more than 50% for 3% of homes and by more than 25% for 13%; is within +/-25% for 51%; overestimates by more than 25% for 36% and by more than 50% for 24%.

Behavioral Perspectives on Home Energy Audits Page 359

Figure 34: Scatter Plots of Model Estimated Natural Gas Usage vs. Utility Reported Usage—Asset, Asset+Weather, and House+Weather+Behavior Models. Note: Solid lines represent perfect agreement; dashed line represents best fit

Case 14 – EPS (ASSET); Case 16 – Home Energy Scoring Tool (ASSET)

Case 20 – Home Energy Scoring Tool + Weather (ASSET+WEATHER); Case 24 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR)

Behavioral Perspectives on Home Energy Audits Page 360

Figure 35: Distribution of the Percent Difference Between Model Estimated Natural Gas Usage and Utility- Reported Usage—Asset, Asset+Weather, and House+Weather+Behavior Models

Case 14 – EPS (ASSET); Case 16 – Home Energy Scoring Tool (ASSET)

Case 20 – Home Energy Scoring Tool + Weather (ASSET+WEATHER); Case 24 – HESPro + Weather + Behavior (HOUSE+WEATHER+BEHAVIOR)

These cases showed the least overall agreement between model results and utility data among the various comparisons. EPS asset estimates showed a somewhat better fit (Case 14) than the Home Energy Scoring Tool asset results (Case 16). Again, accounting for weather (Case 20) does not appear to improve the Home Energy Scoring Tool results except to shift the mean modeled usage closer to the mean utility reported usage.

The addition of appliance, occupancy, and behavior information to the HES model (Case 24) does improve the fit in various respects, although the fit remains modest at best. The correlation is higher

Behavioral Perspectives on Home Energy Audits Page 361

(r=0.52) than for Home Energy Scoring Tool+Weather (Case 20, r=0.36), means are closer (model results are 7.8% higher than utility usage compared to 21.4% higher for Home Energy Scoring Tool+Weather), and a lower percentage of model results are greater than 25% and 50% different from the utility- reported usage for that home.

Discussion
Several results from the analysis warrant additional exploration.

Asset Models
Recognizing that asset-based models are not designed to predict actual usage of a household, as they do not consider operational factors, this comparison provided a baseline for determining how much the addition of operational factors—weather and occupant energy use behaviors—changed the model estimates. Asset model estimates and recommendations, when delivered as part of a rapid, low-cost assessment of a home, also merit scrutiny. Model estimates of energy usage inform the upgrade recommendations and savings estimates—when modeled energy usage differs from actual usage, it is likely that savings from completed upgrades will be over- or under-estimated.

We found EPS Auditor and Home Energy Scoring Tool estimates to be moderately consistent with utility-reported usage, but they left a substantial proportion of the sampled homes with large differences between modeled and observed usage, as summarized in Table 52 and Table 53.

Table 52: Percentage of homes where EPS asset model estimated usage was substantially lower or higher than utility reported usage

Share of homes where the model estimate differs from utility-reported use by the stated amount:
Total Site Energy Use (Case 9): more than 50% lower, 0%; more than 25% lower, 8%; more than 25% higher, 37%; more than 50% higher, 20%.
Total Source Energy Use (Case 10): more than 50% lower, 0%; more than 25% lower, 14%; more than 25% higher, 33%; more than 50% higher, 19%.
Electricity Use (Case 13): more than 50% lower, 7%; more than 25% lower, 22%; more than 25% higher, 37%; more than 50% higher, 19%.
Natural Gas Use (Case 14): more than 50% lower, 3%; more than 25% lower, 11%; more than 25% higher, 47%; more than 50% higher, 31%.

Table 53: Percentage of homes where Home Energy Scoring Tool asset model estimated usage was substantially higher or lower than utility reported usage

Share of homes where the model estimate differs from utility-reported use by the stated amount:
Total Site Energy Use (Case 11): more than 50% lower, 0%; more than 25% lower, 8%; more than 25% higher, 44%; more than 50% higher, 27%.
Total Source Energy Use (Case 12): more than 50% lower, 0%; more than 25% lower, 8%; more than 25% higher, 43%; more than 50% higher, 25%.
Electricity Use (Case 15): more than 50% lower, 4%; more than 25% lower, 15%; more than 25% higher, 45%; more than 50% higher, 30%.
Natural Gas Use (Case 16): more than 50% lower, 3%; more than 25% lower, 8%; more than 25% higher, 46%; more than 50% higher, 32%.

For cases where model estimates are substantially higher than the actual usage, quite common in this set of homes, it is likely that households will not achieve the energy or energy cost savings from upgrades as predicted in the model. However, this does not exclude the possibility that homeowners completing these upgrades will benefit—particularly those whose low energy usage is related to curtailing behaviors may see non-energy benefits.

Behavioral Perspectives on Home Energy Audits Page 362

For cases where model estimates are substantially below the actual usage, less common but still occurring in this set of homes, it is possible that opportunities for upgrades are being missed and that the models underestimate savings. However, the high utility usage relative to model estimates in these cases could be related to other large energy uses that are not within the scope of the modeling and wouldn’t necessarily see any improvement from the upgrades (for example, electric vehicle charging).

It is additionally important to note that EPS Auditor provides, for each recommended upgrade, a range of possible savings instead of a single value; the current version of the Home Energy Scoring Tool provides a single value. Providing a range of expected savings does communicate to the homeowner the uncertainty in this estimate. The differences between modeled usage and actual usage, as explored here, are one source of that uncertainty.

Both asset-based models, for this set of households, on average substantially overestimated total energy use—though more for natural gas than for electricity (and therefore site energy estimates were overestimated more than source energy estimates—for those homes using natural gas). This finding is consistent with prior comparisons of model estimates to utility billing data (Polly et al. 2011).

Addition of Weather and Behavior
When weather and behavior (and a few additional house characteristics) are considered in addition to the Home Energy Scoring Tool "asset" inputs, the consistency of model estimates with observed usage improves. This result is robust across the various metrics for each of the four comparisons (site, source, electricity, and gas). This indicates that the models better represent actual usage when basic operational considerations are included. Occupancy and behavior are clearly important in making sense of a household's energy use.

However, despite the improvements noted from including operational considerations in the modeling, large differences between model estimates and utility-reported usage remain for a substantial portion of the 101 homes studied. These results are summarized in Table 54.

Table 54: Percentage of homes where HESPro "house+weather+behavior" model estimated usage is substantially higher or lower than utility reported usage

Share of homes where the model estimate differs from utility-reported use by the stated amount:
Total Site Energy Use (Case 21): more than 50% lower, 0%; more than 25% lower, 11%; more than 25% higher, 28%; more than 50% higher, 13%.
Total Source Energy Use (Case 22): more than 50% lower, 0%; more than 25% lower, 19%; more than 25% higher, 22%; more than 50% higher, 10%.
Electricity Use (Case 23): more than 50% lower, 8%; more than 25% lower, 26%; more than 25% higher, 23%; more than 50% higher, 11%.
Natural Gas Use (Case 24): more than 50% lower, 3%; more than 25% lower, 13%; more than 25% higher, 36%; more than 50% higher, 24%.

After considering operational factors, the HESPro model was very close in average total energy use to average utility-reported usage—effectively correcting the substantial average overestimate evident in the asset-based models. In fact, the HESPro model including operational factors underestimated utility- reported electricity usage by 6%, while overestimating natural gas usage by 8%. This improvement over the asset model is partially the effect of considering the weather, but the majority of this difference

appears to be associated with behavior and the additional "house" inputs. This suggests that either these households are relatively conserving compared to the average household, or alternatively, that the operating conditions and energy use behaviors assumed by the asset models tend towards higher energy use than is observed in these Seattle homes.

Natural Gas Estimates and Heating End Use
Of concern in the modeling comparisons presented above is the lower consistency observed between utility-reported gas use and model estimates. Several factors should be considered in weighing this result. First, the correlation between the EPS and Home Energy Scoring Tool asset models' gas use estimates was lower than for the other model estimates (see Appendix P). This indicates the possibility that the Home Energy Scoring Tool estimates of gas use were degraded by the input conversion process more than the other model estimates were. Second, the gas data were all garnered from a single utility, Puget Sound Energy, and the actual sample of gas data received was relatively small. While the researchers are not aware of any data quality issues with the gas data, it is possible that the variability seen is the result of such an issue. Finally, as natural gas was used primarily for heating and secondarily for water heating and appliances in these Seattle homes,106 it is possible that poor modeling of heating use is the major source of variability—not the gas use in particular.

This possibility was also investigated for electric-heated homes, which are separated from homes with gas in Figure 36. According to audit results for this set of 101 homes, 66 of the 68 homes with natural gas use gas for their primary heating system, while the other 35 homes use electricity for their primary heating system; 33 of these 35 electricity-heated homes are all-electric. Other primary heating fuels, such as oil or wood, were excluded from this analysis. Because this population splits nearly along fuel lines (home uses gas vs. home is all-electric), we used the all-electric population (N=33) to represent the slightly larger set of electric-heated homes (N=35). The scatterplot for Case 23, comparing modeled electricity use to utility-reported use, is presented again in Figure 36, with the 33 all-electric homes distinguished from the 68 homes having gas.

106 For the 68 homes using natural gas included in this analysis, HESPro house+weather+behavior model estimates indicate approximately 77% of gas was used for heating, 15% for water heating, and 8% for appliances, including appliance hot water use.


Figure 36: HESPro+Weather+Behavior model estimated electricity use versus utility-reported electricity use (kWh). All-electric homes (N=33) are distinguished from homes with both gas and electric use (N=68).

Focusing on the all-electric and the gas/electric homes separately, it is apparent that the models did only a moderate job of estimating usage within each group. However, the overall fit when both sets of homes are considered together appears much more in line with the "perfect fit" line. The majority of usage in each of these subgroups was for space heating, suggesting that the heating end use in particular may be poorly estimated in this population, even when models include basic thermostat settings and actual weather history. However, this analysis was subject to various data and modeling limitations,107 so the evidence for this conclusion is inconclusive. If current heating loads are particularly poorly estimated, this would raise additional concerns about the recommendations generated by the tools, as space heating energy use is a primary target of house- and equipment-focused upgrade recommendations.

Increasing Residuals with Increasing Usage

In several of the model cases we observe that the magnitude of the residuals is greater for higher-use homes than for low-use homes. This effect appears to be most evident in the electricity estimates. An example is presented in Figure 37, which plots standardized residuals of model-estimated "house+weather+behavior" total site energy against utility-reported usage. The implication is that at least some of the sources of variability between model estimates and actual usage are larger for higher-using homes, which is what we would expect.
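A minimal sketch of the residual check behind Figure 37 follows; it regresses utility-reported usage on the model estimate and standardizes the residuals in a simple way (no leverage adjustment), with placeholder values standing in for the study data.

```python
import numpy as np

# Placeholder arrays standing in for the study data (kWheq/yr).
model_est = np.array([18200.0, 25400.0, 31000.0, 14800.0, 40200.0, 22100.0])
utility   = np.array([17500.0, 21800.0, 35600.0, 15200.0, 52300.0, 20400.0])

slope, intercept = np.polyfit(model_est, utility, 1)   # simple linear regression
predicted = intercept + slope * model_est
residuals = utility - predicted
standardized = residuals / residuals.std(ddof=2)        # crude standardization (2 fitted parameters)

for p, s in zip(predicted, standardized):
    print(f"predicted = {p:8.0f}   standardized residual = {s:+.2f}")
```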

107 In particular, incomplete or ambiguous house technical characteristics, self-reported energy use behavior including thermostat management and temperature settings, the exclusion of supplemental heating from modeling, and the lack of heating end use utility data separated from total gas or electric usage.


Figure 37: Residuals plotted as a function of the regression predicted value for HESPro “house+weather+behavior” model total site energy compared to utility-reported usage, Case 21

Outliers

In an effort to identify whether any operational factors systematically contribute to poor model estimates, we reviewed the energy use behavior survey responses for the homes with the greatest divergence (above or below) between model estimates and utility-reported usage. Taken as a whole, these "outlier" homes did not point to one or two key factors driving the discrepancies, although a few factors of interest were noted.

For low users (utility-reported usage was much lower than HESPro "house+weather+behavior" model estimates):
• Several of the low energy users were one-person households where that individual worked outside of the home. It is likely that the standard assumptions for those operational factors not informed by inputs from the energy use behavior survey (such as use of electronics, cooking, etc.) tend to overestimate usage for this population.

For high users (utility-reported usage was much higher than HESPro "house+weather+behavior" model estimates):
• A few cases where gas usage was surprisingly high were cases where a gas fireplace was in use; this may indicate that the gas fireplace was inefficient but heavily used.
• For one case where gas usage was surprisingly high, the auditor reported finding a "minor gas leak"; this is a possible contributing factor.
• One very high electric user reported charging their electric vehicle at home, as well as having a hot tub. Large energy uses of this type can clearly contribute to outliers.


A number of households reported supplemental heating use, and in some cases this appears to contribute to outliers. Where supplemental electric heating is used with primary gas heating, or vice versa, outliers in the fuel-specific comparisons are more likely.

Implications

Based on the analysis above, we revisit the question of whether the performance of these models is sufficient for the purposes to which they are being put, particularly for making upgrade recommendations and providing payback/savings estimates for those upgrades.
• These asset models appear to do a mediocre job of predicting overall energy usage. This is not a criticism of the models, but it may suggest a limitation on whether they should be used, as has been suggested, to provide homeowners with quick, lower-resolution upgrade recommendations and savings estimates (Home Energy Score, 2012). If targeted at existing homeowners similar to those in this Seattle sample, these results suggest that modeled savings estimates could be greater than 25% higher than usage for 30% to 40% of households, potentially guiding the household to complete upgrades that are less cost effective by approximately this fraction.108 On the other hand, homes where the asset model significantly underestimates actual usage present possible missed opportunities for upgrades, where upgrades are more compelling than estimated. In this population of homes, at least 8% of homes had model-estimated total energy use 25% lower than actual use. This presents a more difficult challenge, as a variety of other large energy uses are difficult to treat in models and may not make sense in asset models in particular.
• Our operational modeling does better, but even so estimated energy use was not highly correlated with actual use. However, the behavioral data collected were not always as detailed as the energy use behavior inputs to the model, and some aspects of energy use behavior, for example lighting usage, could not be modeled without significant assumptions. In these cases, we relied on the HESPro default representation of these behaviors. In general, the analysis appears to suggest that heating use may be relatively poorly estimated. Additional analysis of utility data to isolate the heating end use is possible and may help clarify whether the heating end use is particularly poorly estimated; this was not within scope for this analysis.

108 The relationship between modeled total energy use and savings from a particular upgrade depends on a variety of factors, so we cannot quantify how this difference will carry through to the savings estimates. However, particularly if heating estimates are a significant source of a discrepancy, savings estimates should on average be off by a fraction similar to that of total usage.


Appendix R: Assessment of Impact of Missing Home Energy Scoring Tool Inputs

Introduction and Purpose

Not all home characteristics used as inputs to the Home Energy Scoring Tool were measured on homes receiving the EPS audit. While most characteristics were measured or estimated, inputs relating to skylights, attic configuration, and roof construction and insulation were not measured. Inputs related to window area, house orientation, and water heater year or efficiency were measured in some EPS audits and not others. In addition, some significant assumptions (including assuming the absence of daylight basements) were necessary when converting inputs gathered for EPS audits into Home Energy Scoring Tool inputs.

Approach

Fifty homes that had received an EPS audit were revisited by a different auditor for a "post-retrofit assessment" of upgrades they made to their homes. In the course of this assessment, auditors were asked to collect data on features of the home that are characterized in the Home Energy Scoring Tool but were not included in the EPS tool inputs. In particular, this included the presence and characteristics of skylights, roof sheathing or insulation, and attic/ceiling configuration (unconditioned, conditioned, or cathedral ceilings). In addition, certain home characteristics were gathered during the original audits but were not consistently captured; this included house orientation, window areas, and the year of water heater installation. These measurements were also collected for the households that undertook the post-retrofit assessments.

For 15 of the 237 cases modeled in the Home Energy Scoring Tool (details provided in Appendix P), this post-retrofit assessment data was utilized to fill in missing or incomplete data from the original conversion of EPS inputs into Home Energy Scoring Tool inputs. By comparing the change in model results when this additional input data was entered, we were able to roughly assess the sensitivity of the Home Energy Scoring Tool to these missing inputs.

Additionally, EPS Auditor and the Home Energy Scoring Tool differed in their treatment of house floor area. Specifically, EPS inputs specified finished floor area or living space, while the Home Energy Scoring Tool inputs specified conditioned area, including unfinished but conditioned spaces. The data from the original EPS audits typically reflected the finished areas of these homes; we asked auditors to collect additional details on the unfinished but conditioned areas of the homes receiving post-retrofit assessments. In approximately fifteen of the 50 homes receiving a post-retrofit assessment, auditors indicated that unfinished but conditioned spaces were present. Post-retrofit assessment data on these conditioned but unfinished floor areas were not, however, reliable enough to include in the analysis described below. In general, homes with conditioned but unfinished floor areas would typically have been modeled in the original audits as having less conditioned floor area than is actually the case. This would lead to lower estimates of the heating and cooling loads for the home than if the unfinished but conditioned floor area were included in the modeling.


Results

For these 15 cases, the data added to the model and the resulting change in total site energy use are presented in Table 55 below.

Table 55: 15 Houses with Post-Retrofit Assessment Additional Information vs. Home Energy Score Results based on Original Audit Information

House | Home Energy Scoring Tool Estimated Total Site Energy based on Original Audit (kWheq/yr) | Home Energy Scoring Tool Estimated Total Site Energy with Additional Inputs (kWheq/yr) | % Change in Total Site Energy Estimate | Nature of Additional Inputs
1 | 47358 | 41000 | -13% | Daylight basement (reduced from 2 stories on slab to 1 story with conditioned basement); added sizeable skylight
2 | 39317 | 38855 | -1% | Added direction + 2006 water heater year; daylight basement (reduced from 2 stories to 1 story + unconditioned basement)
3 | 35595 | 35859 | 1% | Daylight basement (reduced from 4 stories on slab to 3 stories w/conditioned basement); foundation insulation modified to match
4 | 58117 | 58158 | 0% | Daylight basement (reduced from 3 stories on slab to 2 stories w/conditioned basement); added house orientation and 2008 water heater year; added small skylight
5 | 39279 | 39279 | 0% | Added house orientation; small skylight
6 | 21541 | 21583 | 0% | Added house orientation, 1993 water heater year, small skylight
7 | 32628 | 31819 | -2% | Added house orientation, 2005 water heater year
8 | 24913 | 29396 | 18% | Daylight basement (reduced from 3 stories w/unconditioned basement to 2 stories with conditioned basement); cathedral ceiling, larger skylights, wood shake shingles, house orientation and 2001 water heater year
9 | 44834 | 45167 | 1% | Added small skylight, house orientation and 2005 water heater year; could not map daylight basement (excessive window size); did not map mixed window styles
10 | 34259 | 34259 | 0% | No additional info
11 | 25254 | 24441 | -3% | Added small skylight, house orientation and 2005 water heater year
12 | 29615 | 29268 | -1% | Added window dimensions, house orientation and 2009 water heater year, roof w/EPS sheathing, composition shingles; did not add roof R-11 insulation as the Home Energy Scoring Tool can only handle attic OR roof insulation, and the attic had superior insulation
13 | 45721 | 44912 | -2% | Added small skylight, house orientation, 2011 water heater year
14 | 52620 | 51811 | -2% | Added house orientation, 2006 water heater year; did not map mixed window styles
15 | 38388 | 36813 | -4% | Added medium skylight, 2011 water heater year, house orientation
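The "% Change" column in Table 55 is simply the relative change in the estimated total site energy after the additional inputs were entered, as the short check below illustrates for houses #1 and #8.

```python
def pct_change(original_kwheq: float, updated_kwheq: float) -> float:
    """Relative change in estimated total site energy after adding inputs."""
    return 100.0 * (updated_kwheq - original_kwheq) / original_kwheq

print(round(pct_change(47358, 41000)))  # -13 (house #1)
print(round(pct_change(24913, 29396)))  # 18 (house #8)
```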

Implications

Two of the cases in Table 55 above (#1 and #8) showed a substantial change in modeled total site energy use. Notably, in both of these cases the additional information reflected the presence of a daylight basement, which is modeled with a different set of inputs in the Home Energy Scoring Tool than in EPS Auditor and affects the Home Energy Scoring Tool inputs for number of stories and foundation type. Because the EPS audit inputs do not include information on whether a daylight basement is present in a home, the default assumption used when converting inputs from EPS to the Home Energy Scoring Tool was that no daylight basement was present. This additional information in some cases resulted in changes to the number of stories and foundation type inputs and in Home Energy Scoring Tool estimated usage. Of note, even with this additional information this set of inputs is potentially problematic. The auditor's evaluation of home foundation type, number of stories above grade, and conditioned vs. unconditioned spaces is subject to interpretation, given that many homes have multiple foundation types and various combinations of conditioned, unconditioned, finished, and unfinished spaces. This need for interpretation is likely a large source of variability in model inputs and may be a major source of differences when model results are compared to actual utility usage.

The remaining cases show relatively minor changes in modeled total site energy usage, based on the additional types of data added to the model inputs (skylights, house orientation, water heater year, and roof and attic configuration). While 15 is a small sample size, it appears likely that for most cases, the missing input data was a minor limitation to Home Energy Scoring Tool modeling.


Appendix S: Comparison of Recommendations—EPS Auditor and Home Energy Scoring Tool

Introduction and Purpose

The evidence from homeowner survey responses indicates that the upgrade recommendations, more than the score, are the most useful output of the EPS audit report for the Seattle homeowners receiving audits and responding to surveys (see Chapter 4). Audit recommendations are generated based on the current condition of the home and the likely cost and savings associated with the upgrades. But for any particular home, which upgrades are best, by what criteria, and how much will they save?

We compare EPS recommendations with Home Energy Scoring Tool recommendations for a set of 31 homes modeled using both tools. Home Energy Scoring Tool recommendations are generated using inputs translated from the EPS inputs and entered into the online Home Energy Scoring Tool, with necessary assumptions and approximations to fill missing inputs and overcome ambiguities in translation.109 These Home Energy Scoring Tool runs were completed mid-January 2012. EPS recommendations were taken from the audit report for the home. We focused exclusively on the “standardized” recommendations provided in the “Summary of Recommended Energy Upgrades” portion of the EPS report; additional recommendations and recommendation details were provided in many cases in other portions of the report.110 These “standardized” recommendations, which include model-generated cost and savings estimates, can be most directly compared to Home Energy Scoring Tool recommendations.

We sought to address the following questions:
• Are recommendations consistent across tools? Are cost/savings/payback estimates consistent?
• Are model differences or assumptions made evident by this comparison that are not otherwise evident in model results?
• Do these recommendations provide useful guidance to the homeowner on whether and how best to upgrade their home?

Approach

First, we compared all of the recommendations, in aggregate, generated from the EPS home energy audits and from our Home Energy Scoring Tool runs for the 31 homes. This comparison provided some indication of the overall differences and similarities in the recommendations generated for the same home using the two tools.

Next, we selected the four most commonly recommended upgrade categories in the Home Energy Scoring Tool "repair now" section of the report so as to maximize sample for comparison of individual upgrade recommendations.

109 See the description of the EPS->Home Energy Scoring Tool input converter utility in Appendix N.
110 See Chapter 1.3 for additional description of the different upgrade recommendation details provided in EPS reports.

For these 31 houses, these four categories represented all of the "repair now" recommendations made by the Home Energy Scoring Tool. We did not attempt to compare the "do when it is time to replace" recommendations, as the tools differ substantially in how this type of recommendation is managed and presented. The four categories we analyzed were air sealing, attic insulation, floor or basement insulation, and duct insulation.

One factor that could confound this analysis is inconsistency of the modeled heating estimates from the two tools. Large differences in estimated heating use will likely translate to large differences in savings estimates and payback on average.111 Conversely, small differences in estimated heating use are an indicator that the models are comparable in their heating estimates. As can be seen in Figure 38, EPS and Home Energy Scoring Tool heating end use estimates are reasonably consistent within the sample of 31 homes used in the comparison (r=.85, with 72% of EPS space heating estimates within +/-25% of Home Energy Scoring Tool heating end use estimates, and all model runs within +/-50% of each other).

Figure 38: Asset model estimated heating use (site kWheq/yr): EPS vs Home Energy Scoring Tool, for 31 houses included in the recommendation comparison; note that the line represents perfect agreement

While this result is based on a small sample and compares at the aggregate level of total heating usage, it provides evidence that differences in savings estimates and payback between EPS and the Home Energy Scoring Tool are likely to be the result of differences in how the savings estimates and payback results are generated rather than resulting directly from large differences in modeled heating usage.

111 This argument is a simplification, but is valid to the degree that the heating use is proportional to the various upgrade characteristics (air leakage rate, insulation levels, etc.) we are comparing. Therefore, the comparison between EPS and Home Energy Scoring Tool total heating estimates is treated as an indicator that the underlying heating estimates are comparable, and that therefore large differences in savings estimates for recommended upgrades are likely the result of other factors.


Results – Comparison of Recommendations in Aggregate

We compared all of the upgrades recommended for the 31 homes by EPS auditors and by the beta release of the Home Energy Scoring Tool as of January 2012.112 These recommendations are listed in Table 56 and Table 57.

Table 56: All Home Energy Scoring Tool recommendations for 31 homes. Note that some recommendations are presented as “repair now”, while others are presented as “replace later”

Home Energy Scoring Tool Report Recommendation | Number of Homes Receiving Recommendation | Percent of Homes Receiving Recommendation
Repair now: These improvements will save you money, conserve energy, and improve your comfort now
Attic: Increase attic floor insulation to [range from R-19 to R-60]. | 13 | 42
Basement/crawlspace: Insulate the floor above unconditioned space to at least [range from R-19 to R-38]. | 12 | 39
Air tightness: Have a professional seal the gaps and cracks that leak air into your home. | 10 | 32
Ducts: Add insulation around ducts in unconditioned spaces to at least R-6. | 5 | 16
Basement: Add insulation to walls to R-19. | 4 | 13
Seal ducts | 0 | 0
Replace later: These improvements will help you save energy when it's time to replace or upgrade
Water heater: Pick one with an ENERGY STAR label. | 31 | 100
Furnace: Pick one with an ENERGY STAR label. | 25 | 81
Windows: Pick ones with an ENERGY STAR label. | 19 | 61
Boiler: Pick one with an ENERGY STAR label. | 4 | 13
Central Air: Pick one with an ENERGY STAR label. | 2 | 6
Heat Pump: Pick one with an ENERGY STAR label. | 1 | 3
Siding: Add insulating sheathing underneath it to R-5. | 1 | 3

Table 57: All EPS report “standardized” recommendations for 31 homes. No distinction is made about doing now or when replacing equipment

EPS Audit Report Recommendation | Number of Homes Receiving Recommendation | Percent of Homes Receiving Recommendation
Seal air leaks to reduce leakage (air leakage rate remains above .35 ACHn). | 25 | 81
Insulate attic to R-49. | 21 | 68
Seal all seams and joints on ductwork and plenum with mastic paste. | 17 | 55
Dense pack uninsulated wall cavity with cellulose insulation. | 15 | 48
Replace refrigerator with an ENERGY STAR refrigerator. | 14 | 45
Wrap ducts with fiberglass insulation. | 13 | 42
Fully insulate floor joist cavity with fiberglass or cellulose (Savings assumes R30). | 11 | 35
Upgrade to condensing gas furnace. | 11 | 35
Upgrade to high efficiency windows. | 10 | 32
Replace washing machine with an ENERGY STAR washing machine. | 9 | 29
Replace dishwasher with a high efficiency dishwasher. | 8 | 26
Install photovoltaic panels. | 7 | 23
Install a tankless water heater. | 6 | 19
Install a high efficiency gas water heater. | 5 | 16
Install a solar water heater and use your current water heater as a backup. | 5 | 16
Install a high efficiency electric water heater. | 4 | 13
Install tight-fitting storm windows. | 4 | 13
Reduce air leakage rate below .35 ACHn and install mechanical ventilation. | 3 | 10
Install a solar water heater and new backup tank. | 3 | 10
Insulate floor cavity with spray foam. | 2 | 6
Install an ENERGY STAR heat pump. | 2 | 6
Install a ductless mini-split heat pump. | 2 | 6
Install mechanical ventilation to provide adequate make-up air. | 1 | 3
Sprayfoam 2x4-inch roof-deck cavity | 1 | 3
Upgrade to condensing boiler and integrated domestic hot water heating system | 1 | 3
Add rigid foam to the exterior of the house when re-siding. (Savings based on R5 rigid foam) | 1 | 3
Install a heat-pump water heater. | 1 | 3

112 Because a Beta version of the Home Energy Scoring Tool was used in this analysis, criteria for selection of upgrades may change, as may the details and wording of the actual upgrades. Home energy modeling tools are generally a work in progress and are subject to change over time as algorithms are upgraded and errors are found and corrected; this analysis can therefore only represent these tools at a particular point in time.

This comparison shows substantial differences between the recommendations generated using the two tools. The EPS recommendations drew from a much larger portfolio of upgrades, including appliance replacement,113 the addition of mechanical ventilation, and a variety of water heater upgrade options, but do not differentiate in terms of when it would make the most sense to complete the upgrades. EPS audits also recommended certain common upgrades (e.g., air sealing, attic insulation, duct sealing, and wall insulation) much more often than the Home Energy Scoring Tool. On the other hand, the Home Energy Scoring Tool recommends, for most or all homes, to "pick one with an ENERGY STAR label" when upgrading heating systems, water heaters, or windows. In total, EPS audits made 202 recommendations for these 31 homes (6.5 per home on average), while the Home Energy Scoring Tool made 127 recommendations for these 31 homes (4.1 per home), with an average of 1.4 of these being "repair now".
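The per-home averages quoted above follow directly from the "Number of Homes" columns of Table 56 and Table 57; the short tally below reproduces them (recommendation labels omitted for brevity).

```python
# "Number of Homes" columns from Table 56 (Home Energy Scoring Tool) and Table 57 (EPS)
hes_repair_now    = [13, 12, 10, 5, 4, 0]
hes_replace_later = [31, 25, 19, 4, 2, 1, 1]
eps_standardized  = [25, 21, 17, 15, 14, 13, 11, 11, 10, 9, 8, 7, 6, 5, 5,
                     4, 4, 3, 3, 2, 2, 2, 1, 1, 1, 1, 1]
homes = 31

hes_total = sum(hes_repair_now) + sum(hes_replace_later)  # 127
eps_total = sum(eps_standardized)                          # 202

print(f"Home Energy Scoring Tool: {hes_total} recommendations, "
      f"{hes_total / homes:.1f} per home, {sum(hes_repair_now) / homes:.1f} 'repair now' per home")
print(f"EPS: {eps_total} recommendations, {eps_total / homes:.1f} per home")
```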

113 EPS recommendations for replacement of dishwashers, washing machines, and refrigerators highlight the subjectivity in the scope of energy uses included in “asset” home energy modeling; the Home Energy Scoring Tool excludes these appliances from the scope of the model inputs, instead relying on assumptions regarding appliance efficiency, and not making recommendations on appliance upgrades.


Results – Comparison of Individual Recommendations

Air Sealing Recommendations

Air sealing recommendations from both EPS Auditor and the Home Energy Scoring Tool were evaluated across the 31 houses. Coinciding recommendations (where a home received the same recommendation in both cases) were compared in terms of estimated cost, savings, and payback. Results are presented in Table 58, Table 59, and Figure 39.

Table 58: Summary of air sealing recommendations made in EPS Auditor and the Home Energy Scoring Tool for the 31 home sample

Recommended in Home Energy Scoring Tool | Number Coinciding | Recommended in EPS
Air tightness: Have a professional seal the gaps and cracks that leak air into your home. N=10 | 10 | Seal air leaks to reduce leakage (air leakage rate remains above .35 ACHn). N=25
N/A | 0 | Reduce air leakage rate below .35 ACHn and install mechanical ventilation. N=3
N/A | 0 | Install mechanical ventilation to provide adequate make-up air. N=1

Table 59: Details of the 10 air sealing recommendations that coincided between EPS Auditor and the Home Energy Scoring Tool

 | Home Energy Scoring Tool (N=10) | EPS Auditor (N=10)
Average Estimated Cost ($) | $800-900 (1) | $400-2000
Average Estimated Savings ($/yr) | $96 | $171 (2)
Average Payback (years) | 9 (7-10) | 10 (3-24) (3)

1 Calculated from savings and payback information
2 Only a range is provided in the EPS report
3 We calculated this payback as it is not provided in the EPS report
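The back-calculations noted in these footnotes rest on the simple relationship that payback is roughly cost divided by annual savings. The sketch below illustrates this with the air sealing averages; it is an approximation applied to averaged values, so per-home results will differ.

```python
def cost_from_savings_and_payback(savings_per_yr: float, payback_yr: float) -> float:
    """Approximate upgrade cost implied by reported annual savings and payback."""
    return savings_per_yr * payback_yr

def payback_from_cost_and_savings(cost: float, savings_per_yr: float) -> float:
    """Approximate payback implied by a reported cost and annual savings
    (applied per home for the EPS rows, then averaged)."""
    return cost / savings_per_yr

# Home Energy Scoring Tool air sealing averages: $96/yr savings and roughly a 9-year payback
print(cost_from_savings_and_payback(96, 9))   # 864, consistent with the ~$800-900 shown above
```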


Figure 39: Asset model estimated air sealing savings: EPS vs Home Energy Scoring Tool, for the 10 houses with coinciding recommendations; note that the line represents perfect agreement

Of note for interpreting these recommendations, the consumer is not provided with an indication of what improvement in air leakage is expected or assumed when calculating these savings and costs. Because this assumed improvement could differ between the two tools, the estimates are difficult to compare. According to the engineering documentation, Home Energy Scoring Tool calculations are based on a 25% reduction in the existing leakage (Home Energy Saver: Engineering Documentation 2012).

Also of note, four homes received EPS recommendations of a type not seen in the Home Energy Scoring Tool, including mechanical ventilation. In one of these cases, the air leakage rate appears to have been below .35 ACHn (natural air changes per hour) at the time of the audit; this triggered a corrective recommendation, for health and safety reasons, to ensure adequate ventilation. However, this recommendation was not made in the Home Energy Scoring Tool, even though the information needed to calculate ACHn was provided. It appears that a check for inadequate ventilation, and the resulting recommendation to add mechanical ventilation, is out of scope for the Home Energy Scoring Tool, possibly because it is not intended to be primarily a diagnostic tool, even though it appears to collect inputs sufficient for this calculation.
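A minimal sketch of such a ventilation check follows. It is a simplification of the EPS logic described above (the actual EPS recommendation set also includes reducing leakage and adding ventilation together), and the function name and structure are illustrative only.

```python
MIN_VENTILATION_ACHN = 0.35  # threshold cited in the report (natural air changes per hour)

def air_leakage_guidance(achn: float) -> str:
    """Simplified flag: tight homes get a ventilation recommendation, leaky homes
    get an air sealing recommendation. Real tools apply additional criteria."""
    if achn < MIN_VENTILATION_ACHN:
        return "Install mechanical ventilation to provide adequate make-up air."
    return "Seal air leaks to reduce leakage."

print(air_leakage_guidance(0.28))
print(air_leakage_guidance(0.62))
```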

As can be seen in Figure 39 above, EPS estimated savings for air sealing upgrades vary widely, while Home Energy Scoring Tool estimated savings vary only modestly. This may indicate that EPS assumes different levels of reduction in leakage, depending on the starting leakage rate.

Attic Insulation

Attic insulation recommendations from both EPS audit reports and the Home Energy Scoring Tool were evaluated across the 31 houses. Coinciding recommendations were compared in terms of estimated cost, savings, and payback. Results are presented in Table 60, Table 61, and Figure 40.


Table 60: Summary of attic insulation recommendations made in EPS Auditor and the Home Energy Scoring Tool for the 31 home sample

Recommended in Home Energy Scoring Tool | Number Coinciding | Recommended in EPS
Attic: Increase attic floor insulation to R-19 (N=3), R-30 (N=2), R-38 (N=7), or R-60 (N=1); total N=13 | 11 | Insulate attic to R-49. N=21
N/A | 0 | Sprayfoam 2x4-inch roof-deck cavity N=1

Table 61: Details of the 11 attic insulation recommendations that coincided between EPS Auditor and the Home Energy Scoring Tool

 | Home Energy Scoring Tool (N=11) | EPS Auditor (N=11)
Average Estimated Cost ($) | $1215 (1) | N/A (cost provided is per square foot)
Average Estimated Savings ($/yr) | $132 | $132 (2)
Average Payback (years) | 9 (8-10) | N/A (cost provided is per square foot)

1 Calculated from savings and payback information
2 Only a range is provided in the EPS report

Figure 40: Asset model estimated attic insulation savings: EPS vs Home Energy Scoring Tool, for the 11 houses with coinciding recommendations; note that the line represents perfect agreement


There are some input differences between the two tools: a separate set of inputs for attic, wall, and floor/foundation insulation was gathered as part of the original audits for use as Home Energy Scoring Tool inputs, because the EPS inputs have different granularity. Additionally, we do not have audit information for these homes on the attic configuration; some homes may therefore be limited in practice in their insulation possibilities (for example, if a home has cathedral ceilings). The auditor may have considered this information when selecting EPS recommendations.

In Figure 40, we can see that EPS estimated savings vary considerably less than Home Energy Scoring Tool estimated savings. The range of recommended insulation levels from the Home Energy Scoring Tool (R-19 to R-60) likely contributes to this difference.

Floor or Basement Insulation

Floor or basement insulation recommendations from both EPS Auditor and the Home Energy Scoring Tool were evaluated across the 31 houses. Coinciding recommendations were compared in terms of estimated cost, savings, and payback. Results are presented in Table 62, Table 63, and Figure 41.

Table 62: Summary of floor or basement insulation recommendations made in EPS Auditor and the Home Energy Scoring Tool for the 31 home sample

Recommended in Home Energy Scoring Tool | Number Coinciding | Recommended in EPS
Basement/crawlspace: Insulate the floor above unconditioned space to at least R-19 (N=1), R-25 (N=2), or R-38 (N=9); total N=12 | 7 | Fully insulate floor joist cavity with fiberglass or cellulose (Savings assumes R30). N=11; Insulate floor cavity with spray foam. N=2; total N=13
Basement: Add insulation to walls to R-19. N=4 | 0 | N/A

Table 63: Details of the 7 floor or basement insulation recommendations that coincided between EPS Auditor and the Home Energy Scoring Tool

 | Home Energy Scoring Tool (N=7) | EPS Auditor (N=7)
Average Estimated Cost ($) | $1121 (1) | N/A (cost provided is per square foot)
Average Estimated Savings ($/yr) | $173 | $95 (2)
Average Payback (years) | 7 (4-10) | N/A (cost provided is per square foot)

1 Calculated from savings and payback information
2 Only a range is provided in the EPS report


Figure 41: Asset model estimated floor or basement insulation savings: EPS vs Home Energy Scoring Tool, for the 7 houses with coinciding recommendations; note that the line represents perfect agreement

As can be seen in Table 63, above, the Home Energy Scoring Tool estimated savings were nearly twice those of EPS on average, although these estimates varied dramatically from house to house.

In interpreting this comparison, it is important to note that foundation type was one of the more difficult features to map from the EPS inputs to the Home Energy Scoring Tool inputs. The interpretation required in this process likely produced some of the mismatch between the recommendations from the two tools. Five houses received floor insulation recommendations from EPS where a conditioned basement or slab foundation was indicated; these could be errors, but more likely represent complex house configurations where the auditor is making reasonable recommendations that conflict with reasonable model inputs. It is interesting, however, that in no case did the EPS "standardized" recommendations include insulating the basement walls, even though this recommendation is within the existing capability of the EPS tool.

In Figure 41 we again see the Home Energy Scoring Tool savings estimates vary over a considerably larger range than the EPS estimates, likely because the Scoring Tool recommends different insulation levels for these homes (R-19 to R-38).

Duct Insulation

Duct insulation recommendations from both EPS Auditor and the Home Energy Scoring Tool were evaluated across the 31 houses. Coinciding recommendations were compared in terms of estimated cost, savings, and payback. Results are presented in Table 64, Table 65, and Figure 42.


Table 64: Summary of duct insulation recommendations made in EPS Auditor and the Home Energy Scoring Tool for the 31 home sample

Recommended in Home Energy Scoring Tool | Number Coinciding | Recommended in EPS
Ducts: Add insulation around ducts in unconditioned spaces to at least R-6. N=5 | 5 | Wrap ducts with fiberglass insulation. N=13

Table 65: Details of the 5 duct insulation recommendations that coincided between EPS Auditor and the Home Energy Scoring Tool

 | Home Energy Scoring Tool (N=5) | EPS Auditor (N=5)
Average Estimated Cost ($) | $890 (1) | $200-700
Average Estimated Savings ($/yr) | $160 | $44 (2)
Average Payback (years) | 6 (5-6) | 16 (4-28) (3)

1 Calculated from savings and payback information
2 Only a range is provided in the EPS report
3 We calculated this payback as it is not provided in the EPS report

Figure 42: Asset model estimated duct insulation savings: EPS vs Home Energy Scoring Tool, for the 5 houses with coinciding recommendations; note that the line represents perfect agreement

As seen in the table above, the average estimated savings for this very small set of houses are more than three times as large for the Home Energy Scoring Tool ($160) as for EPS ($44). This three-fold difference is not explained by the total heating estimate differences, which would account for only approximately 25% higher Home Energy Scoring Tool savings. It therefore may indicate a difference in assumptions between the two tools regarding ducting system efficiency or regain, although the sample size is too small to be conclusive. Additionally, Home Energy Scoring Tool cost estimates ($890) are approximately twice those of EPS ($450) on average, which is difficult to reconcile if the same task is being performed.


There were 8 cases where duct insulation was recommended in EPS reports but not by the Home Energy Scoring Tool. This can be explained by differences in tool resolution and imperfections in conversion between the tools. The Scoring Tool allows a single duct location (conditioned space, vented crawlspace, etc.), while EPS allows separate specification of the percentage of ducting in an unconditioned attic and in an unconditioned basement or crawlspace. Therefore, during the mapping process, homes with less than 50% of their ducting in unconditioned spaces were modeled in the Home Energy Scoring Tool as having their ducting in conditioned space. This explained 5 of the 8 non-coinciding homes that received the recommendation from EPS but not from the Home Energy Scoring Tool. Also, the finer gradation of insulation level inputs in EPS (settings for none, R-4, and R-8) versus the Home Energy Scoring Tool (insulated or not insulated) enables EPS recommendations in cases where the same information is not available to the Home Energy Scoring Tool.
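A minimal sketch of the duct-location mapping described above follows. Only the under-50% rule is stated in the text; the handling of the remaining cases shown here is an assumption, and the function is illustrative rather than the actual converter described in Appendix N.

```python
def map_duct_location(pct_uncond_attic: float, pct_uncond_bsmt_crawl: float) -> str:
    """Collapse EPS's split duct-location percentages into the single location
    accepted by the Home Energy Scoring Tool."""
    if pct_uncond_attic + pct_uncond_bsmt_crawl < 50:
        # Stated rule: mostly-conditioned ducting is modeled as in conditioned space,
        # so no duct insulation recommendation is triggered.
        return "conditioned space"
    # Assumed handling of the remaining cases: attribute the ducts to whichever
    # unconditioned space holds more of them.
    if pct_uncond_attic >= pct_uncond_bsmt_crawl:
        return "unconditioned attic"
    return "unconditioned basement/crawlspace"

print(map_duct_location(30, 10))  # 'conditioned space'
print(map_duct_location(70, 0))   # 'unconditioned attic'
```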

Implications

• The "catalog" of possible upgrade recommendations from the two tools is quite different, with EPS drawing from a wider set of possible upgrades. The Home Energy Scoring Tool differentiates upgrades into "repair now" and "replace later" recommendations. EPS auditors tend to recommend certain common upgrades (air sealing, attic insulation, duct sealing, and wall insulation) much more often than the Home Energy Scoring Tool. On the other hand, the Home Energy Scoring Tool recommends certain "replace later" upgrades at very high rates. These differences contribute to our observation that there is a substantial divergence between the upgrade guidance produced using these two tools.
• Upgrade savings estimates from the two tools generally appear to vary widely for any particular upgrade for any particular home. Several factors may contribute to this:
  o Savings will depend on the upgrade level (how much insulation, how much reduction in air leakage); the two tools appear to apply different assumptions or optimization routines to set the upgrade level in each category.
  o The underlying heating usage estimates vary somewhat between the tools, which contributes some variability to savings estimates.
  o In particular cases, as may be true for the duct insulation recommendations, the underlying models could be significantly different, making modeled savings more or less sensitive to upgrades of that element.
• Upgrade cost estimates from the two tools also differ, although we are unable to compare estimates for attic and floor insulation costs because these are provided in EPS per square foot, and attic or basement area is calculated internally by the model and is not available in the report outputs. EPS provides a cost range instead of a discrete value, while we calculated the Home Energy Scoring Tool cost from savings and payback information.114 For air sealing, the Scoring Tool calculated average cost is reasonably close to the middle of the EPS estimated cost range; for duct insulation, the Scoring Tool cost estimate is substantially higher than the EPS range.

114 Costs, savings, and payback calculations were performed using data gathered from the Home Energy Scoring Tool reports generated by the online tool mid-January, 2012. In early February 2012, the report format was modified. In the new format, payback information is no longer presented. Therefore, cost information cannot be calculated either.


• The number of homes receiving upgrade recommendations across the four analyzed categories was also different. EPS audits generally provided more recommendations, with a total of 77 recommendations across these four categories compared to 44 from the Home Energy Scoring Tool.
• While the primary recommendation type in each of these four categories coincided well, less frequent recommendations in these categories did not (for example, insulation of basement walls or installation of mechanical ventilation). This may be the result of small sample sizes or of tool recommendation capabilities differing in these details. Of note, the Home Energy Scoring Tool does not appear to flag or identify cases where the ventilation in a home drops below the recommended level (0.35 ACHn), even though the model inputs appear to include the information needed to evaluate this. This seems like an important feature for the tool to have, as aggressive air sealing and insulation can result in inadequate ventilation (Manuel 2011). However, this feature may be omitted due to the non-diagnostic focus of the Home Energy Scoring Tool application.


Appendix T: Model Assumptions of “Standard” Operation Compared to Reported Energy Use Behaviors

Introduction and Purpose

Both EPS Auditor and the Home Energy Scoring Tool produce a model-based asset score representing the energy performance of a house. To model the home's asset energy performance, various assumptions are utilized to represent "standard" occupancy and occupant energy use behaviors. Even in the Home Energy Saver Pro (HESPro) tool, which includes a variety of operational inputs, defaults are provided and inputs are limited in ways that may affect whether and how energy use behaviors are represented in the model.

Three related questions are addressed in this analysis, which compares energy use behavior survey responses to EPS, Home Energy Scoring Tool, and HESPro assumptions and defaults:
• Do modeling tool "standard occupant behavior" assumptions represent typical Seattle survey respondents?
• Does use of a "standard behavior" assumption in these models represent most respondents reasonably well, or is there high variability in respondent behavior relative to the assumed standard behavior?
• Are important behaviors not represented in the models?

To address these questions, we compare responses to the Energy Use Behavior Survey from over 300 Seattle homeowners with the assumptions and inputs of EPS, the Home Energy Scoring Tool, and HESPro.

Approach

We reviewed the Energy Use Behavior Survey results in the context of key asset model assumptions regarding the operation of the home, as well as the HESPro operational model default settings. Where these assumptions differ between the tools, this is noted below. While a very wide variety of energy use behaviors can affect a home's energy use, we selected a handful of behaviors for which data were available and which are perceived to contribute substantially to the total energy use of a home: thermostat use and settings; supplemental heating use; occupancy (which has a small direct effect but primarily an indirect effect through use of the home's systems, number of electronic devices, etc.); large appliance use; and whether someone is usually home during the day.

Information on many of these energy use behaviors has been collected by survey at the national level through the U.S. DOE Energy Information Administration's periodic Residential Energy Consumption Surveys (see http://205.254.135.7/consumption/residential/index.cfm) as well as in other studies (e.g., Parker, Fairey, and Hendron 2010; Hendron and Engebrecht 2010; Hendron et al. 2004; Lutz et al. 1996). The results of these surveys have sometimes been used to define many of the "typical" energy use behavior assumptions utilized in asset-based home energy models (e.g., occupancy, large appliance usage, and miscellaneous electricity loads). Instead of revisiting those assumptions directly, we focus on characterizing the heterogeneity in energy use behaviors reported by program participants as a means to assess to what degree "typical" energy use behavior assumptions are able to characterize this population.

Results

Heating Thermostat Settings: Survey participants were asked a variety of questions about heating and cooling thermostat use and settings. We focus on reporting weekday heating results only, as these settings account for the majority of heating usage and weekend responses tended to be similar to weekday responses. Participants were asked to select their weekday thermostat settings for four time periods: mornings (6am-9am), daytime (9am-5pm), evenings (5pm-10pm), and overnight (10pm-6am). Participants indicated whether their thermostat was set to: Off, 59F or less, 60-65F, 66-67F, 68-69F, 70-71F, 72-73F, 74-79F, or 80F or greater. Of course, households' thermostat usage often will not fit neatly into these bins; these results should therefore be treated as generally indicative rather than quantitatively precise.

The thermostat setting assumptions utilized by EPS, the Home Energy Scoring Tool, and HESPro are listed in Table 66.

Table 66: Thermostat Setting Assumptions Used in Modeling Tools

 | EPS Auditor | Home Energy Scoring Tool | HESPro Default (Standard Thermostat); configurable
Daytime (8am-5pm) | 72F | 64F | 60F
Night (5pm-8am) | 72F | 68F | 68F

We used the self-reported thermostat settings from the energy use behavior survey to determine typical thermostat settings, as well as typical setback patterns, in this population of Seattle homeowners. Median temperature settings, drawn from these response categories, are provided in Table 67.

Table 67: Self-reported thermostat settings from survey respondents

 | Number of responses | Median Setting
Morning (6am-9am) | 310 | 66-67F
Daytime (9am-5pm) | 310 | 60-65F
Evening (5pm-10pm) | 312 | 68-69F
Overnight (10pm-6am) | 310 | 60-65F

These results indicate, not surprisingly, that morning and evening are the periods with the highest thermostat settings.


We next looked at whether these Seattle households reported turning the heating temperature setting down or off (manually or by a programmed setback) during the daytime and overnight periods.115 Results indicate that setbacks were common during both the day (141 of 308 households, 46%) and overnight (227 of 308 households, 74%). Nearly half of these households reported using thermostat setbacks during both the day and overnight (135 of 308, 44%).

The use of thermostat setbacks, particularly overnight, appears to be quite common in this population, and this may not be adequately represented in either EPS or the Home Energy Scoring Tool. Of course, the thermostat setting assumptions are just one factor affecting heating energy usage estimates; some houses have no thermostat at all, others have two or more heating systems, and zoning may vary. Many houses, as described in the next section, utilize supplemental heating. Thermostat setting differences therefore represent only some of the variability that would be expected in heating energy use.
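For reference, the setback definition used in this analysis (footnote 115) can be expressed as a small check over the four reported periods. The sketch below treats the day as circular (morning follows overnight), which is an assumption, and uses the median bin midpoints from Table 67 as illustrative settings.

```python
PERIODS = ["morning", "daytime", "evening", "overnight"]

def find_setbacks(settings: dict) -> list:
    """Return the periods that qualify as setbacks: a period whose setting is lower
    than both the preceding and the following period (day treated as circular)."""
    n = len(PERIODS)
    setbacks = []
    for i, period in enumerate(PERIODS):
        before = settings[PERIODS[(i - 1) % n]]
        after = settings[PERIODS[(i + 1) % n]]
        if settings[period] < before and settings[period] < after:
            setbacks.append(period)
    return setbacks

# Example: 66-67F mornings, 60-65F daytime, 68-69F evenings, 60-65F overnight,
# using bin midpoints as illustrative numeric settings.
print(find_setbacks({"morning": 66.5, "daytime": 62.5, "evening": 68.5, "overnight": 62.5}))
# -> ['daytime', 'overnight']
```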

Supplemental Heating Use: In general, supplemental heating uses, including portable electric heaters, gas and wood fireplaces, and baseboard and wall heating (where these are not part of the home’s primary or secondary heating systems), are assumed not to be present in the home as modeled in EPS Auditor or the Home Energy Scoring Tool, as the supplemental heating equipment and usage are not considered to be pertinent to the asset-based “score” generated for the home. Instead, the tools assume that all necessary heating is performed by the primary (or for EPS Auditor, primary and secondary) heating system(s).

More than 60% of survey respondents (200 of 322) indicated that they at least sometimes use supplemental heating such as gas or wood fireplaces, portable electric heaters, baseboard heaters, other wall or floor heating systems, and heating blankets, as illustrated in Figure 43.

115 A setback was defined for this analysis as a period where the temperature setting was lower during that period than both the period before and the period after.


Figure 43: Self-reported supplemental heating use

For these homes, a variety of supplemental heating sources were used, but portable electric heaters were the most common. The use of the various supplemental heating sources is described in Table 68.

Table 68: Number of respondents reporting using a particular supplemental heating source one or more hours per week (some report more than one type). Total sample N=322

Supplemental Heating Source | Number of Respondents | Percent of Respondents
Gas fireplace(s) | 48 | 14.9%
Wood fireplace(s) | 23 | 7.1%
Portable electric heater(s) | 97 | 30.1%
Baseboard heater(s) | 45 | 14.0%
Other heating sources (responses included electric blankets, wall heaters, floor heaters) | 22 | 6.8%

In some of these cases what respondents classified as supplemental heating may have been identified as a secondary heating system in EPS Auditor—particularly baseboard or wall heating systems. Therefore, in some proportion of cases EPS modeling will include information on supplemental heating. Only one heating system for the home can be modeled in the Home Energy Scoring Tool and in HESPro.116 Considering the relatively frequent use of supplemental heating by the respondents, this omission may represent an important limitation for the tools.

116 HESPro has an input allowing the user to indicate the portion of the home heated with a wood burning stove or portable electric heater; however, this supplemental heating use is not included in the energy use modeling results. HESPro also allows input of user-specified end uses, which could presumably be used to manually specify secondary or supplemental heating systems.


Occupancy

EPS Auditor and the Home Energy Scoring Tool, as asset tools, utilize standard values for occupancy of the home. Both tools currently estimate occupancy based on the number of bedrooms in the home. In order to evaluate this assumption, we compared self-reported number of occupants of the home to the number of bedrooms recorded from the EPS audit. These are plotted in Figure 44 below, which also includes the EPS and Home Energy Scoring Tool "standard" estimates.

Figure 44: Self-reported and model estimated occupancy versus number of bedrooms in home. Jitter was added to self-reported values (on both X and Y axis) to make density of points visible. N=312.

A linear regression analysis was performed using number of bedrooms to predict self-reported occupancy. The resulting regression line (r2=.387, intercept=1.07 [.61 to 1.52], b=.51 [.37 to .64], t=7.39, p<.001) is consistent, within tolerance limits, with the basic formula utilized by the Home Energy Scoring Tool to predict occupancy (Hendron and Engebrecht, 2010):

Occupants = .59*(Number of Bedrooms) + .87

The Home Energy Scoring Tool rounds this calculated occupancy to the nearest whole number. EPS utilizes a similar relationship between number of bedrooms and occupants, which can be seen in the figure.
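Putting the formula and the rounding step together, the Home Energy Scoring Tool occupancy assumption can be sketched as follows (how ties at exactly .5 are handled is not specified in the text; the example values avoid that case).

```python
def hes_estimated_occupants(num_bedrooms: int) -> int:
    """Occupancy assumed by the Home Energy Scoring Tool:
    occupants = 0.59 * bedrooms + 0.87, rounded to the nearest whole number."""
    return round(0.59 * num_bedrooms + 0.87)

# e.g. a 3-bedroom home: 0.59 * 3 + 0.87 = 2.64, which rounds to 3 assumed occupants
for bedrooms in range(1, 6):
    print(bedrooms, "bedrooms ->", hes_estimated_occupants(bedrooms), "occupants")
```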


While these model assumptions appear to fit our sample population, there is significant variability in the actual occupancy around these “standard” values. This variability is one factor that can reduce the efficacy of asset model estimates in reflecting actual energy use in a home.

It is also relevant to note that a number of survey respondents, when asked about other factors that could affect their household’s energy use, indicated that the number of people living in their home had changed (for example, a new baby) over the period of the study, or that their household size was variable with frequent visitors or household members living with them only part of the time. This factor therefore hides significant additional complexity that could affect energy usage in various ways, and a single occupant number (or the number of occupants in various age groups, as entered into HESPro) is likely still only a gross approximation of the actual household in many cases.

Whether a Household Member is Usually Home During the Day: An important portion of a household's energy use is a function of how often household members are home to consume that energy: most notably hot water use at faucets, but also use of various electronics, willingness to set back the thermostat while away, and so on. We therefore asked survey respondents how often someone was home during the day.

EPS Auditor does not appear to explicitly assume that anyone is home or not home; this factor is instead embedded in domestic hot water, appliance, and electronics usage calculations, which are in turn based on occupancy and other model factors. The Home Energy Scoring Tool does assume that someone is at home during the day in Seattle; this assumption varies depending on the climate zone of the house (Home Energy Saver: Engineering Documentation 2012). HESPro allows the user to indicate whether someone is at home during the day or not. In the Home Energy Scoring Tool and HESPro, when someone is indicated as being home during the day, this translates into approximately an additional 10 gallons per day of hot water usage (presumably from tap water use), but does not have any other major effect on the model results.

Figure 45: Response to question about whether someone is home during the day

Typically, during weekday daytime hours (8 am to 5 pm), would you say that? | Frequency | Percent
Did not respond | 11 | 3.4
Somebody is home only occasionally (less than 25% of the time) | 113 | 35.1
Somebody is home less than half the time (25% to 49% of the time) | 29 | 9.0
Somebody is home about half the time (50% to 74% of the time) | 47 | 14.6
Somebody is usually home (75% of the time or more) | 122 | 37.9
Total | 322 | 100.0

As can be seen in Figure 45, more than half of respondents (169) indicate that someone is at home at least 50% of the time. Interestingly, there is a fairly clean split: 75% of respondents indicate that someone was either usually home (75% of the time or more) or only occasionally home (less than 25% of the time). This seems to support the yes/no model criterion in place in HESPro; including additional choices would not improve modeling much for these households. However, the Home Energy Scoring Tool assumption that someone is usually at home during the day poorly represents the more than one-third of households that report someone being only occasionally home during the day. Because this factor specifically affects modeled hot water usage in the Home Energy Scoring Tool, for many households where someone is not usually home, recommendations for water heater upgrades would be based on an assumption of more hot water usage (and reflect greater savings) than might actually be the case. This presents a challenging situation for asset modeling: neither choice is a good one for the population as a whole. One alternative would be to split the difference, assuming that someone is home approximately half of the time (with the associated hot water usage). Either way, this factor likely represents only a relatively modest effect compared to other sources of variability in hot water usage between households (such as appliance use and efficiency and shower duration).
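A minimal sketch of the daytime-occupancy adjustment discussed above follows, assuming the roughly 10 additional gallons of hot water per day cited for the Home Energy Scoring Tool and HESPro; the "split the difference" option corresponds to a 0.5 at-home fraction.

```python
EXTRA_GAL_PER_DAY_IF_HOME = 10.0  # approximate adjustment cited above; treated here as exact

def daytime_hot_water_adjustment(fraction_of_days_home: float) -> float:
    """Extra daily hot water (gallons) implied by an assumed at-home fraction:
    1.0 = assume someone is home, 0.0 = assume nobody is home, 0.5 = split the difference."""
    return EXTRA_GAL_PER_DAY_IF_HOME * fraction_of_days_home

for label, frac in [("assume home", 1.0), ("assume not home", 0.0), ("split the difference", 0.5)]:
    print(f"{label}: +{daytime_hot_water_adjustment(frac):.0f} gal/day")
```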

Large Appliance Use

Use of large appliances such as dishwashers, washing machines, and clothes dryers represents a significant use of energy and hot water in the home and depends directly on household members. Table 69 presents the assumptions utilized by the modeling tools to represent usage of these appliances.

Table 69: Large appliance use model assumptions

 | EPS Auditor | Home Energy Scoring Tool | HESPro (Default; configured by user)
Dishwasher loads per week | Appliance hot water usage is based on assumed occupancy of home (in turn based on number of bedrooms) | 3 | 3
Washing machine loads per week, by cycle temperatures: | Clothes washing hot water usage is based on assumed occupancy of home (in turn based on number of bedrooms) and washing machine efficiency input | |
  Hot wash/warm rinse | | 0 | 0
  Hot wash/cold rinse | | 0 | 0
  Warm wash/warm rinse | | 1 | 1
  Warm wash/cold rinse | | 2 | 2
  Cold wash/cold rinse | | 3 | 3
Dryer loads per week | Based on occupancy of home (in turn based on number of bedrooms) | 5 | 5

Home Energy Scoring Tool and HESPro assume that appliances are of standard efficiency; HESPro can be configured for efficient appliances. EPS has an input to indicate whether the washing machine is standard or efficient.

Energy Use Behavior Survey respondents reported their use of dishwashers, washing machines, and clothes dryers in loads per week, as well as their typical clothes washing cycle settings. These responses are summarized in Figure 46, Figure 47, Figure 48, and Figure 49, below.


Figure 46: Dishwasher use as reported by energy use behavior survey respondents (N=321); average (mean and median) approximately 3 loads per week

Figure 47: Energy use behavior survey responses on typical washing machine cycle setting; respondents could only select one setting for the wash cycle and one setting for the rinse cycle

    Typical Washing Machine Cycle Temperatures    Frequency   Valid Percent   Cumulative Percent
    Cold Wash, Cold Rinse                               133            42.2                 42.2
    Cold Wash, Warm Rinse                                 5             1.6                 43.8
    Warm Wash, Cold Rinse                               131            41.6                 85.4
    Warm Wash, Warm Rinse                                39            12.4                 97.8
    Hot Wash, Cold Rinse                                  4             1.3                 99.0
    Hot Wash, Hot Rinse                                   3             1.0                100.0
    Total                                               315           100.0


Figure 48: Washing machine use reported by energy use behavior survey respondents (N=322), average (mean and median) approximately 4 loads per week

Figure 49: Clothes dryer use reported by energy use behavior survey respondents (N=316), averaged over the whole year (fall/winter and spring/summer usage was collected separately); average (mean and median) was between 3 and 4 loads per week


Overall, the "standard occupant" appliance usage assumptions in the Home Energy Scoring Tool are broadly similar to what the survey respondents reported: dishwasher usage and clothes washing temperatures are similar. However, reported washing machine and clothes dryer loads per week were lower for these Seattle respondents than the Home Energy Scoring Tool assumes (approximately 4 washer loads and 3 to 4 dryer loads per week reported, compared to 6 and 5 loads per week assumed, respectively).
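For reference, the comparison described in the preceding paragraph amounts to lining up the reported weekly loads from Figures 46 through 49 against the standard-occupant assumptions in Table 69. A minimal sketch of that comparison is shown below, with the reported values rounded as in the text.

```python
# Reported (survey average, rounded as in the text) versus assumed
# (Home Energy Scoring Tool / HESPro default) weekly appliance loads.
# Values are taken from the figures and Table 69 above.

loads_per_week = {
    # appliance: (reported, assumed)
    "dishwasher":      (3, 3),
    "washing machine": (4, 6),
    "clothes dryer":   (3.5, 5),   # reported value between 3 and 4 loads/week
}

for appliance, (reported, assumed) in loads_per_week.items():
    diff = reported - assumed
    print(f"{appliance:15s} reported ~{reported}, assumed {assumed} ({diff:+.1f} loads/week)")
```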

Other Large Energy Uses

One factor important in comparing modeled energy use to utility-reported usage is whether the household has major energy-using equipment that is not captured in the model or the audit. Examples of these uses include electric vehicle charging, kilns, saunas, air compressors, medical equipment, shop tools, hot tubs, dehumidifiers, heated greenhouses, pool pumps, pool heating systems, air filtration systems, sewage and well pumps, and elevators. Depending on the scope of the audit or assessment, these uses may or may not be accounted for in the model. For example, while the EPS Auditor tool allows the auditor to account for many of these energy uses in the scores and energy use estimates, the Home Energy Scoring Tool excludes these uses. HESPro allows some of these uses to be modeled explicitly; others can be configured manually. However, the actual usage associated with these devices likely varies and may be hard to estimate.
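Where such uses are known (for example, from auditor flags), one way to make the comparison to utility-reported usage more meaningful is to add rough allowances for them to the modeled estimate. The sketch below illustrates the idea; the per-device annual kWh values are placeholder guesses used only for the illustration, not estimates from the audits or the tools.

```python
# Sketch of adjusting a modeled annual energy estimate for large energy uses
# that the asset model excludes, before comparing to utility-reported usage.
# The per-device annual kWh figures below are placeholder guesses for
# illustration only; actual usage varies widely and is hard to estimate.

HYPOTHETICAL_ANNUAL_KWH = {
    "hot_tub": 2000,
    "electric_vehicle": 2500,
    "pool_pump": 1500,
    "well_pump": 400,
}

def adjusted_model_estimate(modeled_kwh: float, flagged_uses: list[str]) -> float:
    """Add rough allowances for unmodeled large energy uses to a model estimate."""
    extra = sum(HYPOTHETICAL_ANNUAL_KWH.get(use, 0) for use in flagged_uses)
    return modeled_kwh + extra

# Example: a home modeled at 12,000 kWh/year that also has a hot tub.
print(adjusted_model_estimate(12000, ["hot_tub"]))  # 14000
```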

According to the EPS audit data and responses to the energy use behavior survey, approximately 24% of households (77 of 322) had one or more of these large energy uses, which could increase their utility usage beyond what would be expected from the audit results. About two thirds of these 77 households were flagged by auditors as having one or more of these large energy uses—which were then incorporated into the EPS audit results.

Additionally, a small number of surveyed homes reported having on-site solar panel systems. Such generation systems likely interfere with the comparison between audit estimates and utility-reported usage for this small set of cases.

Water Heater Temperatures

Water heating energy use depends on the efficiency of the system, the water heater temperature setting, hot water usage, and incoming water temperature. While system efficiency is estimated in both the EPS audits and Home Energy Scoring Tool assessments, usage and temperature settings are treated as operational factors, and standardized assumptions are therefore used. Hot water usage is calculated based on occupancy and assumed usage of appliances, tap water, and showers, along with assumed appliance efficiency.117 The water heater temperature setting is assumed to be 120 F in the current Home Energy Scoring Tool; the corresponding assumption in EPS Auditor is not known.
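The dependence described above can be made concrete with the standard water heating energy relation: volume of hot water, times the density and specific heat of water, times the temperature rise from inlet to setpoint, divided by system efficiency. The sketch below evaluates it for illustrative inputs; the daily usage, setpoint, inlet temperature, and efficiency values are assumptions chosen for the example, not values used by EPS, the Home Energy Scoring Tool, or HESPro.

```python
# Simple water heating energy estimate: energy = volume * density * specific
# heat * temperature rise / system efficiency. All input values below are
# illustrative assumptions, not the assumptions used by the modeling tools.

GAL_TO_LITERS = 3.785
WATER_DENSITY_KG_PER_L = 1.0
SPECIFIC_HEAT_KJ_PER_KG_K = 4.186
KWH_PER_KJ = 1.0 / 3600.0

def annual_water_heating_kwh(gallons_per_day: float,
                             setpoint_f: float,
                             inlet_f: float,
                             efficiency: float) -> float:
    """Annual site energy (kWh) to heat water from inlet to setpoint temperature."""
    delta_c = (setpoint_f - inlet_f) * 5.0 / 9.0
    liters_per_year = gallons_per_day * GAL_TO_LITERS * 365
    energy_kj = (liters_per_year * WATER_DENSITY_KG_PER_L
                 * SPECIFIC_HEAT_KJ_PER_KG_K * delta_c)
    return energy_kj * KWH_PER_KJ / efficiency

# Example: 50 gal/day, 120 F setpoint, 55 F inlet water, 0.9 recovery efficiency.
print(round(annual_water_heating_kwh(50, 120, 55, 0.9)))
```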

EPS auditors collected a measurement of hot water temperature; however, this is not used as an input to asset modeling. Typically the measurement was achieved by running the shower closest to the hot water heater at full heat until a stable temperature was reached; a thermometer was then used to measure this hot water temperature. While we expect that in most cases this measured temperature is close to the heater temperature setting, we cannot confirm this. Figure 50 presents a histogram of measured hot water temperatures.

117 EPS Auditor includes an input for clothes washer efficiency, while HESPro includes inputs for clothes washer and dishwasher efficiency.

Figure 50: Measured hot water temperatures from EPS home energy audits, prior to any adjustment that may have been made by an auditor or in response to the audit

As can be seen in the histogram, the great majority of houses (272 of 319) had hot water temperatures between 115 F and 130 F, with the majority of those close to 120 F. This is consistent with the current assumption utilized in the Home Energy Scoring Tool and the default in HESPro, 120 F. Less than 10% of these homes (27 of 319 homes) had measured water temperatures greater than 130 F.

Implications

The use of heating systems, thermostat settings, and supplemental heating varies considerably, much more than is captured in the asset models. Much of this variation is a result of how households operate their homes.

Other operational factors were reviewed: occupancy, whether someone is home during the day on weekdays, appliance usage, the presence of other large energy uses, and hot water temperature. With the exception of hot water temperature, each of these factors varied substantially within this population. This indicates that asset model assumptions about "standard" occupant behavior, as well as the capability of operational modeling to capture the subtleties of actual household energy use behaviors and equipment, are likely to be quite limited.

Taken together, and for many of the individual inputs, the survey respondents appeared, as a whole, to behave in ways that would be expected to use less energy than the "standard" household assumed in the tools. This may be at least partially a result of self-reporting and of the self-selection of audit recipients and survey respondents.

To quantify this effect, we used HES models to isolate the effect on estimated energy usage of integrating household-specific behaviors rather than the "standard occupant behaviors" assumed in the asset tool. The result for each of the 101 households with behavioral surveys and complete utility data (the sample from Appendix Q, which represents a subset of the homes included elsewhere in this appendix) is presented in Figure 51. For each house, we compared the Home Energy Scoring Tool "asset+weather" modeled energy use to the HESPro "house+weather+behavior" modeled energy use (see Appendix Q), additionally controlling for tool differences in the lighting estimation algorithm and "other large energy uses". Certain other technical house factors that were modeled in the "house+weather+behavior" model but not in the "asset+weather" model (see Appendix Q) were not controlled for in this comparison and could affect the results.
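The comparison in Figure 51 reduces to a per-house fractional change between the two model runs. A minimal sketch of that calculation is shown below; the paired estimates are placeholders standing in for the 101-house sample, and the variable names are ours.

```python
# Fractional change in modeled total site energy use when household-reported
# behaviors (and some appliance efficiency inputs) are included, relative to
# the asset-style "standard occupant" run. The example pairs below are
# placeholders; the study sample contains 101 houses with utility data.

asset_estimates_kwh    = [14000, 11000, 18000, 9000]   # asset + weather run
behavior_estimates_kwh = [12500, 10800, 14000, 9500]   # house + weather + behavior run

fractional_change = [
    (behavior - asset) / asset
    for asset, behavior in zip(asset_estimates_kwh, behavior_estimates_kwh)
]

mean_change = sum(fractional_change) / len(fractional_change)
print([f"{c:+.1%}" for c in fractional_change])
print(f"Mean fractional change: {mean_change:+.1%}")
```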

Figure 51: Histogram of fractional change in model estimated total site energy use of a home when the household’s energy use behaviors and some appliance efficiency inputs were included in the HESPro model (compared to Home Energy Scoring Tool asset assumptions).

The results of the comparison indicate that, using just the limited set of self-reported behaviors included in the modeling exercise, most households had lower modeled energy use when their energy use behaviors were considered. That is, while this comparison does not completely control for house and equipment input differences between model runs, most households reported more energy-conservative behavior, overall, than is assumed in standard modeling practice. While this likely is partly a function of this particular population, it may also be a result of non-representative asset model assumptions. Figure 51 also indicates that this limited set of additional inputs makes a substantial difference in model results, with a mean reduction in model-estimated usage of approximately 12% but a wide range of changes. Since this result captures only a subset of influential energy use behaviors, the actual variability could be greater.
