3 Ways to Make Energy Efficiency Emotional


Posted on EnergySavvy
March 25, 2011


EnergySavvy Research: Testing DOE’s Home Energy Score

One of the critical jobs for a consumer product marketer is to make an emotional connection with potential customers – to create a desire to buy that isn’t based on a reasoned evaluation of the product’s costs and benefits. How else does Apple convince everyone to upgrade their iPhones every year, when each new version adds just a few cool features but costs as much as the previous one? Or to buy the new iPad 2 when, by most accounts, it’s pretty much the same thing as the original iPad?

Selling Emotion: If it works for Apple, it can work for home energy retrofits.

Now take our industry – home energy efficiency. On a rational evaluation of costs and benefits, we’ve got a far stronger product than Apple does. The problem is that consumers don’t make buying decisions rationally. So what can we do to sell our product – home energy retrofits – on its emotional appeal?

Plenty. There’s lots to be learned from applying the lessons of behavioral economics to selling home energy retrofits. In this article, we’re going to focus on what works and what doesn’t in visual design – specifically, the design of energy efficiency reports.

A|B testing Home Energy Score visuals

Over the past several months, EnergySavvy has worked with the U.S. Department of Energy to conduct an online test of potential visual elements for the new national Home Energy Score (currently in pilot test by the DOE), to determine what design elements have greater visual appeal and create an “emotional tug” for homeowners to further engage with home energy upgrades.

The basic framework of the Home Energy Score design was released in November by the DOE: a 1-10 scale that compares your house with other homes in your area and projects where you could be with upgrades.

In an online A|B test we set up, we tested 12 variations of the original design to isolate the engagement-rate differences associated with relatively minor changes, such as emphasizing the dollar savings number over the score difference (or vice versa), personalizing the language used in the scale, changing color elements, etc.

In the test, users were directed to a simple landing page with the caption “How Efficient is Your Home?” and a prominent display of the design with or without a variation. Despite relatively subtle differences in many variations, there were significant differences in the conversion rates. (A “conversion” in this test was defined as a click-through on a button next to the visual, labeled “Take the Survey” – implying that there would be some effort associated with the next step.)
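The article doesn’t describe the test infrastructure, but a common way to run an experiment like this is to bucket each visitor deterministically into one of the variations, so that repeat visits always show the same design. A minimal sketch (the variation names are illustrative, not the DOE’s actual labels):

```python
import hashlib

# Control plus 12 variants, matching the test described above
# (names are illustrative placeholders).
VARIATIONS = ["control"] + [f"variation_{i:02d}" for i in range(1, 13)]

def assign_variation(visitor_id: str) -> str:
    """Deterministically bucket a visitor so repeat visits see the same design."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return VARIATIONS[int(digest, 16) % len(VARIATIONS)]
```

Hashing the visitor ID (rather than randomizing on each page load) keeps the experience consistent per visitor and spreads traffic roughly evenly across the 13 buckets; a “conversion” is then simply a logged click on the “Take the Survey” button, keyed by the assigned variation.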

What we learned from the experiment

The variations that had the highest engagement rates had the following characteristics:

  • Personalized Language: Making the language and design of the visual more personal – using “Your Home” instead of “Current Score,” or adding friendly “home” icons – produced statistically significant increases in conversion rates.

  • Estimate Energy Savings over Multiple Years: Showing a five-year dollar savings instead of an annual dollar savings calculation increased the conversion rate significantly, even though the five-year number was just the one-year number multiplied by five.

  • Simplify the Result: Showing an estimated dollar savings alone, without also showing a target score, significantly increased the conversion rate.

These are all very small differences, but each resulted in statistically significant differences in engagement rates in an A|B test. And presenting a house’s energy score is just one of the many opportunities in a home energy efficiency program to either engage or disengage a customer.
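The article doesn’t say which significance test was used; a standard choice for comparing two click-through rates is the two-proportion z-test, sketched below (the function name and example counts are illustrative):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a / n_a: conversions and visitors for variant A;
    conv_b / n_b: the same for variant B.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value
```

For example, 100 conversions out of 1,000 visitors versus 150 out of 1,000 yields a z statistic around 3.4 and a p-value well below 0.05 – a significant lift, even though the underlying design change may be small.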

What energy efficiency programs can learn and apply

To generalize what we learned from the quantitative A|B tests on the DOE Home Energy Score design, we’ve drawn a few conclusions that can be applied to visual design, creative, and copy for home energy efficiency programs generally:

  1. Keep it simple: Simpler, friendlier language and less data always “won” our A|B test cases. In an industry full of building science nerds (and we use that term affectionately), we’ve all got to fight the urge to overwhelm homeowners with technical details and data. Research by Columbia University professor Sheena Iyengar showed a correlation between the number of 401(k) plan choices employees at a company had and their participation rate: the more plan choices they had, the lower the participation rate in any of the options.
  2. Personalize: Personalized language (“Your Home” and “Efficient Homes in Your Neighborhood” for example) and visuals performed well in our tests. It works to convince hotel guests to reuse their towels: “Other people who stay in this particular hotel room reused their towels at least once during their stay”. Maybe it’ll work to encourage home energy efficiency work.
  3. Make it Long-Term: Given the multi-year measure lives of most energy efficiency improvements and the average tenure of homeowners, it seems legitimate to show a five-year savings estimate (at least!). While behavioral economics studies show that people aren’t always motivated more by larger dollar amounts, the finding that people responded more to the five-year savings number than to the equivalent annual number fits with documented social psychology research. (In one study, subjects were willing to switch telephone providers for an average savings of $56 per year when the savings were presented as an annual number. But when the savings were presented as a monthly number, the average needed to be only $11/month – the equivalent of $132/year!)
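The framing arithmetic behind points 2 and 3 is trivial but worth making explicit: the “bigger” number is the same savings expressed over a longer horizon. A small helper (the function name is my own, not from the study):

```python
def reframe_savings(annual_dollars: float, horizon_years: int = 5) -> dict:
    """Express the same annual savings on different time horizons.

    The monthly, annual, and multi-year figures are all equivalent;
    only the framing changes.
    """
    return {
        "monthly": round(annual_dollars / 12, 2),
        "annual": round(annual_dollars, 2),
        f"{horizon_years}_year": round(annual_dollars * horizon_years, 2),
    }
```

So a retrofit saving $300/year can be framed as $25/month or $1,500 over five years – identical economics, but (per the test above) the five-year framing drew significantly more engagement.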

Other research and analysis from EnergySavvy

For more information on the methodology of this test, or to ask other questions, contact EnergySavvy. Or learn more about EnergySavvy for Programs.
