
Rapid usability feedback drives smarter solutions
Context (the business):
The EPA BEC (Building Energy Calculator) helps assess a building's energy performance by comparing its energy use intensity (EUI) to similar buildings. It aids owners, managers, and energy professionals in identifying areas for efficiency improvements and tracking energy use. The tool integrates with ENERGY STAR Portfolio Manager for benchmarking. It offers insights into energy-saving strategies and supports regulatory compliance, helping buildings reduce costs, improve sustainability, and achieve ENERGY STAR certification.
Project goals:
The BEC is designed for building owners and managers, energy professionals and consultants, facility managers, and government agencies. Prior to its launch, I was brought on as a consultant to conduct a usability study with five representative users. The goal was to test the product’s functionality and user experience, identify potential issues, and gather feedback to refine the tool before its official release.
Design goals:
Uncover problems in the design
Discover opportunities to improve the design
Learn about user behavior and preferences
Role
My role on this project was User Experience Researcher. I was a consultant within an agency that specialized in federal government projects.
Timeline: three weeks.
Why this case study?
I’m adding this case study to highlight the value of conducting quick research to enhance a product’s user value, ultimately driving adoption and usage. It also demonstrates a key research method used during the iterative stage of product development.
Approach
Five areas of functionality were tested:
Logging into the BEC
Ingesting data
Emission Baseline user interface
Forecasting user interface
Downloading data
The usability tasks associated with each functionality area were outlined (and color-coded), totaling seven tasks across the five areas.
Five clients participated in the usability tests. Note that I was not involved in the recruitment process.
Analysis
All sessions were recorded and imported into Dovetail.
Observations were coded, both inductively and deductively, into meaningful themes (e.g., features, pain points, tasks).
A final severity rating was applied to each functional area using a standardized definition.
These findings were communicated to the EPA client, along with recommendations for specific improvements.
Findings
Impact & Insights
“Ingesting data” into the BEC was the most severe problem area
Recommended changes were to:
Make all calendar functionality consistent across the product and better aligned with the standardized calendar controls users encounter in other products.
Improve UX writing to be clearer and more consistent across the product with regard to ingestion. For example, three different words were used for the same action (ingest, pull, get).
Impact: the recommendations rated severe were adopted.
Impact: usability testing value better understood.
UAT checks the box that the product was built as specified. Usability testing tells us whether the product was built the way someone would actually use it. This distinction resonated with the account team, who became interested in adding usability sessions to future projects.