EPA Building Emissions Calculator
Background
Portfolio Manager is software developed by the EPA's ENERGY STAR program to measure and compare a building's energy consumption and emissions. A new companion tool was under development: the EPA Building Emissions Calculator ("the EPA Calculator"). This application would take data from Portfolio Manager and provide users with additional analytic information, such as forecasting.
When I joined this project, the EPA Calculator was scheduled for production in less than two months. This was a simple usability project; the timeline made things a bit more interesting. At first, owing to a lack of prior exposure to Research, it was believed that usability tasks could be written without access to the software. With some swift convincing, I gained access and wrote a meaningful discussion guide.
Approach
I chose a qualitative, moderated usability study. I did not have access to a usability tool (such as UserTesting.com) to capture metrics like time on task, and I knew turnaround would be quick for this project. I therefore used severity ratings as defined by NN/g and, following established research heuristics, tested with five participants.
I coded qualitative data in Dovetail using a deductive coding method, analyzing findings by task, feature, design area, and usability issue.
To communicate findings to Engineers, I used the constructs "feature" and "design area."
Team structure
On this project I was the lead and sole Researcher. My principal contacts were the agency Subject Matter Expert who interfaced with the EPA, the Project Manager, the EPA client, and the Engineers.
Findings
Two features were found to have the highest severity ratings (3 = major usability problem; 4 = usability catastrophe):
Calendar functionality: the EPA Calculator used two calendars that differed from each other and from typical software calendar controls. While both were awkward for users, one was especially problematic, requiring me or others on the call to walk participants through its use. This was so painful for the sponsor that, after two participants, they no longer wished to see the feature tested.
UX microcopy and UX copy: UX writing proved to be an issue throughout the application. Tooltips were overly wordy and went unread; content was missing or incorrect; and language was inconsistent. These issues appeared both in small text snippets (UX microcopy) and in larger swaths of descriptive writing (UX copy).
Impact
Direct product/service impact:
Research led to direct changes in the product prior to its next launch.
I provided Engineers with links to calendar best practices and suggested which calendar functionalities could work.
I offered UX writing changes (those that did not require SME input).
Strategic impact:
In this part of the business, a new understanding of Research and its value was gained. That's a win!