
Fixing a Catastrophically Bad Design

Goal:  Solve vexing usability issues with a major sales-management data warehouse.


My Roles:  Researcher, leader, negotiator, product designer.

Impact:  $2 million investment saved – for $40K.

Circle the Customer, Ecolab

Situation

Ecolab is a huge company, involved in over a dozen market verticals.

At one point, the company realized it was possible for salespeople from several of these verticals to be sitting in the lobby at one of their big institutional clients – McDonald's, Hilton, Yum Foods – without even knowing the others were there selling Ecolab.

Marketing opportunities were being lost.

As a result, the company instituted a new position, a D-level Customer Account Manager, to manage relationships with the really big clients.

To support them, the company commissioned one of the big software solutions companies to build a Data Warehouse application, centralizing reporting across the verticals for both Ecolab and their customers.

Unfortunately, after spending $2 million, the system had serious issues – not just usability issues, but a creeping sense that something was very, very wrong with the application.

I was called into the office of the vice president in charge of the program.

“This thing is just awful,” she said, telling me to figure out what we could do – but fast, and on a budget.

Task

Diagnose and fix massive, debilitating problems – while not ruffling executives' political feathers.

Actions

At the beginning, it just seemed to be a bunch of simple micro-usability problems.

The original vendor – one of the big software solutions companies – had built a three-step “wizard” for generating a report on Ecolab's sales activities by vertical, by branch in the customer organization, and other factors.

A simple heuristic review revealed many of them:

  • The term “report” meant three different things: the criteria that were the report's parameters, the file that saved those criteria, and the final output of the process – all three involving different user interactions.

  • Conversely, the object produced in the part of the “wizard” where the user queried the customer data went by three different names, sometimes used interchangeably in the same sentence of instructional text.

  • There were myriad other terminology issues.

 

A deeper dive showed a much bigger problem, though.

When the user edited a “report” (the file that saved the criteria) and changed the name, the system simply renamed the report object.

But when the user renamed the Customer Query (the object that originally had three names, above), the system created a separate copy of that query. Not only was it unclear which copy the user was working on; it was entirely possible for the “Report” to be pulling its data from the wrong query.

As a result, the user could be generating numbers that had nothing to do with what they expected. The numbers were so complex, there was no way for them to know.

The system was practically designed to put out bad data.

In addition, the “customer query” portion relied on a 2,000-node treeview control that took as much as five minutes to load, and couldn't be avoided. It was so slow, people thought it had crashed. Which, sometimes, it did.

The Response

To make things even more complex, most of the executives who’d signed off on the original $2 million development cost were still at the company, and still had egos invested.

I had to be careful.

So I ran a round of usability testing with five users, confirming that the problems I had identified were real – nobody could successfully create a report, and nobody realized they were not viewing the data they thought they had put into it.

One key finding: only about 2% of users ever used the Customer Query step at all.

So I mocked up three different approaches to fixing the problem, at low fidelity (to make fast iteration easy).

  • A “Low-Impact” solution: getting the terminology rationalized, explaining the process and warning the users of the potential pitfalls.

  • A “Middle-Impact” solution, with all the “Low-Impact” measures plus moving some steps around.

  • A “High-Impact” solution, which I’ll describe below.

 

The “High-Impact” solution used the results of the usability testing.

[Wireframe: CTC redesign, “High-Effort” option, page 3 of 8]

It’s just a wireframe – although Ecolab’s actual stylesheet wasn’t much more elaborate back then.

But the action is around callout 15 – where I hid that huge treeview control that the users needed on roughly one query out of fifty.  That, and labeling everything correctly and consistently, did the trick.

Results

Management decided to invest in my “High-Impact” solution; the investment was worth it for a project this high-profile.

  • I moved the “Customer Query” – with its 2,000-node treeview, used in only a tiny fraction of reports – off the “happy path” and into a branch in the process. That eliminated the second and third steps of the “wizard” and boiled the whole thing down to one page.

  • With the aid of a very talented developer, we fixed the rename/copy model, so both the “Report Definition” and the “Customer Query” (if used at all) behaved the same way, ensuring no surprises in the data.

Then, to make sure the results were clear enough to overcome any political obstacles, I ran a round of usability testing with 27 users, ensuring that every possible usage permutation and audience was covered exhaustively.

 

It was a huge success.

