After endless hours of analysis and planning, we thought we had created an extremely fine piece of technology for the lawyers of the firm. And then we launched the application, only to discover that the lawyers didn’t agree with our assessment of the tool. It was, to put it mildly, disappointing.
This experience is not unique to our team, and it certainly is not unique to technologists. People in every profession can tell stories of well-intentioned projects that fell short of the mark. Time and time again we think we’ve considered every angle and planned for every contingency, but then we are tripped up by the most basic challenge: our failure to understand fully the ultimate user or beneficiary of our work.
The frustrating part was that we followed the book (at least as it was written at the time) on how to handle a technology project properly. We did a thorough needs assessment, wrote a requirements document and a detailed project plan, carefully considered the comments we received during our pilot, and yet…disappointment.
What went wrong? In retrospect, I think we made one basic mistake. While we created personas to help us anticipate user behavior, those personas were a reflection of the technologists’ perceptions of the end-user population. By relying on our own perceptions (and, if we were to be completely honest, prejudices) about how lawyers interact with technology, we did not truly design with user behavior and preferences in mind. To do this properly, we should have consulted with psychologists as well as user interface experts.
What can psychology and behavioral economics teach us about good design? They can help us better understand why people do the things they do and what we can do to encourage them to do the right thing. Richard Thaler and Cass Sunstein call this “choice architecture.” In their book, Nudge, they state that a “choice architect has the responsibility for organizing the context in which people make decisions.” This means that if you do not design with good choice architecture in mind, you’re likely to find that your end-users do not interact with your system or process as you intended. If you want to see this approach in action, consider an example provided by Stephen Dubner, co-author of Freakonomics:
- HM Revenue and Customs in the United Kingdom adopted the recommendations of the British Government’s Behavioural Insights Team (aka the “Nudge Unit,” after the book of the same name) to encourage people to pay taxes. Instead of threatening delinquent taxpayers with dire consequences, they simply informed their target audience that 9 out of 10 people in their region pay the taxes they owe. This positive message encouraged compliance by bringing social pressure to bear on anyone considering avoiding taxes. The thinking behind the approach is to dispel the tax delinquent’s belief that most people don’t pay their taxes and that it is therefore fine not to pay.
- You can see the same dynamic at work when you contact colleagues who have not done something they were supposed to do. If you send an email to a group of people who are delinquent and display their names in the “To” field rather than hiding the names in the “bcc” field, you show them that there are several people similarly situated. This disclosure has the effect of giving them permission to continue to be delinquent since they now perceive they are in good company.
So what might we do differently with respect to law firm technology and business processes if we were to integrate behavioral science into our design?
- Set the defaults in your system with care. Since behavioral science shows us that people have a hard time overcoming inertia in order to choose another setting, ensure that the default setting is in the best interests of the user and likely to give the user her preferred outcome. While this may seem paternalistic, Thaler and Sunstein point out that the designer has to choose a default, so that designer might as well choose one that is likely to result in an outcome that is good for the end-user.
- Design based on evidence rather than mere perceptions. Thaler believes that the basis for the best decisions is evidence garnered from randomized controlled trials. This is the approach used by the Nudge Unit and it is described in their white paper, “Test, Learn, Adapt.” In the delinquent taxpayer example discussed above, the Nudge Unit sent out letters with a variety of messages to 140,000 taxpayers in a randomized trial. By tracking the response to each of these messages, they were able to document a 15% increase in early payment by the people who received the new positive message, as compared to the people who received the old reprimanding message. This resulting hard evidence is a key benefit of the “test, learn, adapt” approach. What does this mean for you? As you are designing your systems and processes, generate several working hypotheses regarding what end-users in your firm are likely to do and why they are likely to behave that way. Then find simple ways to test those hypotheses. Once you have the results of those tests, you can make an informed decision about how to design for the optimal outcome for your specific population of end-users.
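To make the “test, learn, adapt” step concrete, here is a minimal sketch of how you might check whether the difference in response rates between two message variants in a randomized trial is statistically meaningful. It uses a standard two-proportion z-test; the function name and the trial figures are illustrative assumptions, not data from the Nudge Unit’s actual study.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: is variant B's response rate
    significantly different from variant A's?"""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pool the two samples to estimate the shared response rate
    # under the null hypothesis that the variants perform equally.
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical trial: 70,000 letters per variant.
# Variant A (old reprimanding message): 24,500 early payments (35%).
# Variant B (new social-norm message):  28,000 early payments (40%).
z = two_proportion_z(24500, 70000, 28000, 70000)
print(round(z, 1))  # far beyond 1.96, so significant at the 5% level
```

With samples this large, even modest differences in response rate produce decisive evidence, which is exactly why the Nudge Unit could confidently attribute the improvement to the new message rather than to chance.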
- Make it easy. According to Thaler, another key mantra of the Nudge Unit is: “If you want to encourage some activity, make it easy.” This means working through all the steps in the process to ensure that the best choice is the easiest choice. Dr. David Halpern, a social psychologist who leads the Nudge Unit, sets the bar high for all of us: “We’re obsessed about the tiny details. We’re obsessed about the little inconveniences and hassles that get in the way of people being able to do things … not only because they can often be annoying for citizens, but if you can get rid of those frictions, those details, those problems, everything works better.” [emphasis added]
- Adapt your system to the end-user. When designing systems and processes, there is a huge temptation to demand that your end-users adapt to your system. This rarely works. Under the best of circumstances you’ll have cranky, minimally compliant clients. What’s the better way? Find out what is driving user behavior and then adapt your system to accommodate the reality of that behavior. Once again, the Nudge Unit has a good example of this principle in action. When generous government subsidies alone failed to persuade people to insulate their attics, the British government offered assistance in clearing out those attics, which in turn led to higher rates of insulation. Why? Even though people had to pay for the attic-cleaning service, its sheer convenience removed the biggest barrier to insulating their attics: they simply had too much stuff in them. The belief that financial incentives alone could motivate the desired behavior was trumped by the reality that attics had to be emptied before they could be insulated. Once people had help emptying their attics, that barrier went away, and because the government was willing to adapt its approach to actual user behavior, it was better able to realize its policy goal of energy conservation through increased insulation.
With the benefit of 20/20 hindsight, I realize that our disappointing project might have turned out better if we had spent less time in navel-gazing planning meetings and more time with our lawyers. And we should not have assumed that people would automatically fall in line with our perception of rational behavior. (Reading the work of psychologists and behavioral economists would have made the falsity of that assumption abundantly clear.) At the end of the day, the users’ preferences always trump those of the designer. Thankfully, those preferences are not entirely unknown. Behavioral science has learned a great deal about how people tend to behave. It would be time well-spent to learn what behavioral science knows. With this information in hand, you can avoid the worst pitfalls of poor design. While this is no guarantee of project success, it does increase the likelihood that your system or process will deliver the intended results.