Gains from technologies that leverage legal labor are real, and enticing. The positive impact of a properly selected and implemented tool can be profound. But so too can be the fallout from a poor purchasing decision: opportunity costs, wasted implementation time, living with tools that are not fit for purpose, and the cynicism lying in wait for the next poor soul who suggests a tech-centric initiative.
Lawyers, law departments and law firms have a hard time making coherent technology buying decisions because we lack time, attention, familiarity and acumen. We should introduce more rigor into our technology purchase process to make better, more objective decisions tailored to our specific needs.
Mistakes We Make
Using technology is still outside most legal professionals’ comfort zones. Choosing technology is an order of magnitude more disconcerting. Lacking the appropriate foundation, we are frequently given to chasing slick solutions in search of real-world problems.

Even when a problem or use case is identified, we often behave like last-minute shoppers. We self-limit options because we are unfamiliar with the universe of vendors and, instead, go with a short list pieced together from Google, scattered articles and chance interactions. We then make hasty, if not always quick, decisions because we get distracted by the multitude of other legitimate priorities that compete for our time and attention.

We take what seems like the path of least resistance. We gravitate toward shiny objects, instant gratification and the promise of the black box that drives superior outputs from the same inputs. Well-done but serial, siloed and canned demos easily sway our sense of “must-haves.” Meanwhile, distinguishing between vendors is a challenge – causing us to default to our impressionistic impulses about which salesperson we liked most on a personal level or who seemed to offer the most politically salable price point. To the extent we find ourselves confused, we look to other indicia of quality like investments made by our respected peers, presuming they face similar problems but actually did all the homework.
A Better Way
An approach long used in other industries can be applied successfully to legal tech purchasing decisions: define and weight detailed legal and business requirements, then base the final decision on a quantitative analysis of vendor responses against those requirements. This approach grounds choices in specific needs, leads to an objective decision point and produces a full audit trail to support internal approvals.
Here is a step-by-step example of how it works:
- Identify the problem you are trying to solve. For example, this could be to enable self-service for your business.
- Prepare a list of system features required to solve the problem. For our example, some options would be a smart form to collect information from a business requestor, artificial intelligence to recognize clauses in third-party paper, approval by email and an integrated e-signature tool. To help get started, an exhaustive list of functional requirements for a contract life cycle management system can be found HERE.
- Send the features list to each vendor to complete.
- Score vendor responses and rank solutions. Here is an example of a simple formula: for each feature, multiply its Importance weight by its Availability weight.
Importance = How important the feature is to solve your problem

| Importance | Example of Weighting |
| --- | --- |
| Nice to Have | 1 |

Availability = How well the vendor’s system covers the feature

| Availability | Example of Weighting |
| --- | --- |
| Available Out of the Box (i.e., without customization, configuration or third-party integration) | 4 |
| Available Through Out-of-the-Box Third-Party Integration at No Additional Cost | 3 |
| Available but Customer Configuration Required | 3 |
| Available but Vendor Configuration Required (i.e., customization) | 2 |
| On Road Map | 1 |
| Not on Road Map | 0 |
Scores across functional requirements are added together to produce an overall score for each vendor, and vendors can then be ranked to help determine which tools are worth a deeper look, with the goal of getting as close as feasible to testing a live system. One light-touch method is to give the vendors a specific script to follow in their demo. This forces them to abandon their canned routines and focus on your core functional requirements. It can also be telling of what is to come during an implementation, testing how well a vendor can translate your instructions into a live build and how easily a system can be configured.
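A weighted decision matrix of this kind is usually kept in a spreadsheet, but the arithmetic is simple enough to sketch in a short script. This is an illustrative sketch only: the Importance scale beyond "Nice to Have" is an assumption (the table above shows only that one row), and the feature and vendor names are hypothetical.

```python
# Illustrative sketch: score and rank vendors against weighted requirements.
# The "Must Have" and "Important" weights are assumed; the article's
# Importance table shows only "Nice to Have" = 1.

IMPORTANCE = {"Must Have": 3, "Important": 2, "Nice to Have": 1}

AVAILABILITY = {
    "Out of the Box": 4,
    "Third-Party Integration at No Cost": 3,
    "Customer Configuration Required": 3,
    "Vendor Configuration Required": 2,
    "On Road Map": 1,
    "Not on Road Map": 0,
}

def score_vendor(requirements, responses):
    """Per feature, multiply Importance by Availability; sum across features."""
    return sum(
        IMPORTANCE[importance] * AVAILABILITY[responses[feature]]
        for feature, importance in requirements.items()
    )

def rank_vendors(requirements, all_responses):
    """Return (vendor, score) pairs sorted from highest to lowest total score."""
    return sorted(
        ((vendor, score_vendor(requirements, resp))
         for vendor, resp in all_responses.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Hypothetical example: two features, two vendors.
requirements = {"smart intake form": "Must Have", "e-signature": "Nice to Have"}
responses = {
    "Vendor A": {"smart intake form": "Out of the Box",
                 "e-signature": "On Road Map"},
    "Vendor B": {"smart intake form": "Vendor Configuration Required",
                 "e-signature": "Out of the Box"},
}
ranked = rank_vendors(requirements, responses)
# Vendor A: 3*4 + 1*1 = 13; Vendor B: 3*2 + 1*4 = 10
```

Note that the ranking is only as good as the weights: a "Must Have" feature that is merely on a vendor's road map can still be outscored by a pile of nice-to-haves, so it is worth reviewing any must-have gaps by hand before trusting the totals.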
If bandwidth permits, you can also try to gain access to a “sandbox” or build a pilot period into your license with an easy opt-out. Many of the major technology investments are amenable to this approach, but be warned: these become demanding projects unto themselves. They need to be properly designed to produce useful insights and properly positioned to generate buy-in from the right stakeholders. For those up for the challenge, real-world testing surfaces implementation obstacles that are hard to discern in the abstract or in a demo.
Worth the Cost
Applying rigor and discipline to technology tool selection demands scarce resources: time and attention. In short, it will cost you – upfront. But, ultimately, the initial investment will cost far less than the long-term consequences of making the wrong selection. If you do not have the bandwidth or expertise, find a partner who does – someone whose only objective is to help you make the right choice. Experts can work with you – emphasis on “with” – to create and calibrate your decision matrix. They can then help run proofs of concept for you.