2026 SKILLS Legal AI Survey: Where Legal AI is Working

What the SKILLS Use Cases Survey tells us and where we are heading next.

by Megan McMillin Mar 6, 2026

Sometimes the best conversations at legal innovation conferences happen in the hallways. That's where people get honest about what they've rolled out, what's working, and what isn't. They talk about the question on everyone's mind: What tools are firms actually using, in production, on live matters? The SKILLS Legal AI Use Cases Survey is our industry's version of that hallway conversation at scale.

The survey collected responses from leaders at 130 of the world's largest law firms, from the people responsible for AI strategy, enablement, and deployment. The full survey dashboard releases on March 12, and I'd encourage every firm leader to spend real time with it. The category-level data is detailed and worth reading closely, and it will confirm some things you suspected, challenge a few assumptions, and reveal quite a bit about where this market is headed.

We've gotten very good at measuring adoption. The next step is measuring the effects of AI in law firms.

The survey results tell a clear story about how far firms have come. AI adoption at large law firms is broad, real, and concentrated across the workflows that carry the most risk and the most value, such as drafting, contract negotiation, due diligence, discovery automation, playbook generation, and timelines. This is not back-office work. This is substantive, client-facing work.

As recently as 18 months ago, most firms were treating AI as an internal productivity experiment. The SKILLS data shows that phase is over, and the pattern is clear in the survey results.

Harvey's results reflect this shift. We appear across 11 use-case categories and lead in seven of the most substantive: legal drafting, contract review and analytics, due diligence, contract negotiation, playbook generation, discovery automation, and timelines and chronologies. Those results represent trust earned through a deep understanding of how our customers work, and I'm proud of that.

Harvey's results are also part of a broader pattern worth noting. Across the survey, firms are making decisive choices about which platforms they're building on, but the number of tools actually reaching production is smaller than most people in the industry would expect. The long tail of newer entrants is generating evaluation interest but not yet converting to live use at scale. At the same time, established incumbents continue to hold their categories firmly in areas like eDiscovery, timekeeping, and proofreading. The market is sorting itself into layers, and the March 12 data makes that visible in a way I hadn't seen before.

Firms are using AI in real, client-facing work. The SKILLS data clearly settles that question.

But the next question, and the one I expect our industry to focus on in 2026, is what that adoption actually means for the practice of law. How is AI showing up in work product quality? In matter economics? In how firms staff, price, and deliver legal services? In client experience? These are the next round of questions.

I spent fifteen years practicing law and then several more evaluating AI platforms from the buy side at global firms. I know how much weight innovation leaders put on independent deployment data. I also know that every leader reading the March 12 dashboard will be asking the same follow-up question: What happens now?

Adoption data is a measure of commitment. It is not yet a measure of impact. And our industry needs to pursue the impact question with the same rigor SKILLS has brought to measuring deployment.

What I think 2026 is actually about

The past two years have been about getting AI live: clearing security and governance reviews and moving tools into production on real matters. The SKILLS data documents that chapter, and the March 12 release will be the definitive record of how it played out.

The next chapter is about what happens after go-live. It is about measuring the effects of AI on legal work: on quality, on economics, on how teams collaborate. It is about moving from "our lawyers have access to an AI tool" to "AI is making our firm's work product, institutional knowledge, and client service measurably better."
That is the work I'm focused on at Harvey. Not just the survey results, though they speak for themselves, but what comes next for the firms that are already deployed. The practice-group-by-practice-group, workflow-by-workflow work of making sure AI delivers on its promise.

Explore the full SKILLS Legal AI Use Cases dashboard when it releases on March 12. It will give you a clear picture of where your peer firms stand, and it might even be more informative than the hallway conversations, where I think the question is shifting from "what are we using?" to "now what?"