Big Law is readying for what DLA Piper’s new chief data scientist Bennett Borden described as the greatest regulatory issue of the next decade: accountable artificial intelligence.

In addition to DLA Piper’s hiring of Borden to lead an AI and data analytics practice, Allen & Overy recently revealed a ChatGPT-powered AI platform dubbed “Harvey” to help lawyers with client analysis. Other firms are scouring the talent market for tech-fluent lawyers, with Debevoise & Plimpton doubling the size of its AI practice in the past 18 months.

The AI craze has not only reinforced Big Law’s interest in top-notch regulatory and privacy expertise, but it has also underscored the wide-reaching implications of the technology across industries and geographic boundaries, and spurred a series of unique, client-tailored offerings.

Winston & Strawn, Mayer Brown and Gibson, Dunn & Crutcher are among a handful of firms pursuing growth in the area to meet a fever pitch of demand, in part spurred by the launch of ChatGPT.

“The ChatGPT paradigm is now inscribed into public consciousness,” said Vivek Mohan, a former senior attorney at Apple who joined Gibson Dunn in June to lead its AI and automated systems practice.

“The legal issues were a bit of a trailing indicator on the news,” Mohan continued. “Over the holidays, a lot of people were playing with ChatGPT, and only in the new year did some clients start reaching out about it. We’ve now crossed the Rubicon in terms of this being something companies in all sectors are trying to figure out what to do with.”

Borden, a practicing attorney and data scientist, said demand is on the upswing as clients seek to understand the legal and ethical implications related to AI in a business context.
“Part of this is helping companies understand the disruption that will hit their industry,” Borden said. “Everybody is talking about generative AI because it’s the first time we’re really seeing the technology revolution hit the white-collar classes.

“It is a direct threat to knowledge workers who create and generate content,” Borden added. “That is massively disruptive. How do companies understand that so they can ride the tech wave instead of being crushed by it?”

Responsible AI

After forming an official AI practice group about four years ago, DLA Piper uncovered a need to help clients test and monitor automated decisions, so that discrimination in raw data isn’t carried forward.

“We looked far and wide for a computational legal practice, a group of lawyers who are also data scientists and programmers,” AI practice chair Danny Tobey said, referring to Borden and his team of about 10 data scientists. “By marrying the two practices together, now we have an unprecedented offering that is on the forefront of regulatory changes, and also able to follow up and test clients’ AI.”

By way of example, Tobey pointed to work DLA Piper did for a digital consulting firm building an algorithm to help pharmaceutical companies predict which patients would do well on certain medications. But because the system was trained on economic data, it was tracking who had the best follow-up care, which tended to be wealthy white people.

“They were getting the cause and effect confused,” Tobey said. “We got the product remediated before it ever went out.”

Mayer Brown also launched a unique offering when it hired Dominique Shelton Leipzig in April to build its global data innovation practice, aimed at providing board-level advice to companies about responsible data stewardship.

“With generative AI, that couldn’t be a more timely offering,” Shelton Leipzig said. “It’s a keen area of attention for companies around the world. It’s right up there with ESG. Whether that is in a social vertical or a governance prong, it is front and center of what CEOs and boards are thinking about.”

Shelton Leipzig said the issues are global in nature, but pointed to one example in the Biden administration’s proposed AI Bill of Rights, which she said boils down to companies being intentional about the use of AI, having governance, monitoring algorithms before they are put into a product and then tracking regulatory guidelines.

“I’m encouraging clients not to tether themselves, like with Web 1.0 or 2.0, to what is legally permissible, but rather looking ahead aspirationally in order to remain data leaders,” Shelton Leipzig said. “It’s about building trust from the start. It’s the opposite of moving fast and breaking things.”

Technical Expertise

Winston &amp; Strawn is also investing in AI, working to assemble what partner Bobby Malhotra, who joined the firm in November, described as a “SEAL Team Six”: a multidisciplinary team with broad subject-matter expertise focused on dealing with the most challenging areas of data.

Malhotra said just as technology is transcending borders and practices, he no longer considers himself just an e-discovery lawyer. “I’m a data lawyer,” he said. “I’m here to deal with the thorniest, most complex data issues. That could relate to e-discovery, or information governance, or cross-border data privacy issues.”

Among clients, the issues are plentiful, he said. Often, clients are asking about the risks of deploying AI systems, the evolving landscape of data privacy and how various regulators interpret legislation, and cross-border discovery and disclosure requests, which also present privacy concerns.

The firm is responding with “legal teams comprised of experts,” Malhotra said. “Above and beyond, clients want lawyers who can help translate the tech issues and associated data challenges into easy-to-understand and actionable guidance.”

It’s an area of focus for Gibson Dunn, too. The AI practice is oriented as a regulatory offering, Mohan said, which often overlaps with transactional and litigation issues that come to the firm.

Regulatory expertise, combined with deep technical backgrounds, is what differentiates the firm, he said.

In addition to Mohan’s arrival at Gibson Dunn, the firm in recent months has assembled a heavy roster of tech veterans, including former Apple chief privacy officer Jane Horvath and former Facebook deputy general counsel Ashlie Beringer.

“When a client is calling a law firm to advise on legal issues, first and foremost, they are looking for lawyers who understand the business and can help manage and advise on legal risk,” Mohan said. “That means being a lawyer that understands the framework and can crisply distill information and have the fluency to help clients understand it.

“When you have in-house experience, you know what is hard or easy to do, and that goes a long way to give advice for something you can implement,” he added. “You can’t not give practical advice.”

Keeping abreast of regulatory developments and the rapid development of technology is all part of the puzzle. But so too is trust, according to Shelton Leipzig.

“So much of this has to do with trust,” she said. “Just look at the markets. The companies that aren’t doing well have distrusted information, or information that is biased or filled with misinformation. We’re seeing the stock tank for companies that do not get trust right.”

Shelton Leipzig said she is working with clients to help them get trust right from the get-go. The global data innovation practice brings expertise to CEOs and board members that otherwise might not have exposure or experience with these challenges, she added.

“If we get the trust principles in place, a lot of innovation can occur,” Shelton Leipzig said.

Mohan more or less agreed.

“This is of interest to everyone and my mother because it has so much potential to impact people, both good and potentially bad,” he said. “People want to see responsibility and guardrails put in place, to unlock the opportunity of the tech while not having us end up like ‘Terminator 2.’”

Reprinted with permission from the March 15 edition of The American Lawyer © 2023 ALM Properties, Inc. All rights reserved. Further duplication without permission is prohibited.