
Better call DAVID? AI chatbot gives legal advice in Alberta personal injury cases
By Canadian Press on February 17, 2026.

EDMONTON — DAVID, a chatbot powered by artificial intelligence, answers a call before the first phone ring even finishes.

“Hi,” it says.

“Did you reach out looking to speak with someone about an injury claim?” it asks, before saying the call is being recorded.

It then expresses empathy.

“I’m sorry to hear you were in (an) accident. Could you tell me a little more about what happened?”

After asking for the caller’s name, address and more details, DAVID presents what could be the outcome of a civil claim and requests a fee.

DAVID, an acronym for Digital Attorney for Victim Injury Disputes, is operated by Painworth, an Alberta-based personal injury law firm.

Co-founder Michael Zouhri, who has a background in data science, says the AI lawyer received its first client in December.

“If you’re in any sort of personal injury incident, whether it be a car accident, a slip-and-fall, you can call DAVID,” Zouhri said in a phone interview.

DAVID is available 24-7 and can chat in almost every language.

Each client is overseen by one of Painworth’s three human lawyers, Zouhri said.

Zouhri said he got the idea to create DAVID after he was hit by a drunk driver in 2019 and struggled to find a lawyer. He was left with nerve damage and a concussion and still receives treatment.

“I called a couple of those billboard attorneys, and I met several of them. And honestly, I just really hated the process,” he said.

They didn’t respond on weekends, each gave different advice and nearly all asked for half of a possible settlement.

Angela Lee, a professor in the University of Alberta’s faculty of law, says DAVID is among several AI-powered tools given the green light in recent years in Alberta to provide legal assistance.

“This is an emerging landscape,” she said.

Lee said the Law Society of Alberta has relaxed multiple rules and regulations so Painworth and DAVID can assist with personal injury claims.

For example, the rule that only people licensed to practise law can own and operate law firms in Alberta was waived, as Painworth is not owned by a lawyer, Lee said.

“And then the rule around preventing unauthorized practice of law is an exemption … to allow (Painworth’s unlicensed) AI bot to assist people with their legal claims,” Lee said.

The law society says exemptions were given to Painworth through its Innovation Sandbox initiative, and DAVID can only provide services in Alberta.

“The Sandbox environment supports potential providers in testing new ideas and models for the delivery of legal services in a controlled environment, with the law society providing guidance and oversight,” Elizabeth Osler, the organization’s chief executive officer and executive director, said in an email.

The law society’s website shows it has also greenlighted other companies to use AI, including Philer.

Philer’s website says it uses AI to conduct real estate transactions.

Exemptions have also been given to a company called Jointly, which uses AI to develop marriage contracts.

Gideon Christian, an AI and law professor at the University of Calgary, says law-society-led sandbox initiatives have popped up across Canada in recent years, including in British Columbia, Manitoba and Ontario.

He said DAVID is more of a legal assistant than a lawyer, but “a huge game changer.”

“If a lawyer is billing you $400 an hour to maybe spend eight hours performing a task that an AI tool can perform in less than 30 minutes, then it makes huge economic sense.”

Lee said using DAVID to assist with personal injury cases is also a more reasonable way for lawyers to start incorporating AI in their work.

“For the average person who’s looking to sue someone for a slip-and-fall or who got into some kind of car accident, the claims won’t typically go to trial. And they’ll typically be more straightforward in that there’s a large body of precedent of similar cases that courts will then use to decide what’s a fair outcome.”

But Lee said there are privacy concerns.

And while DAVID might express empathy, it doesn’t embody it.

“There have been a number of situations where AI models like ChatGPT have caused psychosis, because of the sycophantic way in which they engage with users,” she said.

“And so, while there is this way in which talking to AI bots about your legal claims can give you the sense of feeling heard and understood, sometimes that might not be fully rooted in reality in the way that you might get when you talk to an actual human being.”

AI can also get details wrong, Lee said, but she appreciates that humans are overseeing AI tools in Alberta’s sandbox initiatives.

“Not only can AI be wrong, but it’s often quite confidently wrong. It produces outputs that look very polished and sound convincing but aren’t necessarily legally sound.”

She said the tools’ effectiveness can only be determined in the years to come.

“It’s important to have a balanced approach that does allow for incremental innovation while also ensuring robust safeguards.”

This report by The Canadian Press was first published Feb. 17, 2026.

Fakiha Baig, The Canadian Press
