
An AI system’s little oopsie, and a police department’s staggering incompetence, landed an innocent grandma in jail.
Harrowing reporting by North Dakota radio station WDAY details how 50-year-old Angela Lipps spent nearly six months in the clink after Fargo cops, using an AI facial recognition tool, flagged her as a suspect in a bank fraud case in the state.
The mother of three — and grandmother of five — says she’s lived her entire life in north-central Tennessee, roughly a thousand miles away from where the crimes she was accused of committing took place. US marshals showed up at her doorstep last July while she was babysitting four kids and arrested her at gunpoint.
First, Lipps was booked into a Tennessee county jail as a fugitive from justice from North Dakota. And because she was considered a fugitive, she was held without bail, sitting in jail for nearly four months. Lipps received a court-appointed lawyer for the extradition process, WDAY reported, and was told she'd have to travel to North Dakota to fight the charges.
“I’ve never been to North Dakota, I don’t know anyone from North Dakota,” Lipps told the station.
According to Fargo police department files obtained by WDAY, the error arose from surveillance footage detectives viewed while investigating bank fraud cases in April and May 2025. The footage shows a woman using a fake US Army military ID to withdraw tens of thousands of dollars.
To generate leads, the detectives turned to AI facial recognition software, which identified Lipps as the person in the video.
The cops seemingly did little to verify the AI’s lead. Court documents showed that a detective agreed that the suspect’s facial features, body type, and hair were a match to Lipps. But Lipps said that no one from the Fargo police department ever called to question her.
Adding insult to injury, the Fargo police didn’t pick up Lipps from her Tennessee jail until 108 days after her arrest, after which she was flown to North Dakota to make a court appearance. The first time they interviewed her was in December, when she was being held in the North Dakota lock-up, after she had spent more than five months behind bars.
“If the only thing you have is facial recognition, I might want to dig a little deeper,” Jay Greenwood, a lawyer representing Lipps in North Dakota, told WDAY.
Greenwood produced bank records showing that Lipps was more than 1,200 miles away in Tennessee at the time that investigators say the bank fraud was perpetrated. With Greenwood having essentially done their jobs for them, the police released her from jail on Christmas Eve, and dropped the case.
But Lipps says that the police didn’t even offer to pay for her trip home, and with no money to her name, she was stranded in Fargo. Sympathetic local defense attorneys pooled together money to pay for a hotel room, and a local nonprofit called the F5 Project arranged her trip back to Tennessee.
“I had my summer clothes on, no coat, it was so cold outside, snow on the ground, scared, I wanted out but I didn’t know what I was going to do, how I was going to get home,” Lipps said.
Lipps says she lost her home, her car, and her dog as a result of her stint in jail. No one from the Fargo police department has apologized for the disastrous mix-up, she said.
This isn't the only criminal case of mistaken identity caused by AI tools. In April last year, the New York Police Department arrested a man named Trevis Williams based on a facial recognition match from grainy CCTV footage — despite Williams being over half a foot taller than the suspect in the video. That February, a woman in Detroit sued the city's police department, alleging that it arrested her after a facial recognition tool identified her as a murder suspect, despite similarly blatant discrepancies in her physical appearance.
More on AI: ICE Is Scanning Civilians’ Faces, Telling Them They’re Being Entered Into a Terrorism Database

