One Incident Shows How AI Still Isn’t the Perfect Solution for School Security
- By Brent Dirks
- December 10, 2025
Yes, artificial intelligence technology is here. And it’s here to stay. But that doesn’t mean AI is perfect, as a recent incident in Baltimore, Maryland, shows.
In late October, armed police handcuffed and searched a student after his bag of Doritos was incorrectly flagged as a weapon by an AI detection system.
“The first thing I was wondering was, was I about to die? Because they had a gun pointed at me,” Kenwood student Taki Allen told local TV station WBAL.
Allen said that about “eight cop cars” pulled up to the school.
“I was just holding a Doritos bag — it was two hands and one finger out, and they said it looked like a gun,” he said.
After the software flagged him, Allen was ordered to his knees, told to place his hands behind his back, and handcuffed.
Along with the incorrect call by the AI-powered system, there were also failures in communication.
Kenwood’s principal, Kate Smith, later said that the school district’s security department reviewed the alert and canceled it after determining that there was no weapon. But she was unaware of that cancellation and asked the school’s resource officer to intervene. The officer then called local police for support.
The AI detection system, made by Omnilert, has been in use in Baltimore County schools since 2023.
While AI has been a game changer for the physical security industry, incidents like this show that it isn’t 100 percent accurate. And no matter what technology is used to protect schools, humans remain a big part of the equation.
About the Author
Brent Dirks is senior editor for Security Today and Campus Security Today magazines.