(NewsNation) — As shootings at schools, stores and other public places have become increasingly common in the U.S., one company is using artificial intelligence to try to prevent violence.
In an effort to reduce shootings, one security company is using AI algorithms and security cameras to detect weapons and alert law enforcement before a shot is ever fired. The founders designed the technology with the aim of protecting kids from school shootings, and its verification team is staffed entirely with military veterans and former law enforcement officers who are experts in identifying weapons.
Former Navy SEAL Sam Alaimo is a co-founder of ZeroEyes. Alaimo founded the company with fellow former SEALs who, after leaving the service, missed the sense of purpose the military had given them.
The idea for the company came after the Parkland school shooting in 2018. One of the other co-founders was picking his daughter up from school and noticed security cameras around the building. He asked what the cameras were doing to prevent school shootings and was told they were not used for prevention; they were there to be reviewed after something happened.
That conversation sparked the idea for ZeroEyes, which Alaimo admitted took longer than expected to build. While the company began with the core goal of serving K-12 schools, the COVID-19 pandemic began shortly after it launched, shutting down schools and forcing the company to pursue commercial and government clients as well.
Those clients include casinos. Travis Thompson is director of compliance and surveillance for Muscogee (Creek) Nation Gaming Enterprises. While Thompson declined to name the specific casino where the system is in use, he told NewsNation it was critical to have a system that worked well without feeling intrusive for guests who expect a resort-type experience.
But Alaimo said the primary motivation is still clients like schools and hospitals, where people are especially vulnerable.
“Because a school is a place of learning. It’s a place of nourishment, and hospitals a place of healing. Like you go there to get some sort of positive benefit,” Alaimo said. “The idea of a shooting happening in one of those locations is particularly atrocious.”
ZeroEyes works by capturing footage on security cameras and using an artificial intelligence algorithm to identify weapons, which triggers an alert. That alert goes to the ZeroEyes operating center, where in-house employees verify the AI has correctly identified a weapon.
“We staff it 100% with former military veterans and former law enforcement personnel, both of whom are very comfortable identifying guns and very comfortable in high-stress situations,” Alaimo said.
If they confirm the weapon is real, another alert is sent to the client or to an agency of the client’s choosing, such as a local police department or 911 dispatch center.
For Thompson, the operations center was a major selling point for the system.
“When they got those alerts, we had somebody live on the other end of the phone so we didn’t have the false alarms,” he said.
In addition to the alert, the company sends an image of the potential shooter along with the exact location and time the gun was detected. Alaimo explained that the information is designed to help prevent a situation like the one in Uvalde, Texas, where officers waited 77 minutes before taking down a shooter who killed 19 children and two adults at an elementary school.
By providing exact information, the company hopes to reduce the amount of time first responders would need to spend determining what is happening and the best way to approach a situation.
“What we’re doing here is giving them the situational awareness they need to show up to the exact location to stop that threat from doing any more harm than he or she may have already done,” he said.
One reason the company uses in-house employees to verify AI detections instead of outsourcing the job is an emphasis on privacy and security. That is also why the company doesn’t store any biometric data or show live camera streams to the operations center. Employees there only see footage after the AI has detected a possible weapon.
“We care deeply about privacy. We don’t want to be able to see children’s faces; we don’t want to be able to see hospital patients. So we deliberately built it to avoid all of that. We just detect the object, that gun, never people. We can’t recognize faces,” Alaimo said.
Because the company’s product is designed to prevent something from happening, Alaimo admitted it’s difficult to quantify success: it’s impossible to know how many people would have been harmed if the system weren’t in place.
However, he said many clients have been shocked by the number of guns detected, including those carried by people who never intend to use them and never fire them.
“People would be shocked at how many guns are out there in the world,” Alaimo said.
Although the company has attracted many types of clients, protecting schools remains its main motivation as school shootings continue.
“We built this thing to be able to do something about it right now. Not keep offering thoughts and prayers,” Alaimo said.