From the company that brought you Taser stun guns comes an AI weapon so dangerous it was rejected by the company’s own Artificial Intelligence ethics advisory board. But that didn’t stop the CEO from announcing the weapon as a response to the May 24 Uvalde, Texas elementary school shooting, like some misguided knight on a white horse hoping to tase our nation to safety.
According to the BBC, Axon (formerly known, terrifyingly, as Taser International) has announced plans to produce a lightweight taser that can be deployed on a drone or robot and operated remotely via “targeting algorithms.” The operator, a human (for now), will have “agreed to take on legal and moral responsibility for any action that takes place.”
This is how they hope to help stop school shootings.
Axon founder and CEO Rick Smith, himself a father of young twins, told the BBC that the recent elementary school shooting tragedy in Uvalde, Texas, compelled him to share the project with the public because “politicians in the US have not been effective in dealing with this.”
His logic follows a precedent set by politicians themselves, especially Republicans who insist that their efforts to “increase mental health access and provide tools to crisis intervention officers” are an acceptable alternative to comprehensive gun control reform. It doesn’t help that, following the Uvalde shooting, Fox News gave a platform to guests who suggested distributing bulletproof “ballistic blankets” to school children as an effective means of curbing gun violence.
Axon’s taser-drone project would apparently incorporate integrated camera networks that share real-time sensor access with local public safety agencies. The company says it’s not yet sure who would operate the drones: police departments, federal agencies, or Axon employees themselves.
A limited pilot project last year was panned by the company’s own Artificial Intelligence ethics advisory board, which voted against moving forward. “Reasonable minds can differ on the merits of police-controlled Taser-equipped drones – our own board disagreed internally – but we unanimously are concerned with the process Axon has employed regarding this idea of drones in school classrooms,” the board said.
But Smith doesn’t care. He told the BBC that, pending regulatory approvals, a “proof-of-concept” model could be ready within a year and field trials could be possible in two.
In a statement to the Associated Press, a New York University law professor who sits on the Axon AI Ethics Board called the idea “dangerous and fantastical.” “This particular idea is crackpot,” said Barry Friedman. “Drones can’t fly through closed doors. The physical properties of the universe still hold. So unless you have a drone in every single classroom in America, which seems insane, the idea just isn’t going to work.”