Exploring the risks of autonomous weapons and their increasing influence on daily life
“If we’re imagining a single ‘Terminator’ arriving at our doorstep, we might be focusing on the wrong threat,” says Matt Mahmoudi, an AI researcher with Amnesty International. “What we should really be watching out for are the more subtle ways in which these technologies are beginning to shape our everyday lives.”
Laura Nolan, a software engineer who previously worked at Google and now represents the International Committee for Robot Arms Control, shares a similar concern about lethal autonomous weapons systems (LAWS). “These weapons are intricately linked to surveillance systems,” she explains.
Nolan goes on to warn about the potential dangers of embedding societal and warfare logic into algorithms and automated processes. “When we start encoding these behaviors, it could lead to uncontrollable consequences,” she cautions.
Despite the concerns, Mahmoudi believes there is still hope for a ban on autonomous weapons, citing existing international prohibitions on chemical and biological weapons as precedent. “It’s not too late, but we must prioritize human values over data-driven agendas,” he asserts.
On UpFront, Marc Lamont Hill discusses the potential risks posed by autonomous weapons with experts Laura Nolan from the International Committee for Robot Arms Control and Matt Mahmoudi from Amnesty International.