Last week the Future of Life Institute released an open letter calling for an international ban on autonomous artificially intelligent weapons. Among its 1,500 initial signatories from the technology and robotics research communities were Stephen Hawking, Elon Musk and Steve Wozniak.
According to Scientific American, the signatures now number over 17,000 and include those of current and former presidents and members of the Association for Computing Machinery, the American Association for Artificial Intelligence, and the IEEE Robotics & Automation Society. Many scientists and researchers at AI and robotics companies have also signed the open letter, including members of Google’s DeepMind and IBM’s Watson team.
These leaders in technology and robotics argue that a ban on autonomous weapons is necessary because of the serious threats such weapons pose to humanity. AI algorithms could lack both the ability to distinguish combatants from civilians and the situational understanding to determine whether violent force is necessary or proportionate.
These are requirements that international law places on humans who fire weapons, but AI weapons would not be able to make such discriminations as humans do, opening up an “accountability gap.”
Heather Roff, a political scientist and visiting professor at the University of Denver, worries that the use of AI weapons could lead to a “new class of blameless atrocity” in which the killing of innocents could easily be waved off as a mistake, Popular Science reports.
“Suddenly everything becomes an accident,” said Roff. “There is no more definition of war crime, because there’s no intention.”
Meanwhile, scientists and religious leaders are questioning the ability of AI to form religious and spiritual beliefs, the Daily Mail reports.
Dylan Love recently published a report on The Daily Dot concerning AI and religion. Many religious leaders believe that AI can be programmed to believe in religion, but they debate whether this would be beneficial to mankind. Some leaders believe it is entirely possible for artificially intelligent robots to have souls.
“What humans have is a more complex and larger brain than any other animal – maybe a whale’s brain is physically large, but it’s not structurally more complex than ours,” said Marvin Minsky, an MIT professor and pioneer in the field of artificial intelligence. “If you left a computer by itself, or a community of them together, they would try to figure out where they came from and what they are.”