Technology Warnings on autonomous weapons

Debi

http://www.wired.co.uk/news/archive/2015-07/27/musk-hawking-ai-arms-race

A global robotic arms race "is virtually inevitable" unless a ban is imposed on autonomous weapons, Stephen Hawking, Elon Musk and 1,000 academics, researchers and public figures have warned.

In an open letter presented at the International Joint Conference on Artificial Intelligence in Buenos Aires, the Future of Life Institute signatories caution that "starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control".

Although the letter, first reported by the Guardian, notes that "we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so", it concludes that "this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow".

Joining Professor Hawking and SpaceX founder Elon Musk as signatories to the letter are Apple cofounder Steve Wozniak, linguist Noam Chomsky, Skype cofounder Jaan Tallinn and Stephen Goose, director of Human Rights Watch's arms division.

The UK says it is not developing lethal AI, but the potential to build such weapons already exists and is developing fast -- a recent report into the future of warfare commissioned by the US military predicts "swarms of robots" will be ubiquitous by 2050. In response, experts and high-profile figures like Musk have made repeated calls to limit the development of deadly AI, even as peaceful autonomy grows more central to virtually every other area of tech and industry. The Future of Life Institute announced in June it would use a $10m donation from Elon Musk to fund 37 projects aimed at keeping AI "beneficial", with $1.5m dedicated to a new research centre in the UK run by Oxford and Cambridge universities.

The latest letter starts by defining autonomous weapons as those which "select and engage targets without human intervention", including quadcopters able to search for and kill people, but not remotely piloted missiles or drones. It also lists the arguments usually made in favour of such machines -- such as reducing casualties among soldiers.
 
There is still hope...
[Image attachment: j5_and_toronto.jpg]
 
I am afraid (pessimistic) that this "ban" will NOT occur.
Name one area of science that is off limits to the military-industrial complex; I can't.
Microbes, atomic, chemical, mechanization, air, sea, engineering, etc.: there is always a rationalization of the need (to save lives). Remember the justification for using two atomic bombs in WWII? To save lives, since an invasion would have cost too many lives. And I always wondered, what was the hurry? Our fleet was there, and I did not think the Japanese were going anywhere.
Scientists who value financial security and the chance to pursue their research with a fat checkbook behind them would also find this hard to turn down.
I just don't see this genie being kept in the bottle at all.
Paranoia does not help either.
My opinion.
 
The social responsibilities of the scientist

I can’t imagine the emotional struggle experienced by the nuclear physicists who worked on the science of the atomic bombs that ultimately led to the 1945 obliteration of Hiroshima and Nagasaki and the deaths of about 250,000 human beings within four months of the blasts. J. Robert Oppenheimer, most often credited as the “father of the atomic bomb,” was quoted in his 1967 New York Times obituary as having said:

“Scientists are not delinquents,” he added. “Our work has changed the conditions in which men live, but the use made of these changes is the problem of governments, not of scientists.”