Refusing to work on AI

Debi

Google Employees Revolt, Refuse To Work On Clandestine AI Drone Project For The Pentagon

Around a dozen Google employees have quit and close to 4,000 have signed a petition over the company's involvement in a controversial military pilot program known as "Project Maven," which will use artificial intelligence to speed up analysis of drone footage.

Project Maven, a fast-moving Pentagon project also known as the Algorithmic Warfare Cross-Functional Team (AWCFT), was established in April 2017. Maven’s stated mission is to “accelerate DoD’s integration of big data and machine learning.” In total, the Defense Department spent $7.4 billion on artificial intelligence-related areas in 2017, the Wall Street Journal reported.

The project’s first assignment was to help the Pentagon efficiently process the deluge of video footage collected daily by its aerial drones—an amount of footage so vast that human analysts can’t keep up. -Gizmodo

Project Maven will use machine learning to identify vehicles and other objects from drone footage - with the ultimate goal of enabling the automated detection and identification of objects in up to 38 categories - including the ability to track individuals as they come and go from different locations.


Project Maven’s objective, according to Air Force Lt. Gen. John N.T. “Jack” Shanahan, director for Defense Intelligence for Warfighter Support in the Office of the Undersecretary of Defense for Intelligence, “is to turn the enormous volume of data available to DoD into actionable intelligence and insights." -DoD

The internal revolt began shortly after Google revealed its involvement in the project nearly three months ago.

Some Google employees were outraged that the company would offer resources to the military for surveillance technology involved in drone operations, sources said, while others argued that the project raised important ethical questions about the development and use of machine learning. -Gizmodo

The employees who resigned cited a range of frustrations, from ethical concerns over the use of AI in a battlefield setting to larger concerns over Google's political decisions as a whole.

More at site
 
That was a brave decision for those employees to resign after revealing that project. It's probably the best job they'll ever have. But at least they gave it up for an honourable cause.

Google definitely crossed a line there.
I think lawmakers are going to need to make Isaac Asimov's three laws of robotics into an actual law that AI developers have to abide by.
 
Sadly, a law won't do any good. We are sliding head first down the drain!
 
They are sociopathic weapons :eek:
Yup. They don’t understand the meaning of life because AI will never be truly alive. So they’ll disregard anything we call living.
 
This tech has been around for a while. It started with the Phalanx systems years ago: computers automatically take over the weapons systems when the outer defenses are breached and it's too much for humans to handle. It was possibly also done with the nuke arsenal, as was hinted at during the Cold War era with the "MAD scenario" (mutually assured destruction), which supposedly prevented a sneak attack and such. Anyway, this is just a continuation of what is already in use. It just makes you wonder how smart these systems are becoming. If people are sacrificing their careers, then just what exactly are we developing here?
 