Editor’s note: Google, through “Idea Groups” and “Jigsaw,” has been tied to gassing civilians in Syria and now looks to be a primary partner with MI6 and the Mossad in hacking the US election and hanging the blame on Russia. The “Zuck-fool” at “AssBook” is nothing more than one of the thousands of toads we deal with every day.
Washington and London are populated with them. This creepy little story is far from an exposé, just more lazy reporting. And so it goes:
Thousands of Google employees have signed an open letter urging the tech giant not to work on a US government surveillance engine which could use artificial intelligence to improve the targeting of drone strikes.
“We believe that Google should not be in the business of war. Therefore we ask that Project Maven be cancelled and that Google draft, publicize, and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology,” reads the letter which was published by the New York Times on Tuesday.
It’s great that Google employees are protesting their company’s Pentagon AI drone research, but that’s hardly the only work Google does for militaries and law enforcement.
What about Google’s work with predictive policing contractors? What about the NSA? https://t.co/8UfdiGVeqc
— Yasha Levine (@yashalevine) April 5, 2018
It goes on to describe Project Maven as a “customized AI (artificial intelligence) surveillance engine that uses ‘wide area motion imagery’ data captured by US government drones to detect vehicles and other objects, track their motions and provide results to the Department of Defense.”
It states that although Google Board of Directors member Diane Greene had previously told staff the technology would not be used to operate or fly drones, or to launch weapons, it could still be used to assist in such tasks.
The letter also cites the “growing fears of biased and weaponized AI,” while stating that Google is “already struggling to keep the public’s trust.”
“This contract puts Google’s reputation at risk and stands in direct opposition to our core values. Building this technology to assist the US government in military surveillance – and potentially lethal outcomes – is not acceptable,” it says.
Marine Corps Col. Drew Cukor, who is part of the US government team working on Project Maven, said it “focuses on computer vision – an aspect of machine learning and deep learning – that autonomously extracts objects of interest from moving or still imagery.” He added that the only way to develop an AI system to fit the needs of the government is to have “commercial partners” working alongside Pentagon officials.