Google bars use of AI tech in weapons

Google has drafted new standards barring use of its artificial intelligence technology in weapons, after protests by employees against its US military work.

Google will not allow its artificial intelligence software to be used in weapons or unreasonable surveillance efforts, under new standards for its business decisions in the nascent field.

The restriction could help Google management defuse months of protest by thousands of employees against the company's work with the US military to identify objects in drone video.

Google instead will seek government contracts in areas such as cybersecurity, military recruitment and search and rescue, Chief Executive Sundar Pichai said in a blog post on Thursday.

"We want to be clear that while we are not developing AI for use in weapons, we will continue our work with governments and the military in many other areas," he said.

Breakthroughs in the cost and performance of advanced computers have carried AI from research labs into industries such as defence and health in the last couple of years.

Google and its big technology rivals have become leading sellers of AI tools, which enable computers to review large datasets to make predictions and identify patterns and anomalies faster than humans could.

But the potential of AI systems to pinpoint drone strikes better than military specialists or identify dissidents from mass collection of online communications has sparked concerns among academic ethicists and Google employees.

A Google official, requesting anonymity, said the company would not have joined the drone project last year had the principles already been in place.

Google plans to honour its commitment to the project through next March. More than 4,600 employees petitioned Google to cancel the drone project sooner, and at least 13 have resigned in recent weeks in protest.

Google's principles say it will not pursue AI applications intended to cause physical injury, that tie into surveillance "violating internationally accepted norms of human rights," or that present greater "material risk of harm" than countervailing benefits.

Published 8 June 2018 8:18am
Source: AAP