About Me

I have a burning need to know stuff and I love asking awkward questions.

Saturday, June 09, 2018

Google bans AI for weapon use

From The BBC

8 June 2018

Google has promised not to use AI for weapons, following protests over its partnership with the US military. A decision to provide machine-learning tools to analyse drone footage caused some employees to resign. Google told employees last week it would not renew its contract with the US Department of Defense when it expires next year. It has now said it will not use AI for technology that causes injury to people.

The new guidelines for AI use were outlined in a blog post from chief executive Sundar Pichai. He said the firm would not design AI for:

technologies that cause or are likely to cause overall harm
weapons or other technologies whose principal purpose is to cause or directly facilitate injury to people
technology that gathers or uses information for surveillance violating internationally accepted norms
technologies whose purpose contravenes widely accepted principles of international law and human rights

He also laid out seven more principles which he said would guide the design of AI systems in future:

AI should be socially beneficial
It should avoid creating or reinforcing bias
Be built and tested for safety
Be accountable
Incorporate privacy design principles
Uphold high standards of scientific excellence
Be made available for uses that accord with these principles

When Google revealed that it had signed a contract to share its AI technology with the Pentagon, a number of employees resigned and thousands of others signed a protest petition. Project Maven involves using machine learning to distinguish people and objects in drone videos. The Electronic Frontier Foundation welcomed the change of heart, calling it a "big win for ethical AI principles".

[If only Google had the kind of power to stop AI being used in weapons – which, of course, it doesn't. But it's still a good move on their part, although they may yet move forward secretly. AI can, by its very nature, be used for multiple functions, of course, so a benign AI can still be taught to fight drones or some such. The Google decision won't stop Judgement Day, but it might slow it down a bit. It's all they, and Sarah Connor, can hope for in the end. After all, I'm just too darned old to fight killer robots!]

2 comments:

Mudpuddle said...

as you say, a good move if they actually do it... but who'll ever know?

CyberKitten said...

I guess we'll never know for sure if they go ahead with their idea. If they secretly engage with military AI, we'll know when it turns on us, and then we can blame Google.