The Tech Perspective with Lizzie Silver:

Technological Aspects and Tech Industry

This episode of Stay in Command focuses on the technological dimensions and concerns of lethal autonomous weapons, as well as their implications for the tech industry. Our guest, Dr Lizzie Silver, is a Senior Data Scientist at Melbourne-based AI company Silverpond.

Content in this episode:

Troubling reality of these weapons [1:49]

Problems with fully autonomous weapons - explainability [3:49]

Facial recognition and bias [7:11]

Military benefits from technical point of view [11:36]

Machines and the kill decision [15:01]

Hacking [16:30]

Positive uses of AI and funding battle [17:10]

Challenge of Dual Use [20:45]

Regulation: Treaty, Company Policy, Individual Actions [22:16]

If you have questions or concerns please contact us via [email protected]

If you want to know more, look for us on Facebook, Twitter and Instagram as Australia Campaign to Stop Killer Robots, or use the hashtag #AusBanKillerRobots.

Become part of the movement so we Stay in Command!

For access to this and other episodes, along with the full transcription and relevant links and information, head to https://safeground.org.au/podcasts/ (safeground.org.au/podcasts).

Transcript:

Welcome to SafeGround, the small organisation with big ideas working in disarmament, human security, climate change and refugees. I’m Matilda Byrne.

Thank you for tuning in to our series Stay in Command where we talk about lethal autonomous weapons, the Australian context and why we mustn’t delegate decision making from humans to machines.

This episode we’re looking at the “Tech Perspective”. We are going to discuss the technological concerns of lethal autonomous weapons and their implications for the tech industry.

And with me today I have a great guest in Dr Lizzie Silver. Lizzie is a Senior Data Scientist at Silverpond, an AI company based in Melbourne, which is also where I am coming to you from - so welcome Lizzie, thanks so much for joining us today.

Lizzie Silver [00:00:52]: Thanks for having me.

Matilda Byrne: Before we jump in, I’m just going to talk a bit about the definition of killer robots in case any of our listeners are unfamiliar with exactly what it is we’re talking about.

So killer robots, or fully autonomous weapons, are weapons with no human control over the decision making. When they select a target and engage that target - that is, decide to deploy lethal force on it - there is no human involved in that process; it is based purely on AI and algorithms. With these fully autonomous weapons there are concerns that span a number of different areas. Today we are going to go into the technological concerns in particular, because we have Lizzie and her expertise, but there are also moral, ethical, legal and global security concerns - a whole host of concerns really.

What is the most concerning thing about killer robots? [00:01:49]

Matilda Byrne: What I’m interested in, Lizzie, just to start off with, is if you could tell us what it is about fully autonomous weapons that you find the most worrying - what about them drives you to oppose their development?

Lizzie Silver: The fundamental issue with these systems is that you can’t give a guarantee on how they’re going to behave. With humans we can’t give a guarantee on how they’re going to behave either, but that’s why we have all these mechanisms for holding a human accountable. You can’t hold an algorithm accountable in any meaningful way. So what you would like to do is find a way to characterise how it’s going to behave in every situation, but the thing is, a conflict situation is just too complex. There are too many potential inputs and outputs, too many different scenarios that could confront the AI. You’re never going to get through all of them. You’re never going to be able to fully characterise the space. So what you’d like to say is ‘Ok, on this...

The podcast and the accompanying cover image on this page belong to John Rodsted. The content of the podcast was created by John Rodsted, and not by, or together with, Poddtoppen.