While the advantage of remotely operating a direct-fire weapon such as a machine gun or sniper rifle is obvious, remote weapons can also make small insurgent bands seem stronger and better equipped than they are. The report covers one instance in which Kurdish troops attacked an Islamic State remote-controlled sniper rifle, losing men in the process while the shooter remained protected in a bunker nearby. Instead of posting men to guard the remote weapon, the Islamic State tied up dogs around the system.
Experts are increasingly impressed by, and worried about, the technological sophistication of the tele-operated weapons used by militant groups in Iraq and Syria, especially ISIS. Mowing down Kurdish fighters with a remote-operated gun guarded by dogs seems like a perfect combination of ISIS's patented brutality and technical ingenuity.
But really, it's only the dogs, and the crude-but-successful nature of the operation, that set this apart from the broader way violence is changing: becoming more remote, and even automated. It's one of the main issues of our day, and I don't think we'll ever have a real discussion about it, since we rush forward, and any attempt to say we shouldn't have a weapon is lost in a deafening drum circle of retrograde chest-thumping.
Look, for example, at the Modular Advanced Armed Robotic System. Even the name makes it sound like a bad idea, but saying "this will protect the lives of Marines" (which could very well be true!) makes its deployment essentially a done deal. It requires human control now, but that's just the first step.
There will come a point, and soon, when robots make the kill decision algorithmically. I'm not afraid of a robot takeover, but I am apprehensive about machines making what are essentially moral decisions. I'm also concerned that adopting ever-higher tech makes it impossible for us to condemn it and keep it out of the hands of even worse actors. But barring high-level political action, I don't see this as a road off of which we're going to veer.