Sensors and sensibility: Chris Middleton sets out the case for and against autonomous systems being used on the battlefield.

Six companies have been awarded a share of a £3 million funding pot to develop semi-autonomous concept demonstrators for the British Army, the government has announced.

The competition, run by the UK’s Defence and Security Accelerator (DASA), sought collaborative ideas for semi-autonomous unmanned reconnaissance systems controlled from manned mobile positions – a concept known as Manned-Unmanned Teaming (MUM-T).

The funded projects will be led by Leonardo, General Dynamics (UK), QinetiQ, Horiba-MIRA, SCISYS, and Tekever. The Defence Science and Technology Laboratory (DSTL) will act as technical partner on the programmes in the lead-up to live demonstrations in April next year.

The winners were announced earlier this month at Defence & Security Equipment International (DSEI), the biennial arms fair in London. This year saw protesters attempt to disrupt the fair amid concerted efforts to force the 2021 event out of the capital.

Protests, led by the Mayor of London, Sadiq Khan, focused on the presence of weapons systems in the heart of the city and on delegations from Saudi Arabia and other regimes with poor human rights records. Concerns were also raised about the rise of autonomous military systems and the ‘gamification’ of warfare.

The UK programme is certainly part of a global trend among armed forces to team troops with sensor-filled ground robots, unmanned aerial vehicles (UAVs, or drones), and other autonomous or semi-autonomous systems to create battlefield awareness and tactical advantage. For example, some US troops already use Black Hornet personal reconnaissance drones – miniature, camera-equipped rotorcraft.

The US Army has a number of active programmes to team soldiers with robots. The US Army Research Laboratory (ARL) is researching intelligent systems that, it says, are able to: understand the environment; learn from experience; adapt to dynamic situations; possess a common world view; communicate naturally; perform useful functions; and act independently “within well-prescribed bounds”.

The Lab wants soldiers to be able to communicate with robots swiftly and intuitively, rather than via onscreen point-and-click interfaces, and to minimise the amount of heavy equipment that troops have to carry. Accordingly, it is also advancing robotic and AI systems’ capacity for: abstract reasoning; learning by example; reinforcement learning; semantic perception; natural language communication; human behaviour modelling; manipulation; and agile 3D mobility.

As part of this work, the ARL has also demonstrated an immersive augmented reality environment in which remote operators blend real-time data from drones and ground robots to give soldiers a virtual picture of the battlefield.
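The kernel of any such ‘common world view’ is merging readings from many platforms onto one shared timeline. As a toy illustration only – none of the names below come from ARL’s systems, and a real system would fuse poses, imagery, and maps rather than strings – the idea looks something like this:

```python
import heapq
from typing import Iterator, NamedTuple

class Reading(NamedTuple):
    timestamp: float   # seconds since mission start
    source: str        # e.g. "uav-1", "ugv-2" (hypothetical platform IDs)
    observation: str   # simplified payload for illustration

def merge_feeds(*feeds: list[Reading]) -> Iterator[Reading]:
    """Interleave per-platform telemetry into one time-ordered stream.

    This toy version just orders readings by timestamp, which is the
    essence of building a single, shared picture from many sensors.
    """
    yield from heapq.merge(*feeds, key=lambda r: r.timestamp)

uav = [Reading(1.0, "uav-1", "vehicle spotted at grid 41-07"),
       Reading(4.5, "uav-1", "vehicle moving north")]
ugv = [Reading(2.2, "ugv-2", "road clear to checkpoint"),
       Reading(3.8, "ugv-2", "obstacle at bridge")]

for reading in merge_feeds(uav, ugv):
    print(f"t={reading.timestamp:>4}  {reading.source}: {reading.observation}")
```

The principle – many sensors, one shared timeline – is what turns scattered telemetry into a picture an operator can act on.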

Proponents of this type of research would say technologists are creating life-saving systems for troops, while opponents would point to the step-by-step creation of robots that can reason and think independently – the classic Terminator scenario.

At the root of the controversy about robotic warfare is the question of autonomy. In many fields, such as space exploration, autonomous systems are developed primarily to allow robots to carry out preset missions in environments where real-time operator control is impractical – the radio time-lag between Earth and Mars, roughly three to 22 minutes each way depending on the planets’ relative positions, makes it impossible to control a rover from Earth with a joystick.
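To put that time-lag in perspective, a back-of-the-envelope calculation – sketched in Python below, using approximate published Earth–Mars distances – shows the one-way signal delay:

```python
# Rough illustration of why joystick control of a Mars rover is impossible:
# one-way light time between Earth and Mars at closest and farthest approach.
# Distances are approximate public figures, used purely for illustration.

SPEED_OF_LIGHT_KM_S = 299_792  # km/s

distances_km = {
    "closest approach": 54_600_000,    # ~54.6 million km
    "farthest approach": 401_000_000,  # ~401 million km
}

for label, km in distances_km.items():
    one_way_minutes = km / SPEED_OF_LIGHT_KM_S / 60
    print(f"{label}: ~{one_way_minutes:.0f} min one way, "
          f"~{2 * one_way_minutes:.0f} min round trip")

# closest approach: ~3 min one way, ~6 min round trip
# farthest approach: ~22 min one way, ~45 min round trip
```

Even at closest approach, a rover would respond to a joystick input around six minutes after the operator made it – hence preset missions and onboard autonomy.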

In applications like these, a human remains in the loop: people set the mission and review the results, even if the machine handles moment-to-moment execution. Shared-control and operator-assistance technologies offer further benefits, helping humans and machines to collaborate effectively in a variety of settings.
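What ‘human in the loop’ means in practice can be shown with a deliberately simplified sketch – every name here is hypothetical, not taken from any real system. The machine may propose and rank actions, but execution waits for a person:

```python
from dataclasses import dataclass

@dataclass
class ProposedAction:
    description: str
    risk_level: str  # e.g. "low", "high"

class HumanInTheLoopController:
    """Hypothetical sketch: the robot plans, but a person decides.

    Nothing here reflects a real military or space system; it simply
    shows where the human sits in a human-in-the-loop design.
    """

    def execute(self, action: ProposedAction) -> bool:
        # The machine may suggest, rank, and explain actions...
        print(f"Robot proposes: {action.description} (risk: {action.risk_level})")
        # ...but the decision to act always passes through a person.
        answer = input("Operator, approve this action? [y/N] ")
        approved = answer.strip().lower() == "y"
        print("Executing." if approved else "Standing down.")
        return approved

controller = HumanInTheLoopController()
controller.execute(ProposedAction("survey ridge with onboard camera", "low"))
```

The point of the sketch is structural: however capable the planner becomes, the authorisation step remains a human one.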

All of these principles apply to the defence sector, where it is fair to acknowledge that not all systems are weapons. For example, removing the need for surveillance helicopters or cargo planes to carry human crews reduces weight and significantly increases the space available for fuel or supplies: autonomous aircraft can fly further for longer, and deliver more supplies to more people.

But of course, some autonomous platforms carry different payloads, including weapons – and semi-autonomous warplanes are in active development and service. This, together with fears that smart systems may one day cut humans out of decisions, has given rise to intense debate over the role of autonomy in warfare. Each innovation may appear justifiable in isolation – to solve a specific problem – while being a small step on a more dangerous journey towards the proverbial killer robot.

The dilution of moral agency in life-and-death decisions and the gamification of warfare certainly move humans further and further away from experiencing the consequences of their actions. Here the real hazards are ethical, with related issues including AI’s challenge to legal concepts such as liability.

Should a machine be allowed to take a life without a human making or owning that decision? Ethicists would draw the line at that point. However, military personnel might argue that they are merely keeping troops safe, better informed, and better equipped against opposing forces – forces that may themselves be even better equipped.

In announcing the UK competition winners, Major General Jeremy Bennett, the British Army’s Director of Capability, said, “The Army’s commitment to innovation and UK prosperity has been reinforced in the Army Warfighting Experiment 19. We will work with both the Wildcat prime contractor, Leonardo, and the Ajax manufacturer, General Dynamics (UK), to integrate the control station for UAVs into these platforms.

“Building on previous investment with QinetiQ and Horiba-MIRA, we will show how high levels of automation will reduce the cognitive burden for vehicle commanders and helicopter crews. Finally, two consortiums of small and medium-sized enterprises headed by SCISYS and Tekever will explore the benefits of open architectures and operating UAVs beyond visual range.”

Last year, the British Army tested a range of drones and other unmanned systems on Salisbury Plain to tackle the problem of delivering supplies to frontline troops. In June 2018, the Chief of the General Staff, General Mark Carleton-Smith, outlined a new focus on emerging technologies and announced a programme of ‘Autonomous Warrior’ vehicle tests – an odd choice of name for a project that is supposedly about delivering supplies.
