Robots and the future of warfare

By Adam Seline

Warfare in the 21st century has become progressively more automated, placing ever-greater emphasis on computers and robots. The skies are no longer dominated by expensive jet fighters, such as the American F-22, but increasingly by near-silent Predator drones, unmanned aircraft that are far less expensive to manufacture and considerably more expendable.

Unmanned craft have become very popular in military circles, as technological developments in recent years have made them viable in ever more diverse fields of operation. Longer battery life, greater resilience, and more reliable computer interfaces, as well as budding progress in the realm of artificial intelligence, make such tools very desirable for use in hostile climes where human personnel would be in mortal danger. Scientists have developed increasingly varied uses for these robotic operatives, creating ever more advanced and adaptable machines that can react with greater semblances of intelligence.

Researchers at the Georgia Institute of Technology have taken a step into what may well be the future of espionage, having developed a robot that can deceive its adversaries, according to their study, published in the International Journal of Social Robotics. This marks a new step in the development of artificial intelligence programmes, with sophisticated algorithms now allowing robots to decide for themselves when and how to act.

Using interdependence theory and game theory, Dr Ronald Arkin, the lead researcher on the project, developed algorithms to teach a robot how to recognise scenarios warranting the use of deception tactics. Georgia Tech’s robot is designed to deceive enemy soldiers by creating false trails and hiding so as to evade capture.
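In broad terms, such an approach treats each encounter as a game: deception is only worth attempting when the robot and its pursuer are in conflict and each one’s outcome depends on what the other does. The sketch below illustrates that idea with a simple outcome matrix; the matrices, threshold, and function names are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def should_deceive(payoff_hider, payoff_seeker, dependence_threshold=0.0):
    """Decide whether a situation warrants deception.

    Modelled loosely on the game-theoretic framing described in the article:
    deception is considered only when the interaction exhibits both
      * conflict   -- the two agents prefer different joint outcomes, and
      * dependence -- the hider's payoff changes with the seeker's action.
    All values here are illustrative assumptions.
    """
    payoff_hider = np.asarray(payoff_hider, dtype=float)
    payoff_seeker = np.asarray(payoff_seeker, dtype=float)

    # Conflict: the best joint outcome for the hider is not the best
    # joint outcome for the seeker.
    hider_best = np.unravel_index(np.argmax(payoff_hider), payoff_hider.shape)
    seeker_best = np.unravel_index(np.argmax(payoff_seeker), payoff_seeker.shape)
    conflict = hider_best != seeker_best

    # Dependence: for at least one of its own actions (rows), the hider's
    # payoff varies with the seeker's choice (spread across columns).
    dependence = np.ptp(payoff_hider, axis=1).max() > dependence_threshold

    return conflict and dependence


if __name__ == "__main__":
    # Rows: hider hides left / right. Columns: seeker searches left / right.
    hider = [[0, 1],   # 0 = caught, 1 = escapes
             [1, 0]]
    seeker = [[1, 0],  # the seeker wins exactly when the hider loses
              [0, 1]]
    print(should_deceive(hider, seeker))  # True: conflict and dependence present
```

When either condition fails, say a cooperative encounter with no conflict, the check returns False and a robot built along these lines would simply not bother to deceive.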

The robot was also taught to distinguish between differing scenarios and to employ a strategy suited to each. During trials, a “hider” robot, programmed with the deception algorithm, had to hide from a programmed “seeker” robot. Through deceptive tactics the “hider” was able to evade the “seeker” in 75 percent of trials.
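The trial itself can be pictured as a simple simulation: the hider picks a spot, lays a trail pointing somewhere else, and the seeker follows whatever evidence it finds. The toy version below is only a sketch; the locations, trail mechanics, and the seeker’s behaviour are assumptions for illustration, not the published experimental protocol.

```python
import random

def run_trial(locations=("A", "B", "C"), deceive=True, seed=None):
    """One hide-and-seek trial, loosely mirroring the trials described above.

    The hider picks a hiding spot; if deceiving, it lays a false trail toward
    a different spot. The seeker follows any trail it finds, otherwise guesses
    at random. Everything here is an illustrative assumption.
    """
    rng = random.Random(seed)
    hiding_spot = rng.choice(locations)

    if deceive:
        # Leave a trail pointing anywhere except the real hiding spot.
        false_trail = rng.choice([loc for loc in locations if loc != hiding_spot])
        searched = false_trail            # this seeker trusts the trail
    else:
        searched = rng.choice(locations)  # seeker guesses blindly

    return searched != hiding_spot        # True if the hider evaded capture


if __name__ == "__main__":
    trials = 1000
    evasions = sum(run_trial(deceive=True, seed=i) for i in range(trials))
    print(f"evaded in {evasions}/{trials} trials")
```

In this toy setup a fully trusting seeker is always fooled; the programmed seeker in the actual trials was evidently harder to mislead, which is why the reported evasion rate was 75 percent rather than 100.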

While Arkin’s research team expect their work to be useful in both the military and civilian spheres, they acknowledge the moral dilemma that robots capable of deception pose for people wary of unintended consequences.

If Hollywood has taught people nothing else, it is to fear machines that become dangerously clever. Despite the potential ethical conflicts, Georgia Tech’s work opens a new avenue of research, and its advances will hopefully prove a harbinger of a brighter future in the field of robotics.