Following Professor Noel Sharkey’s warning that the fleet of robots being developed for DARPA will ultimately be used to “kill people,” Boston Dynamics has released a new video showcasing how its LS3 robot is able to autonomously track humans over rugged terrain.
The video clip shows a field test of the DARPA Legged Squad Support System autonomously following a man through a forest. The robot is also able to interpret visual and verbal commands.
“We’ve refined the LS3 platform and have begun field testing against requirements of the Marine Corps,” Army Lt. Col. Joe Hitt, DARPA program manager, told Fox News. “The vision for LS3 is to combine the capabilities of a pack mule with the intelligence of a trained animal.”
Although Boston Dynamics and DARPA say the robots are being designed to help conduct humanitarian and relief missions, Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield, recently warned that the true purpose of the machines is less benign.
Speaking about the Cheetah, a similar robot currently being perfected by Boston Dynamics, Sharkey said the device represented “an incredible technical achievement, but it’s unfortunate that it’s going to be used to kill people.”
“It’s going to be used for chasing people across the desert, I would imagine. I can’t think of many civilian applications – maybe for hunting, or farming, for rounding up sheep,” Sharkey added.
“But of course if it’s used for combat, it would be killing civilians as well, as it’s not going to be able to discriminate between civilians and soldiers.”
As we reported last week, the Cheetah recently surpassed the human footspeed record set by Usain Bolt in 2009, reaching a running speed of 28.3 miles per hour.
Sharkey has previously warned that the world may be sleepwalking into a potentially lethal technocracy and has called for safeguards on such technology to be put into place.
In 2008, Professor Sharkey told the Alex Jones Show:
“If you have an autonomous robot then it’s going to make decisions who to kill, when to kill and where to kill them. The scary thing is that the reason this has to happen is because of mission complexity and also so that when there’s a problem with communications you can send a robot in with no communication and it will decide who to kill, and that is really worrying to me.”
The professor also warned that such autonomous weapons could easily be used in the future by law enforcement officials in cities, pointing out that South Korean authorities are already planning to have a fully armed autonomous robot police force in their cities.
Paul Marks at the New Scientist pointed out that such proposals are concerning because the technology will inevitably be adapted for domestic purposes such as policing and crowd control.
“…how long before we see packs of droids hunting down pesky demonstrators with paralysing weapons? Or could the packs even be lethally armed?” Marks asked.
In 2008, the Pentagon issued a request to contractors to develop a “Multi-Robot Pursuit System” designed to search for, detect and track “non-cooperative” humans in “pursuit/evasion scenarios”.
This again illustrates how, behind the veneer of relief missions and humanitarianism, the true purpose these robots are ultimately designed to fulfil centres on tracking, pursuing and even killing humans, in the context of both law enforcement and combat.