Nations in a Race to Develop
Efficient Killing Machines
By JOHN MARKOFF
FORT BENNING, Georgia
WAR WOULD BE a lot safer, the United States Army says, if only more of it were fought by robots. Smart machines are already a part of modern warfare, but the Army and its contractors are eager to add more to handle a broader range of tasks, from picking off snipers to serving as indefatigable night sentries. In a mock city here used by Army Rangers for urban combat training, a 38-centimeter robot with a video camera scuttles around a bomb factory on a spying mission.
Overhead an almost silent drone aircraft with a 1.2-meter wingspan transmits images of the buildings below. Onto the scene rolls a sinister-looking vehicle on tank treads, about the size of a riding lawn mower, equipped with a machine gun and a grenade launcher.
Three backpack-clad technicians, standing out of the line of fire, operate the three robots with wireless video-game-style controllers. One swivels the video camera on the armed robot until it spots a sniper on a rooftop. The machine gun pirouettes, points and fires in two rapid bursts.
Had the bullets been real, the target would have been destroyed. The machines, viewed at a “Robotics Rodeo” in October at the Army’s training school here, not only protect soldiers, but also are never distracted, using an unblinking digital eye, or “persistent stare,” that automatically detects even the smallest motion. Nor do they ever panic under fire.
Fifty-six nations are now developing robotic weapons, said Ron Arkin, a Georgia Institute of Technology roboticist. Among the companies building them is iRobot, which makes robots that clear explosives as well as the Roomba robot vacuum cleaner; its chief operating officer is a former vice admiral. Yet the idea that robots might someday replace or supplement soldiers is still a source of controversy.
Because robots can stage attacks with little risk to the people who operate them, opponents say that robots lower the barriers to warfare. “Wars will be started very easily and with minimal costs,” said Wendell Wallach, a scholar at the Yale Interdisciplinary Center for Bioethics and chairman of its technology and ethics study group.
Civilians will be at greater risk, critics argue, because of the difficulties of distinguishing between fighters and bystanders. This problem has already arisen with Predator aircraft, which find their targets with the aid of soldiers on the ground but are operated from the United States.
Civilians in Iraq and Afghanistan have died as a result of collateral damage or mistaken identities. But among the supporters of robot combatants are even some human rights advocates. “A lot of people fear artificial intelligence,” said John Arquilla, executive director of the Information Operations Center at the Naval Postgraduate School.
“I will stand my artificial intelligence against your human any day of the week and tell you that my A.I. will pay more attention to the rules of engagement and create fewer ethical lapses than a human force.” Dr. Arquilla argues that weapons systems controlled by software will not act out of anger and malice and, in certain cases, can already make better decisions on the battlefield than humans.
“Some of us think that the right organizational structure for the future is one that skillfully blends humans and intelligent machines,” Dr. Arquilla said. “We think that that’s the key to the mastery of 21st-century military affairs.” Automation has proved vital in the wars America is fighting.
In the air in Iraq and Afghanistan, unmanned aircraft with names like Predator, Reaper, and Raven have kept countless soldiers from flying sorties. Moreover, the military now routinely uses more than 6,000 tele-operated robots to search vehicles at checkpoints as well as to disarm one of the enemies’ most effective weapons: the I.E.D., or improvised explosive device.
Mr. Arkin, the Georgia Institute of Technology roboticist, has argued that it is possible to design “ethical” robots that conform to the laws of war and the military rules of escalation. But the ethical issues are far from simple.
In October in Germany, an international group including artificial intelligence researchers, arms control specialists, human rights advocates and government officials called for agreements to limit the development and use of tele-operated and autonomous weapons.
The group, known as the International Committee for Robot Arms Control, said warfare was accelerated by automated systems, undermining the capacity of human beings to make responsible decisions. For example, a gun that was designed to function without humans could shoot an attacker more quickly and without a soldier’s consideration of subtle factors on the battlefield.
“The short-term benefits being derived from roboticizing aspects of warfare are likely to be far outweighed by the long-term consequences,” said Mr. Wallach, the Yale scholar, suggesting that wars would occur more readily and that a technological arms race would develop. As the debate continues, so do the Army’s automation efforts.
In 2001 Congress gave the Pentagon the goal of making one-third of the ground combat vehicles remotely operated by 2015. That seems unlikely, but there have been significant steps in that direction. For example, a wagonlike Lockheed Martin device that can carry more than 450 kilograms of gear and automatically follow a platoon at up to 27 kilometers per hour is scheduled to be tested in Afghanistan early next year.
For rougher terrain away from roads, engineers at Boston Dynamics are designing a walking robot to carry gear. Scheduled to be completed in 2012, it will carry 180 kilograms as far as 32 kilometers, automatically following a soldier. The four-legged modules have an extraordinary sense of balance, can climb steep grades and even move on icy surfaces.
The robot’s “head” has an array of sensors that give it the odd appearance of a cross between a bug and a dog. Indeed, an earlier experimental version of the robot was known as Big Dog.
In November the Army and the Australian military held a contest for teams designing mobile micro-robots, some no larger than model cars, that, operating in swarms, can map a potentially hostile area, accurately detecting a variety of threats. Military technologists assert that tele-operated, semi-autonomous and autonomous robots are the best way to protect the lives of American troops.
Army Special Forces units have bought six lawn-mower-size robots, the type showcased in the Robotics Rodeo, for classified missions, and the National Guard has asked for dozens more to serve as sentries on bases in Iraq and Afghanistan. These units are known as the Modular Advanced Armed Robotic System, or Maars, and they are made by a company called QinetiQ North America. The Maars robots first attracted the military’s interest in 2008.
Used as a nighttime sentry against infiltrators equipped with thermal imaging vision systems, the battery-powered Maars unit remained invisible (it did not have the heat signature of a human being) and could “shoot” intruders with a laser tag gun without being detected itself, said Bob Quinn, a vice president at QinetiQ.
To follow military rules of engagement, Maars has more recently been equipped with a loudspeaker as well as a launcher, so it can issue warnings and fire tear gas grenades before firing its machine gun. Remotely controlled systems like the Predator aircraft and Maars are only a step removed from the fully automated warfare that worries critics.
What happens, ask skeptics, when humans are taken out of decision making on firing weapons? “If the decisions are being made by a human being who has eyes on the target, whether he is sitting in a tank or miles away, the main safeguard is still there,” said Tom Malinowski, Washington director for Human Rights Watch, which tracks war crimes. “What happens when you automate the decision? Proponents are saying that their systems are win-win, but that doesn’t reassure me.”
Photo captions: The United States Army is eager to add more robots to its fighting ranks. An armed robot, called Maars, maneuvers at a training site. Most armed robots are operated with video-game-style consoles, helping to keep humans away from danger.