As robots grow more autonomous, society needs to develop rules to manage them. A recent piece in the Economist explored the issue of robot ethics and some of the dilemmas that autonomous machines present.
A US military agency is set to announce a Grand Challenge for a new generation of humanoid robots to add to the country’s military arsenal.
The Pentagon’s Defense Advanced Research Projects Agency (DARPA) is turning its attention to legged humanoid robots. According to the robotics news portal Hizook, the agency will soon officially announce its new Grand Challenge for a robot able to “operate in an environment built for people and interact with made-for-human tools.”
DARPA wants the androids to be able to drive an open-frame utility vehicle such as a tractor. The task is to get into the driver’s seat and drive the vehicle to a specified location, then get out, maneuver to a locked door, unlock it with a key, open the door, and go inside.
Will anyone even consider that this might not go well?
We’re moving, as Johnson and colleagues put it, “from a mixed phase of humans and machines, in which humans have time to assess information and act, to an ultrafast all-machine phase in which machines dictate price changes.” We’re crossing a boundary into a trading twilight zone, and doing so without much thought or awareness of the potential dangers.