Research funded by DARPA is underway to create a "model of human norms" that gives machines a basic sense of manners and could eventually improve how unmanned systems interact with people across all sectors, including the military.
DARPA, the Pentagon's emerging-technology research agency, described the model in a recent announcement, noting that the algorithm can be coded into machines, enabling them to acquire additional norms of behavior in unfamiliar situations.
But what's in a norm?
DARPA program manager Dr. Reza Ghanadan said in an interview that his team collected data on what people considered normal behavior in certain locations and incorporated it into the machines' programming. For example, a robot might be coded to be quieter or silent in a library or theater.
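DARPA has not published the model's internals, but the core idea of location-dependent behavior rules can be sketched in a few lines of Python. Everything below, from the context names to the norm values and the fallback for unfamiliar places, is an illustrative assumption rather than the project's actual code.

```python
# Illustrative sketch only: the contexts, norms and fallback below are
# assumptions for explanation, not DARPA's actual model.

# Map each known context to the behavioral norms that apply there.
CONTEXT_NORMS = {
    "library": {"max_volume": "whisper", "movement": "slow"},
    "theater": {"max_volume": "silent", "movement": "minimal"},
    "street": {"max_volume": "normal", "movement": "normal"},
}

# Conservative defaults for contexts the machine has never seen.
DEFAULT_NORMS = {"max_volume": "quiet", "movement": "slow"}

def norms_for(context: str) -> dict:
    """Return the norms for a context, falling back to cautious
    defaults when the machine finds itself somewhere unfamiliar."""
    return CONTEXT_NORMS.get(context, DEFAULT_NORMS)

print(norms_for("library"))  # {'max_volume': 'whisper', 'movement': 'slow'}
print(norms_for("airport"))  # unfamiliar, so the cautious defaults apply
```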
The norms programmed into the current machines are the most basic, Ghanadan said. More research is needed to incorporate situational, cultural, hierarchical and more complex norms.
"We have a maturity level here; fundamental research is level 1 and you want to get to 3 or 4 before you deploy it," he said, referring to the strides science needs to take before this technology can be incorporated into public-use machinery.
He said the public will likely see the technology applied first to personal assistants such as Siri or Amazon Alexa, perhaps synced with calendar and location information so that a phone is silenced at the movies, except in an emergency.
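The announcement does not describe how such an assistant would work internally; the following is a minimal sketch of the silencing rule, assuming hypothetical calendar, location and emergency-call inputs.

```python
from typing import Optional

# Illustrative sketch only: the event names, location labels and
# emergency flag are assumptions, not Siri's or Alexa's internals.

def should_silence(calendar_event: Optional[str],
                   location: str,
                   call_is_emergency: bool) -> bool:
    """Silence the phone when the calendar or location suggests a
    movie, but always let an emergency call through."""
    if call_is_emergency:
        return False
    return calendar_event == "movie" or location == "cinema"

print(should_silence("movie", "cinema", call_is_emergency=False))  # True
print(should_silence("movie", "cinema", call_is_emergency=True))   # False
```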
From a defense perspective, the technology is nowhere near ready for a combat setting. However, the U.S. Air Force is looking to integrate a personal assistant into its Air Operations and Combined Air Operations Centers.
Dr. Bertram Malle, the project's lead researcher and a professor at Brown University, gave a more military-focused frame of reference for the research. He described military missions as operating not only toward defined goals but also within other contexts, including the laws of war, rules of engagement and the implicit norms of Army platoons or Navy SEAL teams.
"A robot that is part of a mission team must be aware of these norms and signal to the team members that it intends to follow them," he said.
Once the technology "grows up," Ghanadan sees it being used by robots and UAVs that interact with humans across all industries, letting the machines make people more comfortable with their presence and actions.
He gave the example of a UAV collecting survey data on an area. It would make a person uncomfortable if the UAV stopped at a bus stop and "stared" at them; a UAV with manners would know to move on and come back later. Ghanadan also sees the artificial intuition as useful for driverless cars as they interact with human drivers, whose vehicles behave very differently from robots: humans change lanes arbitrarily and react more slowly than robotic drivers do.
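None of this behavior comes from published project code, but a survey UAV's "move on and come back later" rule could be sketched along these lines; the loiter threshold and task queue are purely hypothetical details.

```python
from collections import deque

# Illustrative sketch only: the threshold and queue handling are
# assumptions about how a "polite" survey UAV might be written.

LOITER_LIMIT_S = 10.0  # hypothetical comfort threshold near a person

def plan_next_move(person_nearby: bool, loiter_seconds: float,
                   tasks: deque) -> str:
    """If the UAV has hovered near someone too long, defer the current
    survey point to the back of the queue and move on."""
    if person_nearby and loiter_seconds > LOITER_LIMIT_S:
        tasks.rotate(-1)  # push the current spot to the end of the queue
        return "move_on"
    return "continue_survey"

queue = deque(["bus_stop", "park", "intersection"])
print(plan_next_move(True, 12.0, queue))  # 'move_on'
print(list(queue))  # bus_stop deferred: ['park', 'intersection', 'bus_stop']
```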
"If we’re going to get along as closely with future robots, driverless cars, and virtual digital assistants in our phones and homes as we envision doing so today, then those assistants are going to have to obey the same norms we do," Ghanadan said in an announcement.
Malle said the model is not currently in use, but the project is continuing and leading to new grant proposals.