An Army-led research team developed new algorithms and filled in knowledge gaps about how robots contribute to teams and what robots know about their environment and teammates.
Integrating context into AI development is a difficult process. Researchers tend to have very different ideas about what is meant by context and about the best practices for integrating it into AI development and into what robots know.
Many researchers and institutions have looked at a smaller piece of this problem set.
This team, however, took a more holistic approach, developing and integrating different types of context, including environmental, mission and social context, to advance human autonomy teaming through advanced bidirectional communication capabilities.
This work has advanced the science of robotics and AI in the areas of natural language communication, world model development, multi-modal communication and human autonomy teaming.
The collaborative team seeks to identify critical scientific advances made by the Army’s Robotics Collaborative Technology Alliance, or RCTA, on techniques for developing and advancing context-driven artificial intelligence to support future human autonomy teams.
Strategic investments in Army-led foundational research as part of the RCTA resulted in advanced science in four critical areas of ground combat robotics that affect the way U.S. Warfighters see, think, move and team.
This research supports the Army Modernization Priority for the Next-Generation Combat Vehicle by advancing science for integrating context-driven artificial intelligence within human autonomy teams.
The integration of context-driven AI is important for future robotic capabilities to support the development of situation awareness, calibrate appropriate trust and improve team performance in collaborative human-robot teams.
Avenues of research discussed include how context enables robots to fill in the gaps to make more effective decisions, supports more robust behaviors, and augments robot communications to suit the needs of the team across a variety of environments, team organizations and missions.
The article’s findings will be used to support the laboratory’s continued research in the Human Autonomy Teaming Essential Research Program.
In particular, this research will help to develop effective bidirectional communication methods and interventions for calibrating appropriate team trust and shared situation awareness in high-risk, complex operations.