Research Projects

Human-Automation Collaboration Taxonomy (HACT)

HACT is a taxonomy that provides a new information-processing model for collaborative human-computer decision-making, defines specific collaboration roles (moderator, generator, and decider) and their characteristics, and represents collaboration in a direct-perception visualization. HACT can be used both to describe and to compare command and control systems from a collaboration standpoint. Future research based on HACT will incorporate design trade-off characterizations in order to provide system designers with a cost-benefit analysis tool.

S. Bruni, J.J. Marquez, A. Brzezinski, C. Nehme, and Y. Boussemart (2007). Introducing a Human-Automation Collaboration Taxonomy (HACT) in Command and Control Decision-Support Systems. 12th International Command and Control Research and Technology Symposium, Newport, RI, June 2007.

Office of Naval Research

Tracking Resource Allocation Cognitive Strategies (TRACS)

The Tracking Resource Allocation Cognitive Strategies (TRACS) tool allows post-hoc visualization of the cognitive steps exhibited by a human operator while interacting with a multivariate resource-allocation decision-support interface. The tool was applied both to mission planning for multi-criteria resource allocation in military strikes and to multivariable geospatial path planning for astronaut moon traversals. Both domains involve a human operator interacting with an automated decision-support system to solve a complex planning problem: a multivariate, constrained optimization of a cost function. With the help of TRACS, clear patterns of behavior were identified that correlated with performance in both applications.
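To make the underlying planning problem concrete, the following is a minimal sketch (not the actual TRACS tooling) of multi-criteria resource allocation as a weighted cost minimization under a constraint. The criteria names, weights, and the fuel budget are hypothetical illustrations.

```python
# Illustrative sketch of the class of planning problem TRACS was applied
# to: choosing among candidate allocations by minimizing a weighted
# multivariate cost, subject to a feasibility constraint.
# Criteria, weights, and the fuel budget below are hypothetical.

def allocation_cost(option, weights):
    """Weighted sum over the option's criteria (lower is better)."""
    return sum(weights[c] * option[c] for c in weights)

def best_feasible(options, weights, max_fuel):
    """Pick the lowest-cost option that satisfies the fuel constraint."""
    feasible = [o for o in options if o["fuel"] <= max_fuel]
    return min(feasible, key=lambda o: allocation_cost(o, weights))

options = [
    {"risk": 0.3, "time": 40, "fuel": 55},
    {"risk": 0.1, "time": 70, "fuel": 80},  # infeasible: exceeds fuel budget
    {"risk": 0.2, "time": 50, "fuel": 60},
]
weights = {"risk": 100.0, "time": 1.0, "fuel": 0.5}

print(best_feasible(options, weights, max_fuel=70))
```

The point of tools like TRACS is that operators navigating such trade spaces exhibit identifiable sequences of steps (e.g., tightening one criterion at a time), which can be visualized and correlated with performance.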

Related papers:

S. Bruni, Y. Boussemart, M.L. Cummings, and S. Haro. Visualizing Cognitive Strategies in Time-Critical Mission Replanning. In Proceedings of HSIS 2007: ASNE Human Systems Integration Symposium, Annapolis, MD, USA, March 19-21, 2007.

Bruni, S., Marquez, J., Brzezinski, A., & Cummings, M.L., Visualizing Operators' Cognitive Strategies in Multivariate Optimization, Proceedings of HFES 2006: 50th Annual Meeting of the Human Factors and Ergonomics Society, San Francisco, CA, USA, October 16-20, 2006.

Bruni, S., & Cummings, M.L., Tracking Resource Allocation Cognitive Strategies for Strike Planning, COGIS 2006 - Cognitive Systems with Interactive Sensors, Paris, France, 2006.

Office of Naval Research


Combat System of the Future

This project is a continuation of research for the Combat System of the Future, a naval submarine expected to be operational in 20-25 years. The main focus of current research is the design of a Mobile Situational Awareness Tool (MSAT) to aid the commander in collision avoidance during surface operations. A Cognitive Task Analysis has been completed, and the resulting informational requirements were used in the design of the MSAT. Testing is currently underway to determine the tool's added benefit compared with current navigation methods, including the usefulness of an automatic path planner for path planning and re-planning.
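As an illustration of what an automatic path planner of this kind does, the following is a minimal grid-based A* search that routes around blocked cells. This is a generic sketch, not the MSAT planner; the grid, obstacle layout, and unit-cost model are hypothetical.

```python
# Minimal A* path-planning sketch: find a collision-free route on a grid
# (1 = obstacle), the kind of planning/re-planning an automatic path
# planner performs. Hypothetical grid and cost model for illustration.
import heapq

def astar(grid, start, goal):
    """Shortest 4-connected path from start to goal avoiding obstacles."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    frontier = [(h(start), 0, start, [start])]  # (f, g, position, path)
    seen = set()
    while frontier:
        _, g, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (g + 1 + h((nr, nc)), g + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None  # no collision-free route exists

grid = [[0, 0, 0],
        [1, 1, 0],   # obstacle row forces a detour
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```

A re-planning cycle would simply re-run the search whenever new obstacle information (e.g., a detected surface contact) updates the grid.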

Related paper:

Carrigan, G. P. (2009). The Design of an Intelligent Decision Support Tool for Submarine Commander, S.M. Thesis, MIT Engineering Systems Division, Human Systems Engineering Track, Cambridge, MA.

Sponsored by Rite Solutions


Multimodal Interface Toolkit for UAV Systems (MITUS)

The visual channel is the primary modality for displaying information to unmanned aerial vehicle (UAV) operators. The focus of MITUS research has been to explore alternative modalities, and combinations of modalities, for displaying information to UAV operators. Are certain pieces of information better conveyed over the audio or haptic channel? What are the effects on performance and workload when information is parsed out or repeated over various modalities? This project has evaluated the use of continuous audio (sonifications), a tactor wrist vibrator, and a waist pressure band. All testing was conducted on HAL's Multiple Aerial Unmanned Vehicle Experiment (MAUVE) simulator. Results from an experiment with 44 military personnel show that using continuous audio in conjunction with visual displays does enhance operator performance. Further, preliminary studies have shown that continuous haptic feedback can also enhance performance, particularly for monitoring events that are continuous in nature (e.g., UAV course conformance).


StarVis: A Configural Decision Support Tool for Schedule Management of Multiple Unmanned Aerial Vehicles

As unmanned aerial vehicles (UAVs) become increasingly autonomous, current single-UAV operations involving multiple personnel could transition to a single operator simultaneously supervising multiple UAVs in high-level control tasks. These time-critical, single-operator systems will require advance prediction and mitigation of schedule problems to ensure mission success. However, actions taken to address current schedule problems may create more severe future problems. Decision support could help multi-UAV operators evaluate different schedule management options in real time and understand the consequences of their decisions. This thesis describes two schedule management decision support tools (DSTs) for single-operator supervisory control of four UAVs performing a time-critical targeting mission. A configural display common to both DSTs, called StarVis, graphically highlights schedule problems during the mission and provides projections of potential new problems based upon different mission management actions. This configural display was implemented in a multi-UAV mission simulation as two different StarVis DST designs, Local and Q-Global. In making schedule management decisions, Local StarVis displayed the consequences of potential options for a single decision, while the Q-Global design showed the combined effects of multiple decisions. An experiment tested the two StarVis DSTs against a no-DST control in a multi-UAV mission supervision task. Subjects using the Local StarVis performed better, with higher situation awareness and no significant increase in workload, than subjects in the other two conditions. The disparity in performance between the two StarVis designs is likely explained by the Q-Global StarVis projective "what if" mode overloading its subjects with information. This research highlights how decision support designs applied at different abstraction levels can produce different performance results.

Asymmetric Collaboration for Distributed Teams

Complex task domains such as emergency response and command and control often involve collaboration between operational personnel in the field and tactical personnel in a central command centre responsible for coordinating the efforts of those operational personnel. The asymmetries of their respective work environments, job responsibilities, available information, and situation constraints produce distinctly different technological requirements for potential support systems for these different personnel. A tactical actor in a command centre may exploit the benefits of a large, powerful computing system, but the operational actor in the field is restricted to using a small handheld device. We use a tabletop display to support the planning and coordination duties of the tactical actor, and a handheld device for the simpler map and schedule reading of the operational actor. The acute difference in display sizes, coupled with the other role-related differences, creates a very asymmetric type of remote collaboration.

Our research focuses on the design of interfaces for tabletop and handheld devices for synchronous time-critical collaboration, using urban search and rescue as a scenario. Our task analysis has led us to create interfaces for sharing three classes of information: maps, schedules, and forms. Maps provide spatial information about the incident, schedules present a temporal view of the team's plan, and forms allow essential data to be captured and disseminated. We treat these classes as shared workspaces that ground conversation between team members, and we provide workspace awareness mechanisms to support collaborative gesturing and editing. Continuing advancements in tabletop displays, mobile computing, and wireless networking set the stage for this work. Synchronous collaboration in mobile contexts offers significant benefits over asynchronous message passing, and the asymmetry we are considering raises new issues that are not apparent in conventional groupware.

Sponsored by Thales and European Commission FP6


Reducing Operator Workload in the Control of Multiple Unmanned Vehicles


In many important applications, current technologies require multiple human operators to control a single unmanned vehicle (UV). However, to (a) reduce costs and (b) extend human capabilities, it is desirable to invert this ratio so that one human operator can control multiple UVs. A human operator involved in one-to-many interactions does not have sufficient cognitive resources to perform low-level tasks on all UVs. Thus, low-level tasks must be offloaded to the automation, allowing the human operator to focus on higher-level tasks. In order to design systems that support this future mode of control, we are currently investigating:

  1. Given a mission and a team configuration, what is the system's expected performance?
  2. Given a mission and a team configuration, what is the operator's expected workload and how can this be mitigated?
  3. How will adjustments in UV autonomy levels (i.e., changes in team configuration) affect system performance and the operator's workload?
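One widely cited back-of-envelope model from the supervisory-control literature for the first two questions is the "fan-out" estimate, FO = NT/IT + 1, where neglect time (NT) is how long a UV performs acceptably without attention and interaction time (IT) is how long the operator needs to service one UV. The sketch below applies it with hypothetical timing numbers; it is an illustration of the modeling approach, not a result from this project.

```python
# Illustrative fan-out and utilization estimates for one-to-many UV
# control. FO = NT / IT + 1 is the classic fan-out equation; the timing
# values below are hypothetical.

def fan_out(neglect_time_s, interaction_time_s):
    """Rough upper bound on how many UVs one operator can supervise."""
    return neglect_time_s / interaction_time_s + 1

def operator_utilization(n_vehicles, interaction_time_s, neglect_time_s):
    """Fraction of time the operator is busy servicing vehicles."""
    cycle = interaction_time_s + neglect_time_s
    return min(1.0, n_vehicles * interaction_time_s / cycle)

print(fan_out(120, 30))                  # -> 5.0
print(operator_utilization(4, 30, 120))  # -> 0.8
```

Changing a UV's autonomy level (question 3) shows up in this model as longer neglect times or shorter interaction times, which raises fan-out and lowers utilization; richer models also account for wait times when several UVs need attention at once.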

Sponsored by Lincoln Labs, AAI Corporation and Charles River Analytics


Intelligent Support for Collaborative, Time-Sensitive Operations

Collaboration and distributed decision making are critical components of network-centric operations such as those needed for first-response teams, air traffic control, and military command and control. In these complex systems, it is critical to allow remotely located individuals and/or groups to leverage information both locally and globally in reaching decisions. However, since these systems necessarily contain high levels of automation, it is a fundamental human supervisory control problem to determine what roles or sharing of roles is effective, and how intelligent autonomy may improve or degrade time-sensitive team decisions. This research effort involves the development of technology to support collaborative decision making both among humans and between humans and computers. In particular, this project focuses on supporting teams of operators interacting with highly autonomous unmanned vehicle systems (UVSs) during time-sensitive intelligence, surveillance, and reconnaissance (ISR) missions. In addition, several assistive collaboration technologies are currently under development, including activity awareness interface technologies and interruption assistance technologies to facilitate the planning and coordination activities of both individual team members and team supervisors.

Participants: Boeing Phantom Works, Thales, Air Force Research Lab, Charles River Analytics, Office of Naval Research, University of Central Florida's MIT2 Lab


Interruption Recovery in Time-Critical Human-Supervisory Settings

The goal of this research is to evaluate the effectiveness of an Interruption Recovery Assistant tool in reducing the negative effects of interruptions on recovery time, decision accuracy, and overall task performance of team supervisors in a simulated futuristic unmanned aerial vehicle team task environment.

This research is funded by Boeing
