16.422 Quiz 2

1. Introduction and Domain Comparison

The three examples selected (aircraft, highway vehicles, and teleoperators) are superficially similar, in that all involve the direction of moving apparatus under the shared control of humans and machines. There are, however, significant differences in both the extent and nature of the automation that is desirable and possible in each circumstance. To contrast the cases, it is instructive to first compare their problem domains.

1.1 Aircraft

The distinctive characteristics of manned aerial vehicles are as follows:

- 3-dimensional spatial environment. This has a positive consequence in the form of increased flexibility for obstacle avoidance, but a negative one in the inability to simply cease motion at arbitrary points.
- Relatively large response time window, typically many times the human reflex response threshold.
- High probability of fatality in the event of an accident.
- High impact of environmental effects (weather).
- Relatively low external traffic.
- Dynamic external coordination (ATC).
- High predictability of external traffic behaviour (e.g. radio).
- Low-frequency and predictable (mostly terrain-based) static obstacle arrangement.
- Low required tracking precision.
- High system complexity.
- High level of operator training.
- Designed for instrument flight.

1.2 Highway Vehicles

The distinctive characteristics of manned highway vehicles are as follows:

- 2-dimensional spatial environment. Maneuverability is hence limited, but the operator can usually choose to coast safely to a halt.
- Relatively small response time window, often at or below the human reflex response time.
- Lower probability of fatality in the event of an accident.
- Lower impact of all but severe environmental effects.
- High external traffic.
- Mostly static external coordination (signs, markings).
- Lower predictability of external traffic behaviour (mostly visual observation only).
- High-frequency and less predictable static obstacle arrangement.
- High required tracking precision.
- Low system complexity.
- Low level of operator training.
- Instrument-only driving not currently possible.

1.3 Teleoperators

Teleoperators are a broad class that encompasses both of the above categories, with additional defining characteristics relating to their teleoperated status, and with many other possible application domains, including undersea, space, and all-terrain vehicles, and non-vehicular devices such as robot arms and other actuators. Some of the major characteristics of teleoperators are as follows:

- Spatial domain depends on application; includes 2-D and 3-D.
- Often significant sense/report/action lag time.
- Variable sensor/feedback fidelity.
- Little or no probability of fatality in the event of malfunction.
- Often little or no external traffic.
- Frequency and predictability of static obstacles depend on application.
- Variable required tracking precision.
- High system complexity.
- High level of operator training.
- Usually some important reason for teleoperation exists; for example, the external environment is frequently hazardous.

Given these domain characteristics, many of the differences in the roles of humans and automation systems, as well as the failure modes and responses, become apparent; the response-window contrast in particular is sketched below. These will be dealt with in the following sections.
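The response-time-window contrast above can be made concrete with a minimal feasibility check: manual recovery is only an option when the hazard's response window exceeds the operator's total reaction time plus any control lag. The timing values below are illustrative placeholders, not measured figures.

```python
# Minimal sketch of the response-window contrast; all timing values
# are illustrative placeholders, not measured figures.

HUMAN_REACTION_S = 1.0  # assumed perception + decision + actuation time

def manual_recovery_feasible(window_s: float, control_lag_s: float = 0.0) -> bool:
    """Manual recovery is only an option when the response window
    exceeds the operator's reaction time plus any control lag."""
    return window_s > HUMAN_REACTION_S + control_lag_s

print(manual_recovery_feasible(30.0))                     # aircraft: large window -> True
print(manual_recovery_feasible(0.7))                      # highway: sub-reflex window -> False
print(manual_recovery_feasible(5.0, control_lag_s=4.5))   # teleoperator with lag -> False
```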
2. Human Roles

All three categories share similarities in the role of the human or humans in the supervisory capacity. For example, although the automation may provide decision support, in all cases the human is responsible for executive strategic planning, context-sensitive decision making, and recovery from system failure, although in the teleoperation case system recovery can have a much lower success rate depending on the domain. The cases differ, however, in the amount of automatic advisory support, the mechanisms for communicating instructions to the automation, the operator feedback modes, and the emergency protocols. The general distinctions in each case are given in the sections below.

2.1 Aircraft

Operations planning: The human is responsible for:

- High-level navigation and route selection (waypoints)
- Scheduling
- Trip-wise efficiency selection (speed/fuel)
- Communication with external control
- Selection of autoflight parameters

Depending on the type of aircraft and mission, the human is in general responsible for primarily higher-level cognitive tasks.

Programming: The human communicates instructions to the automation by:

- Entering alphanumeric data into navigation systems
- Operating controls reminiscent of mechanical inputs
- Selecting/toggling output modes

Communication with the automation is often in explicit form.

Feedback: The human acquires situational awareness from automated systems by:

- Monitoring dedicated output devices (e.g. warning lights)
- Comparing simulated proprioceptive inputs with conditioned memories (force feedback)
- Monitoring appropriate modes of multi-function displays
- Listening for audible alerts

The range and specificity of feedback from the automation is high.

Emergency Protocol: Human factors involved in emergency and failure response include:

- The human responds to warnings from the automation, often cross-checking them against warnings from other sources.
- The response window is relatively large, so the human is often able to choose between partially automated and purely manual recovery.

Human decision making in the event of failure again tends to be fairly high level.

2.2 Highway Vehicles

Operations planning: The human is responsible for:

- Both high- and low-level navigation and route selection
- Momentary speed selection, in general
- Momentary heading selection
- Momentary engine/transmission parameter selection
- Entertainment facility operation

The involvement of the human with the immediate motion of the vehicle is high.

Programming: The human communicates instructions to the automation primarily by operating control mechanisms to which passive sensors are attached. Explicit instructions are usually reserved for high-level options such as navigation aids, or for non-mission-critical systems such as entertainment devices.

Feedback: The human acquires situational awareness from automated systems by:

- Monitoring dedicated output devices and instruments
- Listening for audible alerts, which are typically reserved for reminders such as seatbelt or open-door indicators

Emergency Protocol: Failure of automated systems is not usually safety critical unless an emergency already exists prior to the failure. Automated safety systems augment human response, rather than the reverse. The reaction window is so small that there is typically no option to manually activate an emergency response system; similarly, automated safety systems can rarely be manually disengaged. A sketch of this trigger logic follows.
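To illustrate why highway emergency systems leave the operator out of the loop, the sketch below polls a decelerometer far faster than a human could react and deploys without confirmation. The sensor interface, threshold, and sampling rate are assumed for illustration only.

```python
import time

# Sketch of an airbag-style trigger; threshold, rate, and interfaces
# are hypothetical. The decision loop runs roughly a thousand times
# faster than human reaction, so no manual activation step is possible.

DECEL_THRESHOLD_G = 20.0   # assumed crash-level deceleration
SAMPLE_PERIOD_S = 0.001    # 1 kHz polling

def airbag_controller(read_decel_g, deploy):
    """Poll the decelerometer and deploy with no human in the loop."""
    while True:
        if read_decel_g() >= DECEL_THRESHOLD_G:
            deploy()   # irreversible, and cannot be manually disengaged
            return
        time.sleep(SAMPLE_PERIOD_S)
```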
2.3 Teleoperators

Operations planning: Depending on the application, the human can be responsible for:

- High-level planning
- All forms of immediate control
- Selection of supervised autonomy modes

The range of roles the human can take is broadest in this category.

Programming: The human can submit instructions to the automation by:

- Master-slave replicated body movements
- Translated physical inputs
- Abstract parameter selection (e.g. knobs, switches)
- High-level interpreted commands

The range of inputs is broad and depends on application factors.

Feedback: The human acquires situational awareness from:

- Direct output from appropriate sensors (vision, audio)
- Abstract exocentric environmental representations (e.g. electronic maps)
- Visual, audible, and proprioceptive egocentric feedback representations
- Dedicated system status instruments

Due to sensor fidelity and lag, the human must be trained to interpret feedback that may be sparse, or that may differ from the true situation in a variety of respects.

Emergency Protocol: All emergency and system-failure responses must be handled through the automated system; pure manual control is by definition not possible. Emergencies are often handled using higher-level decisions.

3. Computer Roles

The roles that automation takes can exhibit more marked differences in the three cases than the role of the human; after all, the human is always performing some sort of supervisory function. The characteristic automation roles are given in the following sections.

3.1 Aircraft

Interpreting Instructions:

- The automation is able to modify many control inputs.
- The mapping of inputs to high-level functions (e.g. navigation, ascent/descent rate) is often 1:1 and thus highly specific.
- The level of instruction required depends on flight mode.

Providing Feedback:

- Almost all conceivable system state variables are represented.
- State information is available in both specific (numerical) and overview (graphical) form; it is also sometimes redundant (HUD plus panel display).
- Critical alerts are often redundant (visual, audible, force feedback).
- Information reported depends on flight mode.
- Relatively high tolerance for false alarms, low tolerance for misses.

Performing Control:

- Automated direct control (heading, thrust) is common.
- Control limits may be strictly enforced (e.g. fly-by-wire).
- Preallocated control modes may be entered automatically in response to operator action or detected events (see the sketch after this list).
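As a concrete reading of the last bullet, the sketch below shows an autoflight-style mode machine whose transitions fire either on a detected event (altitude capture) or on an operator action (TOGA press). The mode names, capture tolerance, and transition rules are simplified illustrations, not those of any real autopilot.

```python
from enum import Enum, auto

# Simplified autoflight mode machine; modes and rules are illustrative,
# not those of any real autopilot.

class Mode(Enum):
    ALT_CAPTURE = auto()   # climbing/descending toward a target altitude
    ALT_HOLD = auto()      # holding the captured altitude
    GO_AROUND = auto()     # entered automatically on pilot TOGA action

def next_mode(mode: Mode, altitude_ft: float, target_ft: float,
              toga_pressed: bool) -> Mode:
    """Transitions fire on operator action or detected events."""
    if toga_pressed:                                  # operator action
        return Mode.GO_AROUND
    if mode is Mode.ALT_CAPTURE and abs(altitude_ft - target_ft) < 50:
        return Mode.ALT_HOLD                          # detected event
    return mode

# e.g. next_mode(Mode.ALT_CAPTURE, 9980, 10000, False) -> Mode.ALT_HOLD
```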
3.2 Highway Vehicles

Interpreting Instructions:

- Few, if any, distinct operating modes.
- Instructions are mostly pre-programmed responses to sensor inputs; high-level instructions are reserved for decision aids (navigation).
- Control inputs are rarely modified before actuation (with some exceptions, e.g. ABS).

Providing Feedback:

- A small selection of system state variables is represented.
- State information is generally available in only one form.
- Automated feedback is almost entirely visual, from dedicated instruments.
- The same information types are always reported.
- Feedback must be available "at a glance" to match the smaller reaction window.
- Advisories are therefore generally reserved for advance warnings (fuel level, etc.) rather than hazard alerts.
- Low tolerance for false alarms, higher tolerance for misses.

Performing Control:

- Automated heading control is impractical due to context sensitivity and the reaction window.
- Automated speed control is only practical under certain conditions (cruise control) with an extended reaction window.
- Automated engine and traction control are becoming common, as the required decisions fall below the human reaction threshold.
- Automated support of some emergency responses is common (ABS, air bags).

3.3 Teleoperators

Interpreting Instructions:

- All control inputs are subject to automated interpretation.
- Complex teleoperated devices usually have a redundant command space (more than one way to achieve a task), especially as lag increases.
- Often several levels of selectable autonomy or control modes.

Providing Feedback:

- As many system state variables should be represented as possible.
- Link bandwidth and sensor fidelity may determine the feedback type.
- Redundant feedback is often preferable (e.g. combined egocentric/exocentric status indicators).
- Feedback should match human experience (HMD, force feedback, stereo).
- Prediction may be necessary due to lag.
- False alarms are less undesirable than misses.

Performing Control:

- Some level of automated control is necessary due to communication lag.
- Automated reaction ability is desirable if the typical hazard reaction window is exceeded by the lag.
- The ability to override erroneous/outdated control input is sometimes desired.
- Autonomous modes use preprogrammed logic to interpret high-level commands.

4. Failures

The types of automation failures, the appropriate responses, and the criticality of such failures (defined by the risk to human life, as opposed to the risk of losing the machine) differ in the three categories. The characteristic failure categories and procedures are listed below.

4.1 Aircraft

Failure Modes:

- Low-level actuation control malfunction (e.g. hydraulics, fly-by-wire)
- High-level control malfunction (e.g. autopilot)
- Instrument malfunction
- Operator input error (e.g. programming error, mode selection mismatch)
- External navigation/coordination system shutdown
- False hazard alarm
- Missed warning (either by the automatic system or the operator)

Responses:

- Educated decision on the part of the operator (e.g. emergency landing)
- Pilots trained to operate the craft in the event of control failure
- Visual flight in the case of instrument failure
- Conferral with ground-based support
- Cross-check of settings
- Redundant processing ability
- Redundant alerts
- Automated accident avoidance systems (e.g. go-around)

4.2 Highway Vehicles

Failure Modes:

- Mechanical assist actuator failure (e.g. power steering/brakes)
- Automated system/sensor failure (e.g. traction control)
- Instrument malfunction, including operator error due to erroneous readings
- Navigation system malfunction/error

Responses:

- Coasting to a halt is usually possible
- Vehicle status monitoring via non-automated means (e.g. engine sound)
- Some automatic systems are not critical to continued vehicle operation; travel can be continued at reduced performance
- Navigation by external means (signage)
- Manual shutdown of a misbehaving device (e.g. stereo)

4.3 Teleoperators

Failure Modes:

- Communications link failure
- Control/feedback errors due to noise
- Sensor/actuator failure
- Operator error (mode selection mismatch, feedback misinterpretation)
- Deficiencies in autonomous programming

Responses:

- Educated decision by the human operator
- Reset of the communications link
- Automatic activation of a "holding" mode in case of link failure (see the sketch below)
- Automated survival programming/overrides
- Redundant sensors/actuators
- Redundant processing ability
- Redundant alerts
- Informative command confirmation requests
- Reversion to different autonomy levels
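The link-failure "holding" mode response above can be sketched as a watchdog on the remote side: if no command arrives within a timeout, the vehicle reverts to a safe autonomous mode until the link returns. The timeout value and the holding-mode action are assumptions for illustration.

```python
import time

LINK_TIMEOUT_S = 2.0  # assumed; real values depend on the link and vehicle

class LinkWatchdog:
    """Reverts a remote vehicle to a holding mode when commands stop arriving."""

    def __init__(self, execute_command, enter_holding_mode):
        self.execute_command = execute_command        # normal teleoperated control
        self.enter_holding_mode = enter_holding_mode  # hypothetical safe-mode entry
        self.last_rx = time.monotonic()
        self.holding = False

    def on_command(self, command):
        # A fresh command proves the link is alive; resume normal control.
        self.last_rx = time.monotonic()
        self.holding = False
        self.execute_command(command)

    def tick(self):
        # Called periodically by the remote control loop; no operator in the loop.
        if not self.holding and time.monotonic() - self.last_rx > LINK_TIMEOUT_S:
            self.holding = True
            self.enter_holding_mode()

# Usage sketch: wd = LinkWatchdog(print, lambda: print("holding")); wd.tick()
```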