Network for innovative training on rotorcraft safety.

News

Ironies of Automation by Daniel Friesen (ESR#11)

Cockpit of the AgustaWestland 139 helicopter, by user Jet Request [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)], from Wikimedia Commons.

Piloting a helicopter is not an easy task. To support the pilot, automated systems like stability augmentation systems, autopilot systems or flight management systems have been developed and used in the aviation domain. While these systems can be a great asset, many issues and potential pitfalls must be considered during their development and design process. This blog post briefly describes some of the more widely known issues with these systems, called “Ironies of Automation”.

Ironies of Automation
The so-called “Ironies of Automation”, as introduced by Lisanne Bainbridge in 1983, describe pitfalls of introducing automation into human control tasks. Originating in the industrial process control domain, the discussed ironies are also applicable to the generalized case of a (complex) technical system being controlled by a human operator. Bainbridge’s paper details a number of problems which arise by reallocating control responsibilities from the human operator to automated systems.
This post highlights some of the ironies of automation that must be considered when developing and evaluating helicopter automated systems. It is based on Bainbridge’s original paper [1]; reviews of the current state of these problems are available, for example, in [2] and [3]. Beyond these ironies, further issues must be considered while designing or evaluating automated systems that are outside the scope of this short blog post – interface design principles, the choice of metrics for system evaluation, and additional automation issues such as opaqueness and brittleness.

1. Tasks after automation
Often, automated systems are designed for a certain task and specific conditions. While the automation controls the system directly, the human’s task is to monitor the actions of the automation. In case of unanticipated automation or system behaviour, or unanticipated external conditions, the human operator is expected to take over from the automation and control the system manually, or with reduced automation capabilities. The human operator’s task is no longer predominantly a manual control task, but a supervisory control task. This kind of responsibility allocation leads to several problems.

1.1 Manual control skills
“Physical skills deteriorate if they are not used“ [1]. If the operator must take over after unusual system behaviour, he might no longer have much experience in manually handling the system. This, in turn, can increase the time delay and control gain of the operator’s response in manual control tasks, and lead to longer decision times or “worse” responses. Complicating things further, situations that require human intervention will most likely be abnormal situations which the automation could no longer handle, requiring not only nominal system control capabilities, but the ability to control the system while it shows abnormal behaviour.

1.2 Cognitive skills
“(…) An operator will only be able to generate successful new strategies for unusual situations if he has an adequate knowledge of the process” [1]. Long-term knowledge retrieval depends on frequent use. As the operator is only expected to take over control in rarely occurring unusual situations, he might lack sufficient opportunities to employ his knowledge, hindering his ability to retrieve that knowledge efficiently. In addition, an operator with mainly supervisory responsibilities will seldom have opportunities to build up a knowledge base about the system behaviour. Therefore, the operator’s ability to build up system knowledge, as well as his ability to employ it, might be impaired by introducing automation.
In addition to long-term knowledge storage, the operator of a complex system keeps short-term system characteristics and system states in working memory. This knowledge is used to make predictions about future control actions and their effects. “This information takes time to build up. The implication of this (…) is that the operator who has to do something quickly can only do so on the basis of minimum information” [1]. After taking over from the automation, the operator will need some time to acquire knowledge of the system state.

1.3 Monitoring
Humans are not built for the task of monitoring a system. “Vigilance” studies show “(…) that it is impossible for even a highly motivated human being to maintain effective visual attention towards a source of information on which very little happens, for more than about half an hour” [1]. Trying to solve this problem by introducing automated alarms only shifts the problem one level higher (“who monitors the alarm system?”). When automated systems run long enough without incident, this might also induce “automation complacency”: the tendency to trust the automated systems even in situations where human intervention is necessary.
In addition, in cases where the automated system outperforms the human operator in nominal conditions, it can be hard for the human operator to distinguish between appropriate system behaviour and abnormal states. It might be hard for him to understand the system’s actions: in nominal conditions, it performs the task at hand “better”, possibly taking more input data or system dynamics into consideration than the human operator can.

2. Operator attitudes
Introducing automation will affect the skills and tasks of the human operator, but it might also impact his satisfaction with the task at hand and his health. It can be hard for operators to build up and maintain their manual control skills if these are no longer used. This can lead to situations as described in [4], where fast process dynamics, a high frequency of actions and inadequate skills or options to control the system correspond to high stress levels, high workload levels and poor operator health.

3. Conclusion
Several “ironies of automation” must be considered while designing automated systems. Inducing automation complacency, vigilance problems or a degradation of piloting skills should be avoided. The existence of these issues warrants the analysis of human-automation interaction in the helicopter domain, to enable the development of resilient and effective automation systems that reliably support the pilot, ideally also in unanticipated situations and emergencies.

Literature

[1] L. Bainbridge, “Ironies of automation,” Automatica, vol. 19, no. 6, pp. 775–779, Nov. 1983.
[2] G. Baxter, J. Rooksby, Y. Wang, and A. Khajeh-Hosseini, “The ironies of automation: still going strong at 30?,” in Proceedings of the 30th European Conference on Cognitive Ergonomics – ECCE ’12, 2012, p. 65.
[3] B. Strauch, “Ironies of Automation: Still Unresolved After All These Years,” IEEE Trans. Human-Machine Syst., pp. 1–15, 2017.
[4] C. L. Ekkers, C. K. Pasmooij, A. A. F. Brouwers, and A. J. Janusch, “Human control tasks: A comparative study in different man–machine systems,” in Case Studies in Automation Related to Humanization of Work, Elsevier, 1979, pp. 23–29.

Opening School

The NITROS Opening School was held on the 25th and 26th of January.

The morning of the 25th was dedicated to introducing the network to the subject of the Project Working Group (PWG), presented by Matteo Ragazzi, Head of Airworthiness at Leonardo Helicopter Division. The subject of the PWG is the assessment of the feasibility of an Extended Range Helicopter Operational Standard (EHOPS) for offshore flights.
EHOPS is an interesting proposal on how to manage risk properly, and thereby increase safety, launched by Leonardo Helicopters during the 11th Rotorcraft Symposium in 2017.
The idea is to develop design and operation rules for helicopters in proportion to the specific risk faced. Safety improvement should not be linked only to the airworthiness of the design; it should also be linked to operational risk. Risk, in fact, is the combination of the predicted severity – i.e. criticality – and likelihood – i.e. probability – of the potential effect of a hazard. Risk is tightly related to the operation and should be considered a function of many parameters of the environment where the operation takes place, such as populated, congested, hostile or mountainous areas. This means that the higher the risk of the specific operation to be performed, the more stringent the design requirements should be.
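The severity–likelihood combination described above can be sketched as a simple risk matrix. The following Python snippet is only an illustrative toy example: the category names, scores and thresholds are hypothetical placeholders, not part of the EHOPS proposal or any certification standard.

```python
# Toy risk matrix: risk is a combination of severity and likelihood.
# All categories and thresholds below are illustrative assumptions.

SEVERITY = {"minor": 1, "major": 2, "hazardous": 3, "catastrophic": 4}
LIKELIHOOD = {"extremely improbable": 1, "remote": 2, "probable": 3, "frequent": 4}

def risk_level(severity: str, likelihood: str) -> str:
    """Combine severity and likelihood scores into a coarse risk class."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 9:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# A probable hazardous effect scores higher than a remote minor one,
# so it would attract more stringent design requirements.
print(risk_level("hazardous", "probable"))  # high
print(risk_level("minor", "remote"))        # low
```

In this spirit, an operation in a hostile offshore environment would sit higher in the matrix than the same operation in a benign environment, and the design requirements would scale accordingly.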

A great example of this approach is ETOPS (Extended-range Twin-engine Operational Performance Standards), introduced in 1985 to give twin-engined aeroplanes an overall level of operational safety consistent with that of the three- and four-engined aeroplanes which, at that time, were the only ones allowed to fly transoceanic routes, and to which no restrictions were applied.

To increase the awareness of safety issues among the researchers participating in the NITROS project, it was decided to perform this assessment of the feasibility of EHOPS for offshore operations as a team effort.

In the following parts of the meeting, all Early Stage Researchers briefly introduced their research activities to the network.

The Opening School was closed with the annual meeting of the Scientific Supervisory Board (SSB) of the network.

Off we go, finally!

After a lengthy selection process, the twelve researchers who will take part in the NITROS research and training programme have joined us. Recruitment procedures have been completed and the researchers are beginning to plan their activities in the four participating universities: Politecnico di Milano, TU Delft, University of Glasgow and University of Liverpool. The researchers were selected from about 120 very good candidates from all over the globe. The selected group represents a great combination of very heterogeneous experiences that will certainly lead to innovative results, increasing the safety of helicopter flight over the years to come. You can check their biographies in the Fellows section at www.nitros-ejd.org. After much organizational work, we can finally start with the scientific and research activity.

In this section of the site we will keep you updated on the progress and results of the project, so I invite all interested parties to check the posts and comment on them, if you like. I wish good work to all those who participate and collaborate with the NITROS project.