
FULL TITLE

Cognitive conflicts

KEYWORDS

Mental model; Flightdeck systems; Human error

SUMMARY

Many computer-based systems trigger actions under conditions whose complexity sometimes exceeds human cognitive capabilities. The net effect is that operators sometimes find themselves out of the loop in problematic situations: they have difficulty understanding (and predicting) the actions of the system. When the behaviour of a system is misrepresented in the operator’s mental model, the objectives prescribed by the task may not be achievable, even though there is no technical failure.

Example of conflict: the Cali crash.

In December 1995, the crew of a Boeing 757 did not notice that they had selected an incorrect beacon (Romeo; see figure) as the next waypoint for their approach. It took the crew over a minute to notice that the aircraft was veering off on the wrong heading. Turning back put the aircraft on a fatal course, and it crashed into a mountain near Buga, 10 miles east of the descent track.

(Partial, amended chart of the approach to runway 19 (southbound) at Cali. © Reproduced with permission of Jeppesen Sanderson, Inc.)

In this case, the cognitive conflict was the airplane veering off the expected track. The crew did not immediately detect their mistake because workload was extremely high while the flight path was being programmed, and because two beacons shared the same identifier in the computer database.
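
How a duplicate identifier can silently select the wrong beacon can be shown with a short sketch. The Python fragment below is a deliberately simplified illustration, not the actual flight management system logic: the first-match lookup rule and the coordinates are assumptions made for the example (Rozo, the beacon the crew intended, is shown alongside Romeo).

from dataclasses import dataclass

@dataclass
class Beacon:
    identifier: str  # short code entered by the crew
    name: str        # full beacon name
    lat: float       # latitude, degrees (illustrative values)
    lon: float       # longitude, degrees (illustrative values)

# Hypothetical navigation database in which two beacons share the
# identifier "R" (the coordinates are made up for the example).
DATABASE = [
    Beacon("R", "Romeo", 4.7, -74.1),
    Beacon("R", "Rozo",  3.6, -76.4),
]

def lookup(identifier: str) -> Beacon:
    # Naive first-match lookup: silently returns whichever entry with
    # this identifier happens to appear first in the database.
    for beacon in DATABASE:
        if beacon.identifier == identifier:
            return beacon
    raise KeyError(identifier)

# The crew intends Rozo, but the first match is Romeo: the system
# accepts the input and steers towards a waypoint the crew did not mean.
print(lookup("R").name)  # -> Romeo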

The situations in which cognitive conflicts occur share several common features (a toy sketch of how such a divergence might surface follows the list):
• a complex dynamic system;
• an undetected technical problem or human error;
• poor predictability of the system’s behaviour;
• failure to reject initial plans in a timely manner.
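
The fragment below makes the divergence between expected and actual behaviour explicit, in the spirit of the Cali example where the deviation went unnoticed for over a minute. It is a toy illustration, not avionics practice: the headings, sampling and 10-degree threshold are arbitrary assumptions.

def heading_error(expected: float, actual: float) -> float:
    # Smallest signed difference between two headings, in degrees.
    return (actual - expected + 180.0) % 360.0 - 180.0

def detect_conflict(expected_headings, actual_headings, threshold=10.0):
    # Return the first sample index at which the actual heading diverges
    # from the expected track by more than `threshold` degrees, or None.
    for i, (exp, act) in enumerate(zip(expected_headings, actual_headings)):
        if abs(heading_error(exp, act)) > threshold:
            return i
    return None

# Expected: hold a steady southbound track. Actual: the aircraft
# gradually turns east after the wrong waypoint is selected.
expected = [190.0] * 8
actual = [190.0, 190.0, 185.0, 170.0, 150.0, 130.0, 110.0, 100.0]
print(detect_conflict(expected, actual))  # -> 3 (first clear divergence)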

The inherent complexity of current computer-based systems (e.g. aircraft cockpits) does not always allow operators to anticipate the system’s future behaviour. Such anticipation is often critical to the operation of dynamic systems because of time constraints. Two potential solutions for enhancing human interaction with complex computer-based systems are being explored:
• Glass-cockpit assistants. Assistance must be timely. The assistants need to correctly capture the operators’ intentions, be integrated with the existing task environment, provide trouble-shooting support and allow for the evolution of flightdeck systems (one such check is sketched after this list).
• Transparent flightdecks. Systems must behave predictably and give the crew a direct understanding of their inner structure, with computers used in an advisory capacity and the crew retaining overall responsibility.
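
As a sketch of the kind of timely, advisory check a glass-cockpit assistant might perform, the fragment below flags a newly programmed waypoint that would require a sharp departure from the current track, leaving the decision with the crew. The geometry, coordinates and 45-degree threshold are simplifying assumptions for illustration.

import math

def bearing_to(lat1, lon1, lat2, lon2):
    # Approximate initial bearing (degrees) from point 1 to point 2.
    d_lon = math.radians(lon2 - lon1)
    lat1, lat2 = math.radians(lat1), math.radians(lat2)
    y = math.sin(d_lon) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon))
    return math.degrees(math.atan2(y, x)) % 360.0

def advise_on_waypoint(current_heading, lat, lon, wp_lat, wp_lon,
                       max_turn=45.0):
    # Return an advisory string if flying to the waypoint would require
    # turning more than `max_turn` degrees; the crew remains in charge.
    required = bearing_to(lat, lon, wp_lat, wp_lon)
    turn = abs((required - current_heading + 180.0) % 360.0 - 180.0)
    if turn > max_turn:
        return (f"ADVISORY: waypoint requires a {turn:.0f} degree turn "
                f"- confirm this is the intended fix.")
    return None

# Southbound near Cali (illustrative coordinates): a waypoint far to the
# north-east triggers the advisory instead of a silent course change.
print(advise_on_waypoint(190.0, 3.9, -76.3, 4.7, -74.1))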

Papers

D. Besnard and G. Baxter. Cognitive conflicts in dynamic systems. In "Structure for Dependability: Computer-Based Systems from an Interdisciplinary Perspective", D. Besnard, C. Gacek and C.B. Jones (Eds.), Springer, 2006, ISBN 1-84628-110-5.

Authors

Denis Besnard & Gordon Baxter

 

 