
Methods: Accessing expert cognition

Julie Gore and Claire McAndrew explore advances in cognitive task analysis.

16 March 2009

Essentially, cognitive task analysis (CTA) attempts to identify how individual experts perform a cognitive task. Such qualitative methods have received much research attention in cognitive psychology, particularly within the decision sciences and the field of cognitive ergonomics (see Crandall et al., 2006). They assume that language reflects thought – as Ormerod and Ball (2007) note, otherwise there is no 'cognition' to study – and also that cognition is required for the successful completion of an explicit task (Hoffman & Militello, 2008).

There is no single, well-accepted definition of CTA. This is partly due to the fact that the knowledge elicitation techniques that lie at the heart of these methods vary greatly. The most frequently used include structured and semi-structured one-to-one interviews, group interviews, real-time or retrospective 'think-aloud' protocols, analyses of previous incidents and observations of task performance. Each of these methods has had some success using realistic problem-solving and decision-making tasks, many experts and many tasks, or many different scenarios revolving around the same task (Klein & Militello, 2004).

However, CTA methods have been criticised as being incredibly time-consuming and difficult to use, sometimes resulting in problematic data analysis (see Hoffman & Woods, 2000; McAndrew & Gore, 2007). In this article we will review recent developments, providing an overview of one promising technique that begins to provide practical solutions to eliciting aspects of expert cognition.

The ACTA technique

Like CTA, the applied cognitive task analysis (ACTA) technique is intended to assist in the identification of the key cognitive elements required to perform a task proficiently, albeit in a way useful to practitioners (Militello & Hutton, 1998). Empirical work has successfully used ACTA to understand expertise in a diverse range of areas including weather forecasting (Hoffman et al., 2006), clinical nursing (Militello & Lim, 1995), recruitment (Gore & Riley, 2004), financial markets (McAndrew & Gore, 2007), and military command and control operations (Drury & Darling, 2008). The refinement of ACTA for these purposes by Beth Crandall, Gary Klein, Robert Hoffman and Laura Militello represents a significant development in the tools and techniques available for identifying training needs in knowledge-based work. These developments have also provided instructional designers with clearer guidelines for designing training for cognitively demanding tasks in domain-specific areas.

The cognitive requirements CTA and ACTA seek to address are:
- difficult judgments and decisions;
- attentional demands;
- critical cues and patterns; and
- problem-solving strategies/other related topics.

What is unique about ACTA is that it employs a variety of knowledge-elicitation and -representation techniques that systematically build on one another, providing task-specific, high-quality knowledge. ACTA's knowledge elicitation techniques involve interviews (and sometimes observation), whilst the knowledge-representation techniques provide a structured means of organising and comparing cognitive information (cognitive mapping). These techniques were developed to complement each other, each tapping into different aspects of cognition.

We illustrate the methodological process with extracts taken from a study by one of us (McAndrew, 2008), which applied ACTA to the field of behavioural finance in order to understand the cognitive challenges day traders face. The insights elicited by ACTA enabled this study to document the social psychological aspects of financial markets and re-conceptualise day traders' skill as an interactional expertise (i.e. one that occurs between cognition, social environment and technology).

The first step in the process, the production of the task diagram, provides the interviewer with a broad overview of the task. This interview helps identify areas requiring complex cognitive skills, one of which is explored in greater detail during steps two and three. Participants are encouraged to break the complex task into three to six parts and then to identify which aspect of the task is the most cognitively challenging. For example, in McAndrew's study of day traders, foreign exchange transactions were typically chunked into four parts: deciding which currency pairs should be considered as the basis of the trade; judging which currency pair will perform; technical analysis of the size of the trade and the level at which to enter the market; and setting exit levels. Participants sometimes want to provide too much detail, but all that is required at this point is a 'big picture' overview; producing a simple diagram with the participant makes it easy to check understanding.
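Purely by way of illustration (the structure and names below are ours, not part of the ACTA materials), the output of this first step can be captured as a simple record: the overall task in the expert's own words, three to six subtasks, and the subtask singled out as most cognitively demanding.

```python
# A minimal sketch of a step-one task diagram, using the foreign exchange
# example above. The class and field names are illustrative, not part of ACTA.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TaskDiagram:
    task: str                                           # overall task, in the expert's own words
    subtasks: List[str] = field(default_factory=list)   # three to six broad steps
    most_demanding: Optional[str] = None                # subtask selected for steps two and three

diagram = TaskDiagram(
    task="Foreign exchange transaction",
    subtasks=[
        "Decide which currency pairs to consider as the basis of the trade",
        "Judge which currency pair will perform",
        "Technical analysis of trade size and market entry level",
        "Set exit levels",
    ],
    most_demanding="Technical analysis of trade size and market entry level",
)
```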

The second step, the knowledge audit, reviews the aspects of expertise required for the effective execution of the cognitive subtask selected in step one. The audit is theoretically grounded in the research literature on expert-novice differences (see Hoffman, 1992; Klein & Hoffman, 1993) and critical decision method studies (Militello & Lim, 1995). The knowledge audit has been developed with the joint aims of capturing key aspects of expertise and of improving and 'streamlining' data collection and analysis. As aspects of expertise are elicited, each is followed up with a series of generic probes to draw out detail and identify concrete examples associated with the task. This technique also encourages the interviewee to identify why elements of the task may present a problem to inexperienced individuals.

Returning to the trading example, the technical analysis subtask from step one was selected to be probed in more detail (see Militello & Hutton, 1998, for sample questions). For instance, by asking the question 'Is there a time when you walked into the middle of a situation and knew exactly how things got there and where they were headed?', the study identified:

- Aspects of expertise: The example provided by the day trader focused upon a euro–dollar currency pair trend, where the market was performing in line with expectations.
- Cues and strategies: As an expert, the day trader assesses the situation by looking for key cues and strategies based upon their expertise. In this example, the trader stresses the importance of technical analysis in monitoring the highs and lows of daily trends.
- Difficulties: The day trader suggests that this particular element of the task would be difficult for inexperienced traders, as novices may lack the conceptual understanding of trending and may have insufficient knowledge of technical analysis methods.

The knowledge audit can take up to two hours to complete and often elicits previously undocumented aspects of successful task completion.
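As a rough sketch of what a single knowledge-audit entry might look like once recorded (the field names are ours; the content paraphrases the euro–dollar example above):

```python
# A minimal sketch of one knowledge-audit entry. Field names are illustrative;
# the content paraphrases the day-trading example above.
audit_entry = {
    "probe": ("Is there a time when you walked into the middle of a situation "
              "and knew exactly how things got there and where they were headed?"),
    "aspect_of_expertise": ("Recognising a euro-dollar trend performing "
                            "in line with expectations"),
    "cues_and_strategies": ("Technical analysis; monitoring the highs and lows "
                            "of daily trends"),
    "why_difficult_for_novices": ("Limited conceptual understanding of trending; "
                                  "insufficient knowledge of technical analysis methods"),
}

for name, value in audit_entry.items():
    print(f"{name}: {value}")
```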

The third step, the simulation interview or scenario, obtains information on the contextualisation of the job or task that is not easy to acquire with the preceding steps. It allows the interviewer to explore and probe issues such as situation assessment, potential errors and biases, and how a novice would be likely to respond to the same situation. This stage of the process can again be adapted to the context of the task and its environment, taking the form of a paper-and-pencil task or a computer simulation. Participants' responses are recorded in a similar manner to the knowledge audit.
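One simple way to picture the recording of step three, offered only as a sketch under our own assumptions about format, is to put the same set of probes to the expert at each point in the unfolding scenario:

```python
# A minimal sketch of logging simulation-interview responses per scenario event.
# The probes and structure are illustrative, not a prescribed ACTA format.
PROBES = [
    "As the scenario unfolds, what is your assessment of the situation?",
    "What errors or biases could creep in at this point?",
    "How would a novice be likely to respond here?",
]

def record_simulation_interview(scenario_events):
    """Collect responses to each probe for every event in the scenario."""
    log = []
    for event in scenario_events:
        responses = {probe: input(f"{event}\n{probe}\n> ") for probe in PROBES}
        log.append({"event": event, "responses": responses})
    return log
```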

The final step, the production of a cognitive demands table, is a means of merging and synthesising the data. The cognitive demands table is the deliverable of ACTA intended for practitioner use, allowing a focus on the specific outcomes of the analysis that are pertinent to problem solving and decision making in areas such as training and job design. Again, the day-trading study illustrates how ACTA might usefully identify the areas of foreign exchange transactions that require complex cognitive skills, for example:
- Difficult cognitive element: Bucking the market – identifying emerging trends, e.g. bull markets and rallies.
- Why difficult? Distinguishing market rumour from real trend; dips do not always indicate rallies; difficult to pick exit levels due to uncertainty.
- Common errors: Unlikely to factor in the psychological element; confusing a rally with a reaction high; interfering with a position once placed.
- Cues and strategies used: Technical analysis (channel of higher highs/lows, daily highs/lows); Fibonacci retracements do not work on reaction highs.
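A sketch of how such a table might be held and exported for practitioner use follows; the field names are ours and the content simply paraphrases the list above.

```python
# A minimal sketch of a cognitive demands table and a plain CSV export for
# practitioners. Field names are illustrative; content paraphrases the example.
import csv

demands_table = [
    {
        "difficult_cognitive_element": ("Bucking the market: identifying emerging "
                                        "trends, e.g. bull markets and rallies"),
        "why_difficult": ("Distinguishing market rumour from real trend; dips do not "
                          "always indicate rallies; exit levels hard to pick under uncertainty"),
        "common_errors": ("Failing to factor in the psychological element; confusing a "
                          "rally with a reaction high; interfering with a position once placed"),
        "cues_and_strategies": ("Technical analysis (channel of higher highs/lows, daily "
                                "highs/lows); Fibonacci retracements do not work on reaction highs"),
    },
]

with open("cognitive_demands.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(demands_table[0].keys()))
    writer.writeheader()
    writer.writerows(demands_table)
```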

ACTA and training needs

Adopting a training perspective, Salas and Cannon-Bowers (2001) argue that cognitive task analysis is a potentially useful tool. Specifically they draw attention to the need for a theoretically driven methodology that clearly outlines the steps of data analysis. They also note that whilst much of the rhetoric associated with the theory and practice of training argues that training needs analysis (including methods such as task analysis) is the most important phase, it remains largely 'more art than science'. Determining the training needs of individuals and organisations, they argue, requires the development of a more systematic and conceptually rigorous methodology.

This need is especially evident with regard to knowledge work and the development of knowledge assets and human capital. Our hope is that readers will find that the ACTA techniques can assist in developing models of the problem space that practitioners face, and in highlighting how practitioners achieve expertise. Alongside Hoffman and Militello (2008), we hope that by breaking down barriers in how we think about studying cognitive work in context, this article will encourage those who may have dismissed CTA techniques as difficult to use and time-consuming to try out the ACTA techniques, share their experiences and document their qualitative analyses.

Julie Gore is a Chartered Psychologist and Senior Lecturer in Organisational Behaviour at the University of Surrey. [email protected]

Claire McAndrew is a postdoctoral researcher in the Research Unit for Information Environments, University of the Arts London. [email protected]

References

Crandall, B., Klein, G. & Hoffman, R.R. (2006). Working minds: A practitioner's guide to cognitive task analysis. Cambridge, MA: MIT Press.
Drury, J.L. & Darling, E. (2008). A 'thin-slicing' approach to understanding cognitive challenges in real-time command and control. Journal of Battlefield Technology, 11(1), 9-16.
Gore, J. & Riley, M. (2004). Recruitment and selection in hotels. In H. Montgomery, R. Lipshitz & B. Brehmer (Eds.) How professionals make decisions (pp.343–350). Mahwah, NJ: Lawrence Erlbaum.
Hoffman, R.R. (1992). The psychology of expertise: Cognitive research and empirical AI. Mahwah, NJ: Lawrence Erlbaum.
Hoffman, R.R. & Militello, L.G. (2008). Perspectives on cognitive task analysis. Hove: Psychology Press.
Hoffman, R.R., Trafton, G. & Roebber, P. (2006). Minding the weather. Cambridge, MA: MIT Press.
Hoffman, R.R. & Woods, D.D. (2000). Studying cognitive systems in context. Human Factors, 42(1), 1–7.
Klein, G. & Hoffman, R.R. (1993). Seeing the invisible: Perceptual-cognitive aspects of expertise. IEEE Transactions on Systems, Man and Cybernetics, 19(3), 462–472.
Klein, G. & Militello, L. (2004). The knowledge audit as a method for cognitive task analysis. In H. Montgomery, R. Lipshitz & B. Brehmer (Eds.) How professionals make decisions (pp.335–342). Mahwah, NJ: Lawrence Erlbaum.
McAndrew, C. (2008). Cross-fertilising methods in naturalistic-decision making and managerial cognition. Unpublished PhD thesis, University of Surrey.
McAndrew, C. & Gore, J. (2007). "Convince me…": An inter-disciplinary study of NDM and investment managers. Proceedings of the Eighth Conference on Naturalistic Decision Making. Pacific Grove, CA.
Militello, L.G. & Hutton, R.J.B. (1998). Applied cognitive task analysis (ACTA). Ergonomics, 41, 1618–1641.
Militello, L.G. & Lim, L. (1995). Early assessment of NEC in premature infants. Journal of Perinatal and Neonatal Nursing, 9, 1–11.
Ormerod, T.C. & Ball, L.J. (2007). Qualitative methods in cognitive psychology. In C. Willig & W. Stainton Rogers (Eds.) Handbook of qualitative research in psychology. London: Sage.
Salas, E. & Cannon-Bowers, J.A. (2001). The science of training. Annual Review of Psychology, 52, 471–499.