Wednesday 4 July 2012

Cognitive Anti-Patterns 1

Something went wrong with the previous post in Blogger, so a slightly revised version is published here.
Anti-patterns are discussed here and here. Jim Coplien states: "an anti-pattern is something that looks like a good idea, but which backfires badly when applied". SEI has published a document (pdf) on system archetypes - using archetypes to beat the odds. These archetypes (which have origins in ITIL) are very similar to anti-patterns. They are also nicely set out here.

The early recognition and countering of anti-patterns is an extremely valuable skill that is rarely taught, and is probably not very hard to acquire. I suspect it is not often taught because it treats what might pretentiously be called knowledge work as a craft or skill. On the contrary, this sort of diagnostic skill is at the heart of expertise. It all appears very negative, unfortunately, but this is the case with all risk management. Are there opportunities that complement these risks? Possibly, but they are not the subject of this post. The list looks like it has a real down on automation. This in no way puts it near Marcuse's 'One Dimensional Man' or the Unabomber Manifesto; it is just a reflection of the prevalence of technology-push in our current society.

This post does not (yet) have well-formulated anti-patterns, just some starting points and first drafts.

Starting points

Gary Klein offered three great 'unintelligent system anti-patterns' in this document (pdf).
  • The Man behind the Curtain (from the Wizard of Oz). Information technology usually doesn’t let people see how it reasons; it’s not understandable. The alternative is to design a 'human window' (Donald Michie); a small sketch contrasting an opaque aid with one that exposes its reasoning follows this list.
  • Hide-and-Seek. On the belief that decision aids must transform data into information and information into knowledge, data are actually hidden from the decision maker. The negative consequence of this anti-pattern is that decision makers can’t use their expertise.
  • The Mind Is a Muscle. In the attempt to acknowledge human factors in the procurement process, some guidelines end up actually working against human-centering considerations: “Design efforts shall minimize or eliminate system characteristics that require excessive cognitive, physical, or sensory skills.”
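
As a hypothetical illustration of the first two anti-patterns, here is a minimal Python sketch. The sensor names, weights and threshold are invented for the example; the point is only the contrast between an aid that hands back a bare answer and one that provides a 'human window' - the answer together with the data and reasoning behind it, so the decision maker can still bring their own expertise to bear.

from dataclasses import dataclass

@dataclass
class Recommendation:
    action: str       # what the aid suggests
    factors: dict     # the raw data it used, not hidden from the decision maker
    weights: dict     # how each factor contributed
    rationale: str    # a short, human-readable account of the reasoning

def opaque_aid(readings: dict) -> str:
    # 'Man behind the Curtain': only the answer comes out.
    score = 0.7 * readings["pressure"] + 0.3 * readings["temperature"]
    return "shut down" if score > 100 else "continue"

def human_window_aid(readings: dict) -> Recommendation:
    # Same logic, but the data and the reasoning stay visible.
    weights = {"pressure": 0.7, "temperature": 0.3}
    score = sum(w * readings[k] for k, w in weights.items())
    action = "shut down" if score > 100 else "continue"
    return Recommendation(
        action=action,
        factors=dict(readings),
        weights=weights,
        rationale=f"weighted score {score:.1f} against a threshold of 100",
    )

readings = {"pressure": 120.0, "temperature": 80.0}
print(opaque_aid(readings))        # just an answer
print(human_window_aid(readings))  # the answer plus the evidence behind it
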
Sue E. Berryman's Cognitive Apprenticeship Model set out 'Five Assumptions About Learning - All Wrong':
1. That people predictably transfer learning from one situation to another.
2. That learners are passive receivers of wisdom - vessels into which knowledge is poured.
3. That learning is the strengthening of bonds between stimuli and correct responses.
4. That learners are blank slates on which knowledge is inscribed.
5. That skills and knowledge, to be transferable to new situations, should be acquired independent of their contexts of use.

First drafts

People are just a source of error that needs to be minimised. The alternative is to recognise that people (also) 'make safety'.
Accidents are usually the result of human error. The alternative is to see human error as an outcome (rather than a 'cause'), a sign that something is wrong with the system (Sidney Dekker).
Safe systems are usually safe. The alternative is that safe systems usually run broken.
Cycle of error (Cook and Woods). After an incident, 'things need tightening up, lessons must be learned'. Organizational reactions to failure focus on human error. The reactions to failure are: blame & train, sanctions, new regulations, rules, and technology. These interventions increase complexity and introduce new forms of failure.
Providing feedback on operational performance can be bad for morale and is best not done.

Rationality/logic/MEU (maximum expected utility) is the benchmark for human decision making. The alternative is "reasoning is not about truth but about convincing others when trust alone is not enough. Doing so may seem irrational, but it is in fact social intelligence at its best" (Gerd Gigerenzer), or "Man is not a rational animal, he is a rationalizing animal" (Robert A. Heinlein).
People are information processors, like computers. The alternative is to recognise the role of narrative, metaphor, etc.
Cognitive biases are useful aids to people making decisions.
Human cognition is a higher mental function, and the lizard brain and emotions should not be involved.
People without emotional influence make better decisions.
Bull (Norman Dixon). Being clean and tidy is vital, whether it is polished brass, dress codes or tidy desks.

The important aspect of human decision making is the 'moment of choice'. Design and operational aspects should be focused on this. The alternatives include a narrative approach (Rao).

Automate what you can and leave the operator to do the rest (job design by left-overs). Supervisory control is a good model for job design. The human-centred automation alternative is to design a human-machine team to avoid cogminutia fragmentosa.
Automation reduces workload.
Automation improves performance.
Automation reduces staffing requirements.
Strong, silent automation is good (Dave Woods). The alternative - automation that makes its activity and mode changes observable - is sketched below.
There are no UNK-UNK (unknown unknown) failure modes (Tom Sheridan), so we do not need to design or plan for them.
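
To make the 'strong, silent automation' point concrete, here is a small hypothetical sketch (a toy speed limiter, not any real system or API). Both controllers behave identically; the only difference is that the second announces its mode changes to the operator instead of changing behaviour silently.

class SilentController:
    # 'Strong, silent' automation: mode changes happen invisibly.
    def __init__(self):
        self.mode = "cruise"

    def update(self, demanded_speed: float) -> float:
        self.mode = "limit" if demanded_speed > 100 else "cruise"
        return min(demanded_speed, 100)

class ObservableController:
    # Same behaviour, but each mode transition is announced to the operator.
    def __init__(self, announce=print):
        self.mode = "cruise"
        self.announce = announce

    def update(self, demanded_speed: float) -> float:
        new_mode = "limit" if demanded_speed > 100 else "cruise"
        if new_mode != self.mode:
            self.announce(f"mode change: {self.mode} -> {new_mode} (demand {demanded_speed})")
            self.mode = new_mode
        return min(demanded_speed, 100)

controller = ObservableController()
for demand in (90.0, 110.0, 95.0):
    controller.update(demand)  # prints each mode transition as it happens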

Regulations, rules and procedures will work as intended without reactive or cumulative effects. Technology improves safety. The alternative is to consider the reactive effects of their introduction, including risk compensation, and to remember that the people at the sharp end make continuing judgments balancing risk, profitability and workload (ETTO, the efficiency-thoroughness trade-off).

People will obey the rules in potentially high hazard systems just because they are there.

Procedures can be expected to cover all circumstances. Risk management can be comprehensive. Things will go according to plan, so it is worth having a really detailed plan, and not investing in preparedness. The alternative is "In preparing for battle I have always found that plans are useless, but planning is indispensable" (Dwight D. Eisenhower).

Providing unnecessary data 'just in case', whether it is a fourteen-page checklist, a handful of alarm channels, or an overfilled tactical display. Planned information overload has adverse consequences (see operator error).
Chartjunk is a good basis for display design. The flows through a system (the 'big picture') can be presented as disjointed bullet points (Tufte).

Work can be divided by procurement or organizational boundaries, leading to stovepipe sub-systems, and the crew doing the 'integration work'.

Training can fix design problems.
