Biases that block effective problem solving.
All the best problem-solving techniques can be defeated by biases that get in the way of clear thinking. And the worst of it is, we often can't tell when we are in the grip of these patterns. There are three we've seen often in our work that have a major impact on technical problem solving.
One is what my wife refers to as "Hot Dog" syndrome. Michael Anissimov calls it "the widespread tendency to categorize oneself as above average." Certainly we techies tend to think of ourselves as a clever lot; this can cause us problems when it encourages us to disregard input from others, to discount the possibility that we made a mistake, to act without due diligence, or, all too often, to underestimate how long it will take us to do something. It also contributes to the oft-cited failing of Not-Invented-Here, where a techie will devote a week to cobbling together a solution to a problem rather than buying a two-hundred-dollar utility that accomplishes the same task.
A countervailing bias is "Cover Your Ass" syndrome (CYA). Since in real life many of the problems a techie is called upon to solve are problems she may have played some role in creating, the need to defend one's reputation really is inherent in every IT crisis. It affects problem solving directly: I really don't want to find out that the reason no credit cards have been processed for three weeks is that I reconfigured the firewall. So I instinctively look for changes that other people, other vendors, or other departments made.
In organizations where blame is a big part of the culture, CYA can have a major effect on problem solving. Organizational culture can play a significant role in keeping people from thinking straight. I was talking to a client the other day who referred to the change log in our application as the "blame table": you go there to see which of your co-workers is to blame. Really striking. I've heard other users, in their in-house training, say "This is how you can find out who knows why a particular change was made." Quite a difference. Be on guard in particular for teams that need to affix blame before attempting to solve the problem.
Another important bias is one Groopman, in his article, called "availability." This bias is more cognitive than emotional, and reflects our tendency to stop at the first two or three possible solutions that come to mind, the ones most available to us. It can mean that it does not occur to us to seek additional information when we should, because we immediately apply one of the first few solutions we thought of.
I'm thinking of a time when I was given an old mainframe printer. I wanted to hook it up to a computer, but it used the pins in the parallel interface differently than the PC did, so it would not work when I connected it. I'm a software guy, so the solution that occurred to me, the one that was available, was to write a routine that hooked into the interrupt handler for the parallel port and changed each character's pin-out as it was transmitted. It worked fine, but it had to be loaded when the computer booted, and it prevented any other printer from being used on that port. What I should have done was make a custom cable. Duh.
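The heart of that software workaround can be sketched as a simple bit-remapping routine. This is a minimal illustration, not the original interrupt handler: PIN_MAP below is an invented assignment, since the actual wiring difference between the PC and the mainframe printer isn't recorded here.

```python
# Hypothetical pin remapping: move each data bit from the position the PC
# drives it on to the position the printer expects. The mapping shown is
# made up for illustration.
PIN_MAP = {0: 3, 1: 2, 2: 1, 3: 0, 4: 7, 5: 6, 6: 5, 7: 4}

def remap_byte(b: int) -> int:
    """Rearrange the data bits of one character according to PIN_MAP."""
    out = 0
    for src, dst in PIN_MAP.items():
        if b & (1 << src):
            out |= 1 << dst
    return out

def remap_stream(data: bytes) -> bytes:
    # The interrupt-handler version did this one character at a time,
    # on every byte written to the parallel port.
    return bytes(remap_byte(b) for b in data)
```

The point of the anecdote stands out in code form: a custom cable performs exactly this remapping in copper, once, with no software to load at boot and no port monopolized.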
What is the value of categorizing these harmful patterns? Giving patterns names can help us recognize them, and for negative patterns or biases, this enhanced recognition can help us avoid falling into them. The first two biases often have feelings associated with them: if I'm sweating bullets while I work on something, I'm vulnerable to CYA; if I'm showing off to my client, I might be a Hot Dog victim. The third is the most insidious, and leaves fewer traces. If we are moving very quickly to a time-consuming or expensive solution with a sense of certainty that we know what to do, or if we are feeling a sense of time pressure, we may be in the availability trap.