The Problem With Constantly Finding Problems

Earlier this week I was challenged about my overuse of the word ‘problem’.

It’s a fair cop – innovation and design types are fond of saying you shouldn’t go looking for great ideas, you should unearth great problems.

If you jump straight to answers two things can happen:

  • You spend too little time on idea generation, experimenting, and thinking.
  • You can miss the root cause entirely and embark on silver bullet solutions to the wrong problem.

Indeed many of our organisations have a bias towards getting quick answers. We favour execution rather than contemplation. Great performance is usually defined as creating and implementing solutions rather than finding the best problems to tackle.

My experience shows me that a lot of leaders simply don’t like problem definition, or even the word problem.

See, people don’t like admitting that organisations, or their departments, even have problems. One of the reasons for this is that it runs counter to the narrative of the heroic leader. Admitting that you don’t even know the problems you face, never mind the solutions to them, is a definite blot on your copybook.

However, there is a problem with obsessive problem seeking.

As Pat McCardle writes, seeing people as a series of problems to be solved can lead to an epidemic of mass fixing. “The expectation is that everything from noisy neighbours, exam stress, misbehaving kids, sadness, unhappiness, everything that we experience as negative in our life, must either be solved by a service, state intervention, or a drug.” As she says, when we have evolved cultures and systems that are only designed to solve problems, we risk focussing on weakness and deficits. We become very efficient hammers searching for vulnerable-looking nails.

Our brains are constantly searching for problems to fix, even as those problems become rarer. When something becomes rare, we tend to see it in more places than ever. This in part explains why people feel the world is getting worse despite almost every measure confirming our planet is safer, happier and less violent than ever.

At an organisational level this presents an issue – as we can unknowingly employ lots of people whose job it is to find problems that either don’t exist or aren’t a priority.

There are lots of examples of this in day-to-day life. David Levari gives us the scenario of a Neighbourhood Watch made up of volunteers. When a new member starts volunteering, they raise the alarm when they see signs of serious crimes, like burglary. Over time, though, the neighbourhood watcher may start to make relative judgments which keep expanding their concept of “crime” to include milder and milder transgressions, long after serious crimes have become rare. The ‘problem’ expands even as the original problem appears to have been solved.

The reason for this, as Daniel Gilbert says, may lie in a phenomenon called “prevalence-induced concept change”. In a series of experiments, he and his colleagues showed that as the prevalence of a problem is reduced, humans are naturally inclined to redefine the problem itself. The result is that as a problem becomes smaller, people’s conceptualisations of that problem become larger, which can lead them to miss the fact that they’ve solved it.
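The mechanism can be caricatured with a toy simulation (entirely hypothetical – this is not the design of the actual experiments). Imagine a judge who flags a signal as a “problem” whenever it looks worse than the average of what they’ve seen recently. When genuine problems become rare, that running average drops, and ambiguous signals that used to pass unremarked start getting flagged:

```python
import random

def simulate(prevalence, trials=2000, window=200, seed=0):
    """Relative judge: flags a signal as a 'problem' if it exceeds the
    mean of recently seen signals. Returns the fraction of *ambiguous*
    signals (severity 40-60) that end up flagged."""
    rng = random.Random(seed)
    recent = []              # sliding window of recently seen severities
    flagged, ambiguous = 0, 0
    for _ in range(trials):
        if rng.random() < prevalence:
            signal = rng.uniform(70, 100)   # genuinely serious case
        else:
            signal = rng.uniform(0, 60)     # benign or ambiguous case
        # threshold adapts to the recent distribution, not a fixed standard
        threshold = sum(recent) / len(recent) if recent else 50
        if 40 <= signal <= 60:
            ambiguous += 1
            if signal > threshold:
                flagged += 1
        recent.append(signal)
        if len(recent) > window:
            recent.pop(0)
    return flagged / max(ambiguous, 1)
```

Under these assumed distributions, `simulate(0.5)` flags only a small fraction of ambiguous cases, while `simulate(0.05)` – where serious cases are rare – flags nearly all of them: the comparison baseline sinks, so the same ambiguous signal is more likely to be classed as a problem, and the concept expands as the problem shrinks.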

In some cases, Gilbert says, prevalence-induced concept change makes perfect sense, as in the case of an Accident and Emergency doctor trying to triage patients. Someone who has sprained an ankle will have longer to wait than someone with a head wound. But on a quiet day the sprained ankle could take precedence over other less serious issues. The context changes the priority of the problem.

In other cases, however, prevalence-induced concept change can be a problem.

As Gilbert outlines: “Nobody thinks a radiologist should change his definition of what constitutes a tumour and continue to find them even when they’re gone. That’s a case in which you really must be able to know when your work is done. You should be able to see that the prevalence of tumours has gone to zero and call it a day. Our studies simply suggest that this isn’t an easy thing to do. Our definitions of concepts seem to expand whether we want them to or not.”

So if you’ve ever faced:

  • The overzealous IT Infosec person who constantly raises security concerns.
  • The Health and Safety team who create more and more training courses for people to complete.
  • The Research team who keep telling you more research and more resource is needed.
  • The Design team who tell you that your latest service needs to go back to problem definition as it hasn’t been implemented correctly.
  • The CEO who wants another change programme.

You could be facing cases of prevalence-induced concept change.

As Gilbert says – anyone whose job involves reducing the prevalence of something should know that it isn’t always easy to tell when their work is done.

This is something our businesses have to get better at, as not knowing when to stop is the prime driver of organisational overreach. But as the studies suggest – simply being aware of this problem is not sufficient to prevent it.

What can prevent it?

That’s another problem.

Related: The Regressive Power of Labelling People As Vulnerable