Intelligent (?) design, evolution, and failure

At a recent management meeting, top leadership spoke eloquently and forcefully about the huge challenges we face from a ‘perfect storm’ combination of our ongoing national financial crisis and the health care reform act with its unknown and largely unknowable changes. They emphasized the need for innovations that are carefully considered, centrally controlled, and rapidly developed and deployed. The underlying theme was: “Major change is inevitable, and mistakes are not an option. We have to get it right the first time.” 

How wrong this is. Not the change part, the mistakes part. As Niels Bohr said: “Prediction is very difficult, especially about the future.”

Planning a response to a known and static situation is difficult enough; any solution should be treated as tentative, with mindfulness of the inevitability of unintended consequences. (See footnote 1.) Planning for a poorly understood, constantly changing, and complex (see footnote 2) situation is not merely hard; it is impossible. In fact, believing one can plan accurately or effectively for the coming changes in our health care systems is delusional.

Planning, even carefully researched and well thought out planning by experts, is greatly overrated. This is why the scientific method, based on formulating a hypothesis and then testing it, is superior to the speculative method, based on untested intuition. The fallibility of planning is at the core of the Deming Cycle, or PDCA (Plan-Do-Check-Act), a process of change that is by definition iterative: every planned action is assumed to be off target and is therefore checked to determine its actual (rather than planned) effect. The need for further change is assumed. (We don’t have posters that say Plan-Do-Succeed or Plan-Do-Relax.) At best, a change is a partial improvement that creates new issues or exposes hidden ones. There is profound truth to the claim that the chief cause of problems is solutions, and we all know that the overwhelming majority of innovations fail.

Since planning and study cannot reliably tell us what will work best or what all the consequences of any change will be, should we abandon all attempts to plan? Absolutely not. But the planning framework should shift. Rather than treating planning as a central process in which administrators and managers without hands-on familiarity with the work try to develop and deploy a limited number of safe and reliable solutions to be applied across the enterprise in a fail-safe manner (the planning delusion), the focus should be on supporting many local experiments at the margins, with careful monitoring to determine what works and what doesn’t. We aren’t intelligent enough to design for the future. We need, instead, to evolve into the future by creating as many small changes (mutations) as possible, preserving and disseminating those that are beneficial and discarding those that fail (natural selection).

In short, it is necessary to have failures, and good to have many failures. The key is: fail often, early, and cheaply.



Footnotes:
1. "The chief cause of problems is solutions." A popular meme on the internet and in IT. I first heard it in German from my father: "Die Hauptursache von Problemen sind Lösungen."
2. Complexity is used here in the mathematical sense, describing any system where small changes have large and often unpredictable effects.

Links to more on this topic: