In the face of uncertainty, buy options.

Yesterday I posted about how the streetlight effect pulls us towards bad choices in systems engineering. Today I’m going to discuss a different angle on the same class of challenges, one which focuses less on cognitive bias and more on game theory and risk management.


In the face of uncertainty, buy options. This is a good rule whether you’re doing whole-system design, playing boardgames, or deciding whether and when to carry a gun.



A useful way to sort the decision challenges we face is into situations of high uncertainty versus low uncertainty. These call for very different adaptations. In a situation of low uncertainty there is a single optimal choice; your effort should go into determining what it is and then executing it as hard and fast as possible. Unless uncertainty rises during execution (for example because you discover you made a serious mistake in your problem analysis), deviation from plan is most likely to be a mistake. Buying options is wasteful.


In a situation of high uncertainty you don’t know what your best choice is up front; there’s a broad range of possible ones that might be optimal, and there may be choices you can’t yet see. In this situation, what you need to do is enable yourself to collect on as many of the options as you can identify and afford to buy. Your hope is to be able to narrow the range of conditions you need to cope with as you learn more.


This dichotomy is so fundamental that it has moral consequences. If, in any role other than military in a war zone, you are ever carrying a gun because you have a high-certainty expectation that you will use it against humans, your life choices have probably gone very badly wrong. On the other hand, carrying a gun as a hedge against uncertainty – for example, if you need to visit someone in a dicey enough part of town that you might have to defend yourself – makes both practical and moral sense.


In yesterday’s example, I described using a Unix SBC rather than an Arduino-class microcontroller as a way to counter-bias against our tendency to underestimate and underweight software-development costs we can’t estimate crisply. It is that; it is also a way to buy options as a hedge against uncertainty about how the whole system should behave. Often we don’t actually know this until field testing. When we get it wrong, the pain of correcting is in direct proportion to the cost of changing the whole system’s behavior; it can be lowered if the controller is chosen for flexibility and low development costs.


This is how “overkill” can save your butt. Suppose you’re right that an Arduino-class chip is sufficient for every scenario you imagine in your planning phase; it may still be the case that you’ll be mugged by a field reality you didn’t anticipate. If you don’t have good judgement about how to hedge against this possibility, your designs will have more than their share of expensive failures.


The hardest part of this lesson to learn is that an early choice to buy hedges against uncertainty does not retrospectively turn into a mistake if you get lucky and everything goes as originally planned. You have to sum over all possible worlds; whether the choice to carry a gun or deploy a more flexible controller was wise depends only on the accuracy of your ex ante risk evaluation, not on whether you actually got attacked or the Arduino-class controller’s firmware actually proved good enough on the first spin.
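The “sum over all possible worlds” reasoning can be made concrete with a small expected-cost calculation. The numbers below are purely hypothetical, chosen only to illustrate the shape of the argument: even when the hedge turns out to be unnecessary in most worlds, it can still be the correct ex ante choice.

```python
# Hypothetical numbers, for illustration only: compare the expected total
# cost of a cheap rigid design versus a pricier flexible one, summed over
# possible worlds.

# Each scenario: (probability, rework cost with the rigid design,
#                 rework cost with the flexible design)
scenarios = [
    (0.70,     0,    0),   # field reality matches the plan
    (0.25,  8000,  500),   # moderate surprise: board respin vs. firmware tweak
    (0.05, 30000, 2000),   # major surprise: redesign vs. software rewrite
]

RIGID_UPFRONT = 1000      # e.g. an Arduino-class controller
FLEXIBLE_UPFRONT = 2500   # e.g. a Unix SBC bought as a hedge

def expected_cost(upfront, cost_index):
    """Sum over all possible worlds: upfront cost plus the
    probability-weighted rework cost in each scenario."""
    return upfront + sum(p * costs[cost_index] for p, *costs in scenarios)

rigid = expected_cost(RIGID_UPFRONT, 0)       # 1000 + 2000 + 1500 = 4500
flexible = expected_cost(FLEXIBLE_UPFRONT, 1) # 2500 +  125 +  100 = 2725
print(f"rigid: {rigid:.0f}, flexible: {flexible:.0f}")
```

Note that in 70% of these worlds the rigid choice looks right in hindsight, yet the hedge still wins on expectation; the evaluation depends on the probabilities assigned up front, not on which world you happen to land in.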


(Yes, I know. Firearms ethics and systems engineering in the same blog post. If you don’t find this amusing, go read something stupid.)

Published on February 18, 2018 05:04


Eric S. Raymond's Blog
