SmoothSpan Blog

For Executives, Entrepreneurs, and other Digerati who need to know about SaaS and Web 2.0.

Substitute Darwinism in an Absence of Good Data

Posted by Bob Warfield on September 16, 2007

I remember that the first time I started thinking about Darwinian Selection, or Darwinism for short (my term for making evolutionary processes and competition work for you), was during a conversation with Harvard Business Review Editor Tom Stewart.  We were discussing the problems of business and how they had changed in relatively recent times.  Stewart meets with CEOs from companies of all sizes, in all industries, all over the world.  When asked what problems face companies today, one he focused on was risk.  Stewart noted that there are really two kinds of risk, which I'll call quantifiable risk and unquantifiable risk.  The business world has lots of tools for dealing with quantifiable risk: if you're willing to spend the money, you can purchase financial instruments that completely hedge it away, and quantifiable risk is what insurance companies are good at dealing with too.  Unfortunately, there are very few tools to help businesses even think about unquantifiable risk.

The nature of unquantifiable risk is that you really have no idea what the likelihood or magnitude of the risk might be; there is no good data to inform your decision-making.  What can you do in the face of unquantifiable risk?  Adopt Darwinism.  Place a lot of little bets, see which ones work, and then double down on what's working.  But keep placing more bets as well, because most problems that suffer from unquantifiability are chaotic, moving targets: what works today can stop cold tomorrow.  This is not unlike the strategy most investors adopt under Modern Portfolio Theory, where diversification saves you from having bet too much in the wrong place.  Many financial decisions involve unquantifiable risk.
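The bet-and-double-down strategy can be sketched in code.  One standard way to model it (my analogy, not anything from Stewart) is as an epsilon-greedy multi-armed bandit: most of the time you back whichever bet has performed best so far, but you reserve a small fraction of your bets for exploration, since the best option can change.  The payoff numbers here are hypothetical.

```python
import random

def run_bets(payoffs, rounds=1000, epsilon=0.1, seed=42):
    """Simulate 'place many small bets, double down on winners' as an
    epsilon-greedy bandit.  `payoffs` are the (unknown to the bettor)
    success probabilities of each bet; returns how often each was chosen."""
    rng = random.Random(seed)
    counts = [0] * len(payoffs)   # times each bet was placed
    totals = [0.0] * len(payoffs) # total payoff earned per bet
    for _ in range(rounds):
        if rng.random() < epsilon or not any(counts):
            # small exploratory bet: pick any option at random
            choice = rng.randrange(len(payoffs))
        else:
            # double down: pick the bet with the best average payoff so far
            choice = max(range(len(payoffs)),
                         key=lambda i: totals[i] / counts[i] if counts[i] else 0.0)
        reward = 1.0 if rng.random() < payoffs[choice] else 0.0
        counts[choice] += 1
        totals[choice] += reward
    return counts

# Three hypothetical programs with unknown success rates; the strategy
# ends up concentrating most of its bets on the strongest one.
counts = run_bets([0.05, 0.10, 0.30])
```

The key property is the one the post argues for: you never learn the true odds, yet by keeping every option alive at low cost you converge on the winners anyway.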

The success or failure of most marketing programs is not quantifiable in advance; it's simply too unpredictable.  You could find yourself with a program that's completely ineffective, or you could luck into one that drives huge traffic.  There are folks out there thinking about this from a portfolio-management standpoint, which is exactly right.

Interestingly, adopting a utility computing architecture that can scale up and scale down is a way of reducing the impact of unquantifiable web-traffic risk on your hosting infrastructure.  Reducing the cost of failure is exactly what these strategies are all about.
