If you’re going to spend money on preparing for disaster recovery, you should know how much to spend. After all, there is no point in spending more on a DR solution than the cost of the IT disaster itself.

That gives us an upper bound, but still leaves us with questions. For example, what is the true cost of an IT disaster? And how does it compare to the costs associated with preparing and maintaining a DR solution?

The obvious dimension for cost is the financial one. A straightforward calculation of potential loss for a given threat is P x C x T, where P is the probability of the threat materialising, C is the cost incurred in lost productivity and revenue per unit of time (a day, for instance), and T is the total time (in the same units) that the system is likely to be down.

So, for example, the probability of lightning striking your facility within the next 12 months might be 10%, based on historical data.

The daily cost might be $100,000, and systems might be down for one week. Overall potential loss in this case would then be 10% x $100,000 x 7 = $70,000.
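The calculation above is simple enough to capture in a few lines. Here is a minimal sketch in Python of the P x C x T formula, using the lightning figures from the example (the function name and structure are illustrative, not part of any standard):

```python
def expected_loss(probability, cost_per_day, downtime_days):
    """Expected loss from a threat: P x C x T.

    probability   -- chance of the threat materialising in the period (0-1)
    cost_per_day  -- lost productivity and revenue per day of downtime
    downtime_days -- expected duration of the outage, in days
    """
    return probability * cost_per_day * downtime_days

# Lightning example: 10% annual probability, $100,000/day, one week down
loss = expected_loss(0.10, 100_000, 7)
print(f"${loss:,.0f}")  # → $70,000
```

Running the same function across a list of threats (fire, flood, ransomware, and so on) and summing the results would give a rough total annual loss exposure to weigh against DR spending.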

On this data alone, yearly expenditure on DR preparation should not exceed $70,000. However, several other factors must be considered.

Costs are not limited to loss of productivity and revenue. Loss of customer satisfaction and loyalty, and damage to brand value and stakeholder value can also represent sizable costs, even if they are more difficult to evaluate.

Employee satisfaction, confidence and engagement may also be affected.

All of this pushes the justifiable limit on DR spending, fixed or variable, upward. In addition, enterprises might choose to spend a little more, knowing that other expenses, such as insurance premiums, may fall in light of proper disaster recovery preparations.

The way forward is to build one or more expenditure models and pick the most realistic one as the basis for calculating what you can reasonably spend on DR.