“CloudOps” is the latest buzzword. Its meaning is simple: the ability to operate workloads, including both applications and data, once they get to the public cloud.
What’s not so simple is figuring out what CloudOps will cost over time, given future changes in technology costs and workloads being added to or removed from the public cloud.
- NW: Number of workloads under CloudOps
- CW: Complexity of workloads (on a scale of 1.01 to 2.0)
- SR: Security requirements (on a scale of 100 to 500)
- MR: Monitoring requirements (on a scale of 100 to 500)
- COM: CloudOps multiplier (on a scale of 1,000 to 10,000), based on resources used, including the cost of cloud services and the cost of people
The typical calculation looks like this:
CloudOps Cost Per Year = ((NW*CW)*COM)+((NW*CW)*SR)+((NW*CW)*MR)
Thus, a typical use case would be:
CloudOps Cost Per Year = ((1,000*1.75)*5,000)+((1,000*1.75)*350)+((1,000*1.75)*250)
That use case’s cost comes to $9.8 million: $8,750,000 + $612,500 + $437,500. It covers the CloudOps budget for a year to operate a thousand fairly complex workloads (CW=1.75), with somewhat above-average security complexity (SR=350), average monitoring complexity (MR=250), and mid-range resource usage (COM=5,000).
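If it helps to play with the numbers, here is a minimal sketch of the formula in Python. The function name is illustrative, not part of any standard tool, and the parameter ranges simply echo the scales defined above:

```python
def cloudops_cost_per_year(nw, cw, sr, mr, com):
    """Estimate annual CloudOps cost.

    nw:  number of workloads under CloudOps
    cw:  workload complexity (1.01 to 2.0)
    sr:  security requirements (100 to 500)
    mr:  monitoring requirements (100 to 500)
    com: CloudOps multiplier (1,000 to 10,000)
    """
    base = nw * cw  # complexity-weighted workload count
    return (base * com) + (base * sr) + (base * mr)

# The worked example from the text: 1,000 workloads, CW=1.75,
# SR=350, MR=250, COM=5,000.
cost = cloudops_cost_per_year(1_000, 1.75, 350, 250, 5_000)
print(f"${cost:,.0f}")  # → $9,800,000
```

Because each term shares the same NW*CW factor, the formula is linear in the number of workloads: doubling NW doubles the annual cost, which makes it easy to re-estimate as workloads are added or removed.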
Don’t worry about this specific formula; because it is so general, your first calculation will likely be off. Instead, consider it a starting point, then customize the formula and your definitions of the correct values over time until you end up with a formula you can wield with assurance. Or at least employ it as a reality check to ensure you account for the operational costs of CloudOps, not only the startup costs.
Too many enterprises don’t get the proper budget needed to operate cloud-based systems effectively, and they may die the death of a thousand cuts due to their underestimated operational costs. Those operations are critical, so you can’t afford to underfund them.