Bernard Golden | Computerworld UK
It’s undeniable that the entire technology industry is shifting to cloud computing. Just as the ’80s was the era of the PC, and the ’90s (and ’00s, too) was the era of the Web, it’s inevitable that the ’10s will be the era of cloud computing.
Endless words have been written about the technology underlying cloud computing. A number of orchestration products joust, each described by its company as the most complete, best-performing product on the market. We’ve seen hybrid cloud products released by every vendor from Borneo to Nome, every one nonpareil in tying together distributed orchestration products. We’ve even seen IBM describe its mainframe products as “truly a cloud” because mainframes, well…compute, I guess.
Of course, this is understandable. Every cloud product is associated with a vendor, and every vendor has to make its numbers. If the technology trend a la mode is cloud, well, then every vendor needs to look au courant. In addition, IT groups love new technology; after all, that’s what they specialise in, and every new trend and product that comes down the pike is a new chance to build expertise and cement their position as the technology nomenklatura.
But cloud computing is a curious phenomenon, because much of the uptake comes from groups that traditionally didn’t drive adoption or even get much involved in infrastructure decisions: application groups and software developers. They’ve embraced cloud computing, particularly Amazon Web Services, with gusto, so much so that central IT has developed new terminology to describe it: shadow IT or, even more witheringly, rogue IT. Anyone who has examined AWS’s trajectory (as I have) can see that it’s experiencing enormous growth.
Vendor and IT organisation embrace of cloud computing is understandable. But why have end users so assiduously adopted it? After all, throughout most of IT history, application groups stood aloof from infrastructure involvement, seeing it as nothing more than plumbing managed by specialists. What’s driving “shadow” IT?
Less is More: As Commodities Get Cheaper, Consumers Stock Up
William Stanley Jevons was a Victorian-era economist who developed theories about marginal value. More specifically, he studied the then-unsettled question of whether a lower price for a commodity would motivate people to shift spending to other commodities. In other words, would they continue to consume the same amount of the commodity and use the savings for other purposes?
Jevons’ test case was coal. As the use of coal became more efficient, meaning that the same amount of work required less coal, overall coal use rose. This went against common sense, which held that if the price dropped, people would have more money to spend on something else. (This was so counterintuitive that it came to be known as the Jevons Paradox.) Far from using less coal to perform tasks associated with its traditional use, people found many new tasks that beforehand could not be cost-justified but were now economical, given coal’s lower effective price.
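The paradox can be sketched numerically with a constant-elasticity demand curve. All figures here are invented for illustration (not Jevons’ own data): the point is that when demand elasticity exceeds 1, consumption grows faster than the effective price falls, so total spend rises.

```python
# Hypothetical sketch of the Jevons Paradox using a constant-elasticity
# demand curve: quantity = k * price**(-elasticity). All numbers invented.

def quantity_demanded(price: float, k: float = 100.0, elasticity: float = 1.5) -> float:
    """Units of coal demanded at a given effective price per unit of work."""
    return k * price ** (-elasticity)

def total_spend(price: float) -> float:
    """Total expenditure: price times quantity demanded at that price."""
    return price * quantity_demanded(price)

# Efficiency gains halve the effective price of a unit of work from coal.
before, after = 2.0, 1.0
print(f"Quantity demanded: {quantity_demanded(before):.1f} -> {quantity_demanded(after):.1f}")
print(f"Total spend:       {total_spend(before):.1f} -> {total_spend(after):.1f}")
# With elasticity > 1, consumption more than doubles when price halves,
# so total consumption (and total spend) rises: the paradox.
```

With an elasticity below 1 the same code shows spend falling as price falls, which is the "common sense" outcome; the paradox is simply the elastic case.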
The Jevons Paradox is widely discussed with respect to cloud computing, thanks primarily to Simon Wardley. While the “true” cost of cloud computing vs. on-premises IT infrastructure is quite controversial, there’s no question that, for short-term resource use, on-demand pricing is far cheaper than the traditional IT cost model of upfront capital expenditure.
Just as Jevons would have predicted, users who found traditional IT pricing burdensome and cloud computing on-demand pricing congenial have begun finding vast new uses for computing, based on the cloud’s lower costs. Applications that could never have been cost-justified under traditional IT economics suddenly become affordable.
Particularly relevant here are the types of applications that in the past could never have been cost-justified: the so-called “systems of engagement”, such as company-run social media campaigns that, because they are not tied to economic transactions, struggle to be justified. With the cost of putting these types of applications in the cloud plummeting, there has been an explosion of systems-of-engagement applications.
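A back-of-the-envelope sketch shows why short-lived workloads like these flip the economics toward on-demand pricing. All figures below are invented for illustration, not actual vendor pricing:

```python
# Hypothetical comparison: upfront capex vs. on-demand pricing.
# All figures are invented for illustration, not real vendor rates.

def on_demand_cost(hours: float, hourly_rate: float) -> float:
    """Pay only for the hours actually used."""
    return hours * hourly_rate

capex = 5000.0            # assumed purchase price of an owned server
hourly = 0.50             # assumed on-demand hourly rate
lifetime = 3 * 365 * 24   # assumed 3-year hardware life, in hours

# Break-even utilisation: below this many hours, on-demand is cheaper
# than buying, because the owned server costs its full price regardless.
break_even_hours = capex / hourly
print(f"Break-even at {break_even_hours:,.0f} hours "
      f"({break_even_hours / lifetime:.0%} of a 3-year life)")

# A two-week social media campaign (a "system of engagement"):
campaign_hours = 14 * 24
print(f"Two-week campaign: on-demand ${on_demand_cost(campaign_hours, hourly):,.2f} "
      f"vs. owned ${capex:,.2f}")
```

Under these assumed numbers a two-week campaign costs a tiny fraction of an owned server, which is exactly the kind of project that could never clear a capital-expenditure hurdle.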
Firms Exist to Reduce Costs, But This Only Goes So Far
Jevons isn’t the only economist who has something to teach us about why cloud adoption is so massive. A second economist is even more important to the adoption explosion: Ronald Coase, whose relevant work, The Nature of the Firm, was published in 1937. (Remarkably, Coase is still alive and kicking at 102 and, one hopes, basking in the renown his work quite rightly deserves.)
In his article, Coase asked a question that hadn’t been examined before: Why do business organisations exist? Why don’t individual actors buy and sell among themselves, using the market to set prices, thereby ensuring prime economic efficiency?
His answer? Transaction costs. The cost of using an external resource includes not only the price of the resource, but also associated costs: Searching (seeking and identifying the right resource), bargaining, protecting trade secrets and so on. It can be more efficient to employ people to provide resources, as the associated costs are avoided and the total resource cost is lower.
However, there are natural limits to this centralising advantage. Coase noted that firms face decreasing returns to the entrepreneur function, including increasing overhead costs and an increasing propensity for an overwhelmed manager to make mistakes in resource allocation.
Here’s a vivid example of how these overhead costs play out in IT. A friend recounted that gaining access to a server required eight separate emails spread across five days, and this for a high-priority project. Set against that, the attractiveness of immediately available resources from a cloud service provider (CSP) is easy to comprehend.
In essence, shadow IT represents the struggle created when a high-transaction-cost environment (manual IT processes) confronts a low-transaction-cost environment (self-service cloud computing). Application groups are choosing cloud computing in hurricane proportions. That choice, even if the low-transaction-cost environment ultimately costs more money, is understandable when one examines the full transaction costs of the internal alternative.
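Coase’s point reduces to a one-line sum: total cost equals resource price plus transaction cost. A minimal sketch, with all figures invented to mirror the email-and-wait anecdote above:

```python
# Hypothetical Coase-style total-cost comparison. The real cost of a
# resource includes the friction of obtaining it, not just its price.
# All figures below are invented for illustration.

def total_cost(resource_price: float, transaction_cost: float) -> float:
    """Coase's insight: sticker price plus the cost of the transaction itself."""
    return resource_price + transaction_cost

# Internal IT: cheaper raw resource, but days of waiting and email ping-pong.
internal_resource = 100.0   # assumed monthly internal chargeback for a server
internal_friction = 400.0   # assumed cost of a 5-day delay, 8 emails, approvals

# Public cloud: pricier raw resource, near-zero friction via self-service.
cloud_resource = 150.0      # assumed monthly on-demand price
cloud_friction = 5.0        # assumed cost of a few minutes of self-service setup

internal = total_cost(internal_resource, internal_friction)  # 500.0
cloud = total_cost(cloud_resource, cloud_friction)           # 155.0
print(f"Internal: ${internal:.2f}  Cloud: ${cloud:.2f}")
# The "cheaper" internal option loses once transaction costs are counted in.
```

The raw-resource comparison (100 vs. 150) favours internal IT; the total-cost comparison reverses it, which is the shadow-IT dynamic in miniature.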
One can predict how this struggle will turn out. Users will overwhelmingly adopt cloud computing, given the overall high transaction costs of the incumbent solution. Moreover, given the lower transaction cost of the new alternative, adoption will be much greater than anyone expects, as users find new ways to apply the technology.
Trying to stave off public cloud computing by demonstrating that the raw cost of the resource is more than if it were provisioned internally is pointless if the overall transaction costs of the internal offering are higher. Far better than running meaningless economic comparisons would be to evaluate the true transaction cost of user options and develop methods to deliver lower overall costs from internal resources. Otherwise, huge amounts of labour (and money) will be wasted on incomplete economic analyses that users ignore, based on their own estimation of overall transaction costs.
The next time you overhear (or participate in) a cloud economics discussion, keep Jevons and Coase in mind, and remember the Jevons Paradox and Coase’s insight about firms and transaction costs.