Eric Knorr | InfoWorld
Why wouldn’t you want a dynamically scalable infrastructure and self-service provisioning? That’s the cloud model in a nutshell — and it has proven itself to the degree where you have to ask why you wouldn’t adopt it, rather than why you would.
There remain reasons not to, of course. Why switch to SaaS versions of enterprise applications, for example, when your old client-server deployment is working just fine? Not to mention all sorts of compliance and security reasons to be conservative about core applications that differentiate a business.
But in 2013, it became clearer than ever that both the private and the public cloud, particularly the latter, provide the platforms of choice if you want to stay competitive. Engaging with customers, partners, and soon the vast array of sensors known as the Internet of things requires an infrastructure that scales like blazes to absorb unpredictable demand and also provides a platform for continuous experimentation.
A whole lot happened in 2013. But I’m keeping this list of lessons learned short in order to shine the spotlight on what mattered most:
1. CMOs really like the public cloud. No doubt you’ve heard that marketing departments everywhere are going outside internal IT to deploy Web and mobile apps to engage with customers. Often, they’ll turn to an agency that will build those apps on a PaaS (platform as a service), perhaps with big data analytics on the back end, because internal IT lacks the time, inclination, or skills to build such systems of engagement. That can work well in cases where IT is involved in planning and management. When it happens willy-nilly, it can result in a big mess.
2. The hybrid model is getting real. The dream of the cloud has long been to have the public cloud serve as an extension of internal infrastructure. In practice, “bursting” to the cloud tends to be impractical. But if you can at least manage local and public cloud resources as a single pool, it lightens the load on IT. One of 2013’s surprises has been how aggressively Microsoft has moved in this direction, with Windows Server 2012 and System Center providing a widening conduit to Azure resources. VMware is not as far along but plans a similar approach. And of course one of the major points of OpenStack has been to establish a common framework for both public and private clouds.
3. Data sovereignty really matters. The NSA scandal may not be deterring U.S. companies from moving to the public cloud, but there’s little question Europe is up in arms over the idea of American cloud providers giving backdoor access to data stored by European companies. Every cloud provider I’ve talked to this year has cited this as a potential inhibitor to business. It’s another complicating factor on top of data governance regulations that vary from country to country.
4. The Internet of things is a cloud thing. The drumbeat started last April when the VMware spin-off Pivotal launched as a next-generation PaaS, backed in part by a $105 million investment from GE, which is busily embedding millions of sensors across a broad swath of industrial products. A key component of the platform is the GemFire event processing software, meant to handle telemetry from all those sensors. In November, Amazon added its Kinesis service and Salesforce announced its Salesforce1 integration platform, both of which can be used for similar purposes.
Though these takeaways rose to the top, others surfaced: For example, the fact that we are hurtling toward software-defined infrastructure, which already powers the big cloud service providers and will eventually reach a private cloud near you. The vital importance of cloud identity management and cloud integration became clearer, as did the fact that the private cloud remains very hard for internal IT to pull off. And despite NSA snooping, fears about public cloud security and availability are abating. Like it or not, you’re living in the cloud era.