Ruven made a really great post about cloud interoperability today:
In my never-ending quest to speak to every venture capitalist on the planet, I find myself in Boston this evening, pondering something I said in a few of my investor meetings today.
In one of my famous off-topic VC rants, I described my vision for a unified cloud interface using an analogy of the Internet's self-governing model as the basis of an adaptive enterprise cloud. Funny as it may sound, this is the first time I've used this particular analogy.
The Internet itself would appear to be the perfect model for a "cloud platform". The Internet uses a self-governing model: there is no single administrative entity, and the network must continue to operate in the event of critical failures. By design, the web's core architecture assumes there will be sporadic global failures and can fail gracefully without affecting the Internet as a whole. The Internet exists and functions because millions of separate services and network providers work independently, using common data transfer protocols to exchange communications and information with one another (which in turn exchange communications and information with still other systems). There is no centralized storage location, control point, or communications channel for the Internet. The very decentralized architecture of the Internet has allowed it to adapt and evolve over time, almost like a living organism.
One of the reasons the Internet works is its open communication protocols, which by their very nature form the ideal model for a cloud coordination tool - exactly the kind of system that can automate the routine 99 percent of computer-to-computer interactions you'd want in a cloud platform. But there is a catch: protocols automate interoperability only if all core Internet service providers agree to use the same ones. (Enter Cloud Interoperability)
Open standards are key to the Internet's composition and are a core component of interoperability within a truly distributed command and control structure. The Internet works because anyone who wants to create a Web site can freely use the relevant document and network protocol formats (HTML, HTTP, etc.), secure in the knowledge that anyone who wants to look at their Web site will expect the same formats. An open standard serves as a common intermediate language - a simplifying approach to the complex coordination problem of allowing anyone to communicate successfully with anyone else.
My random thought for this evening.
I posted a reply on his blog, as I've given this particular question a lot of thought myself, so I'm reposting it here. First, the idea kind of closes in on IBM's universal business adapter ad.
The idea is good, but there is a flaw in your logic: the Internet is in fact the exact opposite of a cloud. Cloud economics benefit first and foremost from centralization of resources (and the resulting federation of interfaces to access those resources). The Internet gains its benefit from the exact opposite force, its incredible distribution, and the protocols built on it are designed to seek that advantage. Clouds standardize everything, and in so doing increase operational efficiency and reduce cost of operations. Internet standards at core are about interoperability, with operational efficiency and performance definitely taking a backseat. So the overall needs of the protocols are vastly different.
You're talking about building some kind of standard for service which would eventually cover the world with high-efficiency datacenters that could be accessed easily on a lease basis, and which would enable you to reach any audience on earth with an IP address within 2 hops. It's what I've been calling the Cloud of Clouds strategy: your "cloud platform." There are a lot of attempts to get there, but they kind of fall into:
- Master API Strategy: build a "federate this crap" api, so that you can abstract clouds for applications (Eucalyptus)
- Company-in-the-Middle Strategy: build a "cloud management platform" (RightScale, many others), many of which are similar to Master API Strategy; key differentiator is that value-added components are core (monitoring, management, etc)
- Master Plan Strategy: attempt to build a spec for a cloud of clouds (us, right now), which presumably would lead to protocol standardization, which presumably would lead to easy feature-set commoditization and mashups with cloud providers (nice idea)
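To make the Master API Strategy concrete, here's a minimal sketch of a "federate this crap" layer: one abstract interface, per-provider adapters behind it, and applications that never touch a provider API directly. All the names (`CloudProvider`, `FakeCloud`, `MasterAPI`) are invented for illustration; they are not Eucalyptus's or any vendor's actual API.

```python
from abc import ABC, abstractmethod


class CloudProvider(ABC):
    """The federation contract: every cloud adapter implements the same
    minimal verbs, however different its native API is underneath."""

    @abstractmethod
    def launch(self, image_id: str) -> str: ...

    @abstractmethod
    def terminate(self, instance_id: str) -> None: ...


class FakeCloud(CloudProvider):
    """Stand-in adapter for a real provider (EC2, a private MSP cloud, etc.).
    A real adapter would translate these calls into provider-specific ones."""

    def __init__(self, name: str):
        self.name = name
        self.instances = {}  # instance id -> image id
        self._next = 0

    def launch(self, image_id: str) -> str:
        self._next += 1
        instance_id = f"{self.name}-{self._next}"
        self.instances[instance_id] = image_id
        return instance_id

    def terminate(self, instance_id: str) -> None:
        del self.instances[instance_id]


class MasterAPI:
    """The abstraction applications code against: it routes each call to
    whichever registered cloud the caller names."""

    def __init__(self, clouds: dict):
        self.clouds = clouds

    def launch(self, cloud: str, image_id: str) -> str:
        return self.clouds[cloud].launch(image_id)

    def terminate(self, cloud: str, instance_id: str) -> None:
        self.clouds[cloud].terminate(instance_id)


api = MasterAPI({"east": FakeCloud("east"), "west": FakeCloud("west")})
iid = api.launch("east", "ubuntu-8.04")
print(iid)  # -> east-1
```

The point of the sketch is where the coupling lives: the application depends only on `MasterAPI`, so adding a new cloud means writing one adapter, not rewriting applications.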
The Cloud API is a needed component of this system, and it has led to some interesting discussions for me. Scott Mattoon (Western Regional Architect at Sun, and all-around good guy) and I had a long conversation about whether Sun should just build an Amazon-compatible (bug-compatible) API for their cloud services. I think yes: "inventing" standards for clouds is premature in my opinion, but some form of interoperability between clouds is necessary, and the simplest solution will almost always beat the more elegant one, if it exists.
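What an "Amazon-compatible API" amounts to is a thin shim that exposes EC2-style verbs on top of a provider's native interface, so tools written against Amazon's API work unmodified. The sketch below is purely illustrative: `ProviderNative` and its method names are invented, not Sun's (or Amazon's) actual interface, and real bug-compatibility would also mean reproducing Amazon's quirks and error behavior.

```python
class ProviderNative:
    """Hypothetical native API of some cloud provider -- the names here
    are invented for illustration."""

    def __init__(self):
        self.vms = {}  # vm id -> template
        self._n = 0

    def create_vm(self, template: str) -> str:
        self._n += 1
        vm_id = f"vm-{self._n}"
        self.vms[vm_id] = template
        return vm_id

    def destroy_vm(self, vm_id: str) -> None:
        del self.vms[vm_id]


class EC2CompatShim:
    """Translate EC2-style calls (run/terminate instances) onto the
    provider's native verbs, so EC2 clients need no changes."""

    def __init__(self, native: ProviderNative):
        self.native = native

    def run_instances(self, image_id: str, count: int = 1):
        return [self.native.create_vm(image_id) for _ in range(count)]

    def terminate_instances(self, instance_ids):
        for iid in instance_ids:
            self.native.destroy_vm(iid)


shim = EC2CompatShim(ProviderNative())
ids = shim.run_instances("some-image", 2)
print(ids)  # -> ['vm-1', 'vm-2']
```

This is the "simplest solution" argument in code: adopting an existing de facto interface is a shim, while inventing a new standard is a committee.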
You also have to consider which clouds you're talking about when you talk about standardization, and I think this might help clarify. There are going to be industry-specific (vertical) clouds, clouds for internal deployment, multi-company internal clouds, MSP clouds, enterprise clouds, low-end consumer clouds, and a whole lot more. I don't imagine it will be that much different from the current MSP/hosting market. So when we talk about interoperability standards, the question in my mind is, "standards for whom?" Some clouds will be built for compliance reasons (medical and finance businesses have very specific requirements for geo-location of servers, which I believe will eventually lead to GPS units in servers reporting their positions on a per-request basis), and remember that compliance drives 1 in 10 IT bucks, so this may end up being a MAJOR market component. Most clouds I'm aware of today are built by MSPs for consumers. My point, however, is that each kind of cloud is going to have different interoperability needs, and the risk of market fragmentation for cloud computing is very high, especially given its weak term-definition start.
These types of standards are not inevitable, but some of their functionality is, because customers are asking for it. Customers are talking to me about cloud failover, which requires interoperability; therefore, at least two clouds will soon be connected for HA applications (spread the risk around). Customers want low switching costs between cloud vendors so they can take advantage of competition, but this is something only the very leading edge is thinking about right now. Customers want to contain risk, so a lot of applications I've seen are running internally on hardware, with "cloud bursting" being a demand (lots of technical challenges here). As long as customers are demanding things like this, we're going to see the rise of standards of some kind.
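The failover demand customers are voicing can be sketched in a few lines: try clouds in priority order and land the application on the first one that's up. The function names here are hypothetical, and the sketch quietly assumes the hard part, namely that images and data are portable across the two clouds, which is exactly the interoperability gap.

```python
class CloudError(Exception):
    """Raised when a cloud can't take the deployment."""


def deploy(cloud_name: str, healthy: bool) -> str:
    """Stand-in for a provider-specific deploy call; a real client would
    go through each provider's own (today, incompatible) API."""
    if not healthy:
        raise CloudError(f"{cloud_name} unavailable")
    return f"running on {cloud_name}"


def deploy_with_failover(clouds) -> str:
    """Try each (name, healthy) cloud in priority order; return the first
    successful placement, or raise with every failure collected."""
    errors = []
    for name, healthy in clouds:
        try:
            return deploy(name, healthy)
        except CloudError as exc:
            errors.append(str(exc))
    raise CloudError("; ".join(errors))


# Primary is down (thank you, Amazon!), so the app lands on the backup.
placement = deploy_with_failover([("primary-cloud", False), ("backup-cloud", True)])
print(placement)  # -> running on backup-cloud
```

Note that nothing in the loop is standards-specific: a brokering API that hides the per-provider differences is all this use case actually requires.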
Ok, so a few points here:
- It's premature to be trying to build standards, but cloud of clouds is very likely
- The simplest solution will win over the standardized solution (probably based on one of those three strategies)
- The market will fragment to serve tons of different opportunities as cloud computing becomes buzzword-compliant, and then consolidate as it matures
There is one more interesting thing: a standardized API eliminates a company's ability to strategically differentiate its cloud computing offering. This reduces the ability of companies to charge a premium for differentiated product features, and I think it is going to be a major hurdle that has to be overcome if we're going to see federation. If, as I believe, features continue to walk up the stack in the cloud, and we see more and more features (like storage, load balancing, etc.) going into the APIs to reduce implementation complexity and increase operating yield, there is actually an *increasing* motivation for cloud providers to continue to encourage isolationism and lock-in, as much of their revenue will come from the use of these differentiated features.
There is, for example, right now an opportunity for someone to come into the cloud computing market and build a very large, high-performance cloud with guaranteed uptime, failover, and extreme security and segmentation for isolation of companies, and aim it directly at business customers. It's unlikely that anyone who did that would want to tear down the walls around their garden, which is what the cloud computing community wants them to do: it's a very real conflict of interest between those of us who use the services and those who provide them. Only smaller players seeking market share will enlist in interoperability efforts, which tends to limit their effectiveness. It's the same in many other markets: I've been doing a lot of work in the CDN market lately, and it's Akamai and everyone else (though that market is finally starting to show signs of collapse, so my argument might work against me).
Bert's (from 3tera) idea that mashups are the unexpected benefit of layered standardization is an interesting one, as usual. If you look at the evolution of those protocols, they came up as they were needed, and in fact took years to evolve. Cloud needs are much simpler at the moment, I think. We really just need cloud compatibility layers (however that is achieved). If the end goal is standardization of interfaces and usage, and not standards per se, doesn't one of the above strategies (Master API, Master Plan, Company-in-the-Middle) adequately address the real market pains of customers, without swimming upstream against the need for strategic differentiation of cloud offerings and the problem of being at such an early stage of the market?
I'm not questioning your post: it's excellent. But I do think it's not entirely consistent with Occam's razor, which I believe holds pretty much most of the time in the technology market. Today, customers want to connect two clouds for failover because they don't trust clouds not to go down (thank you, Amazon!), and then their burning pain will be satiated. Does that scream brokering API or cloud standards? To me, it's API. And if that's the case, the protocols will evolve automatically toward larger and larger APIs, which will lead eventually to standardization for those who wish to access those customers and participate in those API communities. I'm arguing that cloud standards shouldn't be revolutionary, because that's not their natural course, but evolutionary, and we should let the market figure it out.
I'm not hung up on that viewpoint, but formal standardization often follows de facto vendor standards, because customers do.