The time has come to change the way we create, develop and ship applications. At Docker, we believe it should be quick and painless to ship application workloads across environments (dev, test and production) and hosts (laptops, data centers and clouds). Docker container technology is quickly pushing us toward that reality.
San Francisco is a city surrounded: in the morning by sparkling water, sunny weather and the sound of seagulls; in the evening by bone-chilling wind and impenetrable fog; at night by the ambient electricity of imagination and the fertile dreamscapes of shining tomorrows.
A couple of months ago we acquired the team behind ZeroVM, the lightweight open source application hypervisor. At the time, we promised that more was coming soon – and now we can begin to share some of our plans for this new technology.
Last year, in my 2013 cloud predictions, I focused on Big Data and the rise of cloudy SSDs. Those predictions became reality: in 2013, Rackspace launched new Performance Cloud Servers with SSD storage, and businesses everywhere are enjoying the benefits of analyzing and extracting true value from critical data sets of all shapes and sizes. And it wasn’t just Rackspace; several other cloud providers followed suit with solid-state storage offerings of their own.
Developers have been buzzing lately about how virtualization containers can boost scale while lowering costs. We are big fans of containers and the ways that they simplify the deployment and management of cloud applications. We think the next step is containerizing and virtualizing the application, not just the machine.
There are a lot of good reasons to be excited about containers, a form of operating system-level virtualization with many uses. At Mailgun, we’re excited about containers for four major reasons: