The time has come to change the way we create, develop and ship applications. At Docker, we believe it should be quick and painless to ship application workloads across environments (dev, test and production) and hosts (laptops, data centers and clouds). Docker container technology is quickly pushing us toward that reality.
At Rackspace::Solve San Francisco last week, Docker CEO Ben Golub took the stage to discuss the future of applications. In his talk, Golub explained how Docker uses container technology to change the way developers build, ship and run applications — and how that shift can cut the time it takes to launch an app from weeks to minutes.
There are a lot of good reasons to be excited about containers, a form of operating system-level virtualization with many practical uses. At Mailgun, we’re excited about containers for four major reasons:
The concept of virtualization is what makes cloud computing possible, and plenty of people are familiar with machine-level virtualization hypervisors such as Xen, KVM and VMware. Machine-level virtualization is an intuitive abstraction for many enterprise computing workloads because it makes one server look like many servers. Yet interest in operating system-level virtualization is growing, and new projects are emerging to take advantage of this technology's unique properties for cloud computing.