We’ve been hearing a lot lately from customers who are frustrated by the limitations of one-size-fits-all clouds, whether they’re based on public cloud, private cloud or bare metal servers. These customers want each of their workloads to run where it runs best and most cost-effectively. And that’s what we at Rackspace work to deliver to them, through our hybrid cloud.
Academic and scientific research often involves the construction of mathematical and numerical models to solve scientific and engineering problems. Traditionally, these complex and computationally intensive models have been implemented on supercomputers or high-performance computing (HPC) infrastructure. These systems are difficult to set up and operate, and can create a painful experience for researchers, who often have to wait in a long line to use their university’s supercomputing infrastructure, whether it’s for a few hours or a few days.
Kevin Jackson recently joined Rackspace as a senior solutions architect. Kevin, who is based in the UK, has worked with OpenStack environments for nearly two years, and actually wrote the book on OpenStack – “OpenStack Cloud Computing Cookbook.” Here, we get to know more about Kevin, what drew him to OpenStack and how he became a Racker.
In every conversation I have with enterprise cloud customers, they use the words public, private and hybrid to describe the projects or phases of projects they’re considering. It’s fantastic that people are finally thinking about how they will use the different types of resources available. But this mindset of “many clouds” in your enterprise is flawed. I’m not the first person by a long shot to say this, but it bears repeating because most enterprises aren’t listening.
Since the OpenStack-powered Rackspace Private Cloud Software launched in August, thousands of organizations have downloaded the software. We’ve seen downloads from more than 125 countries spanning all seven continents – yes, even Antarctica. And at least 25 percent of the Fortune 100 and nearly 100 colleges, universities and research centers have downloaded the software.
We want to take the headache out of deploying and using Big Data solutions. And today, through a strategic partnership with Hortonworks, a leader in Apache Hadoop development, implementation, support, operations and training, we will do just that.
CloudU Notebooks is a weekly blog series that explores topics from the CloudU certificate program in bite-sized chunks, written by me, Ben Kepes, curator of CloudU. How-tos, interviews with industry giants and the occasional opinion piece are what you can expect to find. If that’s your cup of tea, you can subscribe here.
We’ve received a bunch of questions since our launch of Rackspace Private Cloud Software (code-named Alamo) about how we are using open source software. How is Alamo licensed? Are we adding restrictions to the components? What happens to the components if you stop using Alamo? And many more. If you have these same questions, this blog post is for you!
This week, Rackspace opened our latest data center in Australia. You might think this marks our entry into the country, but we have actually been there for some time. Throughout our history, a large number of Australian companies have served their customers and markets from our other data centers, particularly our Hong Kong facility. In 2009, we set up a permanent team of Rackers in Australia to serve those customers, led by the awesome Mark Randall (@racker_randall). With this investment, and the rollout of our cloud portfolio, demand from the market for a local data center grew rapidly; hence the opening of the new data center this week.