Cloud Big Data: All Articles


- Getting Started -

When you're creating new Rackspace cloud resources, it's important to understand what a region is and how to use regions effectively. What is a region? A region is a collection of one or more data centers interconnected by a...
Filed in: Cloud Servers, Managed Operations, Cloud Sites, Cloud Backup, Cloud Files, Cloud Load Balancers, Cloud Databases, Cloud Block Storage, RackConnect, Cloud Queues, Cloud Big Data, Cloud Orchestration, Auto Scale
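Because each service endpoint is tied to a region, one practical way to see your regions is to inspect the service catalog returned at authentication time. The sketch below is a minimal illustration, assuming the Keystone v2.0-style token response that the Rackspace identity service has historically returned; verify the field names against your own account's catalog.

```python
# Minimal sketch: list the API endpoints available in one region.
# Assumes a Keystone v2.0-style token response; field names may differ.
import requests

IDENTITY_URL = "https://identity.api.rackspacecloud.com/v2.0/tokens"

def endpoints_for_region(username, api_key, region="DFW"):
    payload = {
        "auth": {
            "RAX-KSKEY:apiKeyCredentials": {
                "username": username,
                "apiKey": api_key,
            }
        }
    }
    catalog = requests.post(IDENTITY_URL, json=payload).json()
    results = {}
    for service in catalog["access"]["serviceCatalog"]:
        for endpoint in service["endpoints"]:
            # Region-agnostic services omit the "region" key entirely.
            if endpoint.get("region", region) == region:
                results[service["name"]] = endpoint["publicURL"]
    return results

if __name__ == "__main__":
    for name, url in endpoints_for_region("myuser", "myapikey", "ORD").items():
        print(f"{name}: {url}")
```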

Apache

The Rackspace Cloud Big Data Platform provides a scalable, robust, and complete Hadoop cluster in just a few clicks. All Cloud Big Data deployments are backed by Hortonworks Data Platform (HDP). Using HDP enables Cloud Big Data to...
Filed in: Cloud Big Data
Rackspace Cloud Big Data Platform is a new public cloud offering that leverages the Hortonworks Data Platform (HDP) and OpenStack. Users can quickly deploy a full HDP stack and scale the solution simply by adding new nodes on the...
Filed in: Cloud Big Data
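The teaser above mentions scaling a cluster by adding nodes, which is done through the Cloud Big Data REST API. The snippet below is only a sketch of that pattern: the route and request body are assumptions modeled on the OpenStack-style "action" convention, not taken from the excerpt, so consult the Cloud Big Data API reference for the authoritative call.

```python
# Sketch only: grow a Cloud Big Data cluster by issuing a resize action.
# The endpoint path and request body are assumptions -- check the
# Cloud Big Data API reference for the real route and payload.
import requests

def resize_cluster(endpoint, token, cluster_id, node_count):
    """Ask the service to scale the cluster to `node_count` data nodes."""
    url = f"{endpoint}/clusters/{cluster_id}/action"   # assumed route
    body = {"resize": {"nodeCount": node_count}}       # assumed payload
    resp = requests.post(
        url,
        json=body,
        headers={"X-Auth-Token": token, "Content-Type": "application/json"},
    )
    resp.raise_for_status()
    return resp.json()
```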
All accounts, by default, have a preconfigured set of thresholds (or limits) to manage capacity and prevent abuse of the system. The system recognizes two kinds of limits: rate limits and absolute limits. Rate limits are thresholds that are reset...
Filed in: Cloud Big Data
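Because rate limits reset over time, a client should expect some requests to be rejected and retried later. The retry loop below is a generic sketch: it assumes over-limit requests come back as HTTP 413 or 429, optionally with a Retry-After header, which is the common OpenStack-style behavior; adjust it to match the actual responses you observe.

```python
# Generic retry sketch for rate-limited API calls.
# Assumes over-limit responses arrive as HTTP 413 or 429, optionally
# with a Retry-After header -- adjust to match the actual service.
import time
import requests

OVER_LIMIT = {413, 429}

def get_with_retry(url, headers=None, max_retries=5):
    delay = 1
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers)
        if resp.status_code not in OVER_LIMIT:
            return resp
        # Honor the server's hint if present, otherwise back off exponentially.
        wait = int(resp.headers.get("Retry-After", delay))
        time.sleep(wait)
        delay *= 2
    raise RuntimeError(f"still rate limited after {max_retries} attempts")
```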
Cluster: A group of servers (nodes). In Cloud Big Data, the servers are virtual. HDFS: The Apache Hadoop Distributed File System. This is the default file system used in Cloud Big Data. MapReduce: A framework for performing...
Filed in: Cloud Big Data
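To make the MapReduce entry in the glossary concrete, here is the classic word-count job written as a Hadoop Streaming mapper and reducer in one Python script. The script itself is generic; the streaming jar path in the usage comment is an assumption and varies by HDP version.

```python
#!/usr/bin/env python
# Word count as a Hadoop Streaming mapper/reducer pair in one script.
# Run the same file in both roles, e.g. (jar path varies by HDP install):
#   hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming.jar \
#       -files wordcount.py \
#       -mapper "python wordcount.py map" \
#       -reducer "python wordcount.py reduce" \
#       -input /data/input -output /data/output
import sys

def mapper():
    # Emit "word<TAB>1" for every word on stdin.
    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

def reducer():
    # Input arrives sorted by key, so counts for a word are contiguous.
    current, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word == current:
            count += int(value)
        else:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```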
After you have successfully created a new Cloud Big Data cluster, you need to get your data into the cluster so that you can put Hadoop to work. You can use many methods to accomplish this, but the method that you choose will...
Filed in: Cloud Big Data
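Following on from the teaser above, the most direct method is to copy local files into HDFS with the stock `hadoop fs -put` command from a node that already has the Hadoop client configured. The snippet below simply wraps that command so the upload can be scripted; the local and HDFS paths are placeholders.

```python
# Sketch: push local files into HDFS by shelling out to `hadoop fs -put`.
# Run this on a node that already has the Hadoop client configured.
# The local and HDFS paths below are placeholders.
import subprocess

def put_into_hdfs(local_path, hdfs_dir="/user/analytics/incoming"):
    # Create the target directory if it does not exist, then upload.
    subprocess.run(["hadoop", "fs", "-mkdir", "-p", hdfs_dir], check=True)
    subprocess.run(["hadoop", "fs", "-put", local_path, hdfs_dir], check=True)

if __name__ == "__main__":
    put_into_hdfs("/tmp/weblogs-2014-07.log")
```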