Virtualization

Why Infrastructure/Virtualization For Big Data?



Hadoop is a modern distributed application framework with features such as job consolidation and high availability (HA) that complement the capabilities enabled by virtualization. Virtualizing Hadoop avoids standing up dedicated hardware for every individual Hadoop cluster required by business users, developers, and testers. Instead of dedicating a complete hardware cluster to a single use, virtualization lets organizations pool hardware resources and share them among multiple user groups.

Product/Solution:

Case Studies:


EMC and VCE: Optimized Infrastructure For Hadoop

Organizations that have built Hadoop and adjacent solutions on commodity hardware are looking to transition to next-generation infrastructure because of requirements around mission-critical availability and performance, interoperability with the rest of the enterprise architecture, and privacy and security. By deploying Big Data applications on Vblock systems, businesses can standardize on optimized infrastructure across data centers while choosing the right mix of compute, network, and storage for their big data and analytics use cases.

 


EMC Hybrid Cloud Use Case: Deliver Hadoop-as-a-Service

The EMC Hybrid Cloud (EHC) Solution, announced at EMC World 2014, is an end-to-end reference architecture based on a Software-Defined Data Center design comprising technologies from across the EMC federation of companies: EMC storage and data protection, the Pivotal CF Platform-as-a-Service (PaaS) and Pivotal Big Data Suite, VMware cloud management and virtualization solutions, and VMware vCloud Hybrid Service.

 

EHC leverages these tight integrations across the Federation so that customers can extend their existing investments with automated provisioning and self-service, automated monitoring, secure multi-tenancy, chargeback, and elasticity to address the requirements of IT, developers, and lines of business.

 

EHC’s Hadoop-as-a-Service, the underpinning of a Virtual Data Lake, was demonstrated at last week’s VMworld 2014 in San Francisco.

 


Quickly Deploy, Run, and Manage Hadoop With VMware Big Data Extensions

VMware vSphere Big Data Extensions (BDE) is a feature within vSphere that supports Big Data and Hadoop workloads. BDE provides an integrated set of management tools to help enterprises deploy, run, and manage Hadoop on the vSphere platform. Through the vCenter user interface, enterprises can manage and scale Hadoop clusters seamlessly.
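
Under the covers, BDE ships with the Serengeti command-line interface, which can drive the same deployments from scripts. The session below is only a rough sketch: the management-server address, cluster name, and node-group name are placeholders, and exact command syntax varies by BDE release.

    # Connect to the BDE (Serengeti) management server (placeholder host and port)
    connect --host bde-mgmt.example.com:8443

    # Create a Hadoop cluster from the default template using a named distribution
    cluster create --name demo_cluster --distro apache

    # List clusters and their node groups to confirm the deployment
    cluster list --detail

    # Scale out the worker node group (node group name is a placeholder)
    cluster resize --name demo_cluster --nodeGroup worker --instanceNum 10

Because each node group is just a set of virtual machines on vSphere, scaling out is a matter of provisioning more VMs from the pooled hardware rather than racking new servers.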


EMC Hadoop Starter Kit: Step-by-Step Guide To Deploy Hadoop in Minutes Using VMware BDE and EMC Isilon For HDFS Storage

The EMC Hadoop Starter Kit is a step-by-step guide to deploying a Hadoop cluster in minutes using VMware BDE and EMC Isilon HDFS-enabled storage. Watch the demo below.

 

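Once the Starter Kit cluster is up, a quick sanity check is to confirm that the compute nodes point their default filesystem at the Isilon cluster (via its SmartConnect zone name) rather than at virtual DataNodes. A minimal sketch, assuming a placeholder zone name isilon-hdfs.example.com and the default HDFS port 8020; adjust both for your environment:

    # Show the configured default filesystem; with Isilon serving HDFS this should
    # resolve to the SmartConnect zone name (output shown is illustrative only)
    hdfs getconf -confKey fs.defaultFS
    #   hdfs://isilon-hdfs.example.com:8020

    # Exercise the Isilon-backed namespace with a simple write and listing
    hadoop fs -mkdir -p /tmp/hsk-smoketest
    hadoop fs -ls /tmp

With Isilon providing HDFS, the virtual Hadoop nodes carry compute only, so clusters can be created and torn down without moving the data.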