Using Hadoop for Big Data Acceleration

In this video, D.K. Panda of Ohio State University presents "Accelerating Big Data with Hadoop (HDFS, MapReduce, and HBase) and Memcached."

"The SuperMUC has 147,456 cores and a peak performance of about 3 petaflop/s. The main memory will be 288 terabytes together with 12 petabytes hard disk space based on the GPFS file system. The system will use 18,432 Intel Xeon Sandy Bridge-EP processors running in IBM System x iDataPlex servers. It will also use a new form of cooling that IBM developed, called Aquasar, that uses hot water to cool the processors, a design that should cut cooling electricity usage by 40 percent, IBM claims."

Recorded at the 2013 HPC Advisory Council Switzerland Conference.

Learn more at: http://www.hpcadvisorycouncil.com/eve...

