
Preparing for Hadoop Certification with Hortonworks



Before you embark on getting certified for Hortonworks, you need to have some basics. You need to know Linux, Java (very helpful), SQL, and some basic scripting and programming. Knowledge of JDBC, compression, Python, and Scala wouldn't hurt.

Tools and technologies to know and work with: YARN, Tez, HDFS, Hive, Pig, Ambari, Flume, Linux, SSH, HQL, Sqoop, SQL, HCatalog, SerDes, and REST APIs.

Step 1: Make sure you are working with the Hortonworks Sandbox in a VM. Go through the tutorials and the documentation.

Step 2:  Take the practice exam in an AWS environment.

Step 3:  Read and follow this guide.

Step 4: Practice some data ingest with a few different data formats (Avro, CSV, Parquet, JSON, ORC) and compression codecs.
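For format and compression practice, a useful drill is converting a delimited text table into a compressed columnar one entirely in Hive. The table and column names below are hypothetical; run something like this in the Hive shell on the Sandbox:

```sql
-- Hypothetical table/column names, for practice only.
-- A CSV-backed source table.
CREATE TABLE drivers_csv (
  driver_id INT,
  name      STRING,
  hours     INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

-- Copy the same data into a Snappy-compressed ORC table.
CREATE TABLE drivers_orc
STORED AS ORC
TBLPROPERTIES ("orc.compress" = "SNAPPY")
AS SELECT * FROM drivers_csv;
```

Repeat the exercise with `STORED AS AVRO` and `STORED AS PARQUET` so you can move data between all the formats the exam may ask about.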

Step 5:  Transform some data with Pig scripts.
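A transform you should be able to write from memory is load, filter, order, store. The input path and field names here are hypothetical; run a script like this with `pig -x tez script.pig`:

```pig
-- Hypothetical input path and schema, for practice only.
drivers = LOAD '/user/hive/drivers.csv'
          USING PigStorage(',')
          AS (driver_id:int, name:chararray, hours:int);

-- Keep only the busy drivers, sorted by name.
busy    = FILTER drivers BY hours > 40;
by_name = ORDER busy BY name;

STORE by_name INTO '/user/hive/busy_drivers' USING PigStorage(',');
```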

Step 6:  Analyze data with Hive queries.
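On the Hive side, practice joins and aggregations until they are second nature. The schema below is hypothetical, but this is the shape of query you should be able to produce quickly:

```sql
-- Hypothetical drivers/trips schema, for practice only.
SELECT d.name,
       SUM(t.miles) AS total_miles
FROM drivers d
JOIN trips t ON d.driver_id = t.driver_id
GROUP BY d.name
ORDER BY total_miles DESC
LIMIT 10;
```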

Step 7:  Become very familiar with all the admin screens inside Ambari and linked from it.

Step 8: Check logs and directory structures so you are quick at navigation and know all the tools. Be comfortable in the Hadoop Linux environment.
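A few commands worth drilling for that navigation (run on the Sandbox; the application ID is a placeholder you would copy from the ResourceManager UI):

```shell
# Browse HDFS and check space usage.
hdfs dfs -ls /user/hive
hdfs dfs -du -h /user/hive

# Pull the logs for a finished YARN job (hypothetical application ID).
yarn logs -applicationId application_1234567890123_0001

# Service logs live on the local filesystem.
ls /var/log/hadoop
```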

Step 9:  Access relational data with Sqoop.
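A typical Sqoop drill is importing one table from a relational database into HDFS. The connection string, credentials, and table name below are hypothetical; adjust them for the database on your Sandbox:

```shell
# Hypothetical MySQL database and table, for practice only.
# -P prompts for the password instead of putting it on the command line.
sqoop import \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username retail_user -P \
  --table orders \
  --target-dir /user/hive/orders \
  --num-mappers 1
```

Also practice the reverse direction with `sqoop export`, and importing straight into Hive with `--hive-import`.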

Step 10: Make sure your PC, monitor, webcam, network, and ID are all ready to go 20 minutes before the test.

Step 11:  Take the test and pass!




