
Preparing for Hadoop Certification with Hortonworks



Before you embark on getting certified with Hortonworks, you need to have some basics. You need to know Linux, SQL, and some basic scripting and programming; Java is very helpful. Knowledge of JDBC, compression, Python, and Scala wouldn't hurt.

Tools and technologies to know and work with: YARN, Tez, HDFS, Hive, Pig, Ambari, Flume, Linux, SSH, HQL, Sqoop, SQL, HCatalog, SerDe, and REST APIs.

Step 1:  Make sure you are working with the Hortonworks Sandbox in a VM.  Go through the tutorials and the documentation.
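
Once the sandbox is running, get comfortable reaching it from a terminal. A minimal sketch, assuming the default sandbox port forwarding (SSH on local port 2222) and the root login; your VM settings may differ:

    # SSH into the sandbox (host, port, and user are the usual sandbox defaults).
    ssh root@127.0.0.1 -p 2222

    # Quick sanity checks once you are in.
    hdfs dfsadmin -report | head -n 20
    hive --version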

Step 2:  Take the practice exam in an AWS environment.

Step 3:  Read and follow this guide.

Step 4:  Practice some data ingest with a few different file formats (Avro, CSV, Parquet, JSON, ORC) and with compression.
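
As a sketch of that kind of ingest, here is a CSV landed in HDFS and rewritten as a compressed ORC table; the paths, file names, and columns are hypothetical, so substitute your own data:

    # Land a raw CSV in HDFS.
    hdfs dfs -mkdir -p /user/maria_dev/raw
    hdfs dfs -put drivers.csv /user/maria_dev/raw/

    # Expose it to Hive, then rewrite it as Snappy-compressed ORC.
    hive -e "
      CREATE EXTERNAL TABLE drivers_csv (driver_id INT, name STRING)
        ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
        LOCATION '/user/maria_dev/raw';
      CREATE TABLE drivers_orc
        STORED AS ORC TBLPROPERTIES ('orc.compress'='SNAPPY')
        AS SELECT * FROM drivers_csv;
    "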

Step 5:  Transform some data with Pig scripts.
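
For example, a tiny transform over the same hypothetical CSV, saved as clean_drivers.pig (field and path names are made up for illustration):

    raw   = LOAD '/user/maria_dev/raw/drivers.csv'
            USING PigStorage(',') AS (driver_id:int, name:chararray);
    valid = FILTER raw BY driver_id IS NOT NULL;
    named = FOREACH valid GENERATE driver_id, UPPER(name) AS name;
    STORE named INTO '/user/maria_dev/clean/drivers' USING PigStorage('\t');

Run it on the Tez execution engine (local and mapreduce modes also work):

    pig -x tez clean_drivers.pig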

Step 6:  Analyze data with Hive queries.
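
For example, reusing the hypothetical drivers_orc table from Step 4 together with an equally hypothetical timesheet table:

    hive -e "
      SELECT d.name,
             COUNT(*)     AS entries,
             SUM(t.hours) AS total_hours
      FROM   timesheet t
      JOIN   drivers_orc d ON t.driver_id = d.driver_id
      GROUP  BY d.name
      ORDER  BY total_hours DESC
      LIMIT  10;
    "

The same HQL runs from Beeline or from the Hive view in Ambari, so practice in whichever interface you expect to use.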

Step 7:  Become very familiar with all the admin screens inside Ambari and linked from it.
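
Ambari also exposes the same cluster state over REST, which is worth knowing for quick checks from the shell; the host, port, credentials, and cluster name below are the usual sandbox defaults and may differ in your environment:

    # List clusters, then drill into a single service.
    curl -u admin:admin http://127.0.0.1:8080/api/v1/clusters
    curl -u admin:admin http://127.0.0.1:8080/api/v1/clusters/Sandbox/services/HIVE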

Step 8:  Check logs and directory structures so you are quick at navigation and know all the tools.   Be comfortable in the Hadoop Linux environment.
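
A few locations worth knowing cold; exact paths vary a little by HDP version, so treat these as starting points:

    ls /var/log/hadoop/hdfs/     # NameNode / DataNode logs
    ls /var/log/hive/            # HiveServer2 / Metastore logs
    ls /etc/hadoop/conf/         # core-site.xml, hdfs-site.xml, ...

    # HDFS navigation you should be able to do without thinking.
    hdfs dfs -ls /user
    hdfs dfs -du -h /apps/hive/warehouse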

Step 9:  Access relational data with Sqoop.
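
A representative import; the connection string, credentials, and table name are hypothetical, so substitute your own source database:

    # Import one table into HDFS with compression (gzip by default).
    sqoop import \
      --connect jdbc:mysql://localhost/retail_db \
      --username retail_user -P \
      --table customers \
      --target-dir /user/maria_dev/sqoop/customers \
      --compress \
      -m 1

    # sqoop list-databases / list-tables are handy for exploring the source first,
    # and --hive-import loads the result straight into a Hive table.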

Step 10:  Make sure your PC, monitor, webcam, network, and ID are all ready to go 20 minutes before the test.

Step 11:  Take the test and pass!


