
Preparing for Hadoop Certification with Hortonworks




Before you embark on getting certified with Hortonworks, you need to have some basics. You need to know Linux, Java (very helpful), SQL, and some basic scripting and programming. Knowledge of JDBC, compression, Python, and Scala wouldn't hurt.

Tools and technologies to know and work with: YARN, Tez, HDFS, Hive, Pig, Ambari, Flume, Linux, SSH, HQL, Sqoop, SQL, HCatalog, SerDe, and REST APIs.

Step 1: Make sure you are working with the Hortonworks Sandbox in a VM. Go through the tutorials and the documentation.
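A quick sanity check once the sandbox is running; this is a minimal sketch assuming the sandbox's default network settings (SSH is typically forwarded to port 2222 on localhost):

    # SSH into the sandbox VM (default HDP sandbox port mapping)
    ssh root@127.0.0.1 -p 2222

    # Once inside, verify the Hadoop client tools are on the path
    hdfs version
    hive --version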

Step 2: Take the practice exam in an AWS environment.

Step 3: Read and follow this guide.

Step 4: Practice some data ingest with a few different types of data (Avro, CSV, Parquet, JSON, ORCFile) and compression.
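A minimal ingest sketch; the file name drivers.csv and the HDFS paths are made up for illustration:

    # Create a working directory in HDFS
    hdfs dfs -mkdir -p /user/horton/data/drivers

    # Upload the plain CSV
    hdfs dfs -put drivers.csv /user/horton/data/drivers/

    # Upload a gzip-compressed copy too, to practice with compression
    gzip -c drivers.csv > drivers.csv.gz
    hdfs dfs -put drivers.csv.gz /user/horton/data/drivers/

    # Confirm both files landed
    hdfs dfs -ls /user/horton/data/drivers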

Step 5: Transform some data with Pig scripts.
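A small Pig Latin sketch in the same spirit, reusing the made-up drivers.csv layout from the ingest step (the schema is an assumption):

    -- transform.pig: load the CSV, filter rows, and store the result
    drivers = LOAD '/user/horton/data/drivers/drivers.csv'
              USING PigStorage(',')
              AS (id:int, name:chararray, city:chararray);
    ny_only = FILTER drivers BY city == 'New York';
    STORE ny_only INTO '/user/horton/data/ny_drivers' USING PigStorage(',');

Run it on Tez with: pig -x tez transform.pig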

Step 6: Analyze data with Hive queries.
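A matching Hive sketch over the same made-up directory (an external table pointed at a directory reads the gzipped copy transparently, so each row will appear twice here):

    -- Create an external table over the ingested files
    CREATE EXTERNAL TABLE drivers (
        id INT,
        name STRING,
        city STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/user/horton/data/drivers';

    -- A simple aggregate to verify the data
    SELECT city, COUNT(*) AS driver_count
    FROM drivers
    GROUP BY city;

Save it as analyze.hql and run it with: hive -f analyze.hql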

Step 7: Become very familiar with all the admin screens inside Ambari and those linked from it.
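Beyond clicking through the screens, it helps to know that Ambari exposes the same information over REST. A hedged example, assuming the sandbox defaults (Ambari on port 8080, admin/admin credentials, a cluster named Sandbox):

    # List the clusters Ambari manages
    curl -u admin:admin http://localhost:8080/api/v1/clusters

    # List the services in a cluster and their state
    curl -u admin:admin http://localhost:8080/api/v1/clusters/Sandbox/services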

Step 8: Check logs and directory structures so you can navigate quickly and know all the tools. Be comfortable in the Hadoop Linux environment.
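A few places worth knowing cold, with the caveat that exact paths vary by HDP version:

    # Component logs generally live under /var/log
    ls /var/log/hadoop/ /var/log/hive/ /var/log/ambari-server/

    # Installed HDP components are symlinked under /usr/hdp/current
    ls /usr/hdp/current/

    # Tail a log while reproducing a problem
    tail -f /var/log/hadoop/hdfs/*.log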

Step 9: Access relational data with Sqoop.
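A typical import sketch; every connection detail here (database URL, credentials, table name) is a placeholder:

    # See what tables the source database offers
    sqoop list-tables \
        --connect jdbc:mysql://localhost/retail_db \
        --username retail --password hadoop

    # Import one table into HDFS as delimited text
    sqoop import \
        --connect jdbc:mysql://localhost/retail_db \
        --username retail --password hadoop \
        --table customers \
        --target-dir /user/horton/data/customers \
        --num-mappers 1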

Step 10: Make sure your PC, monitor, webcam, network, and ID are all ready to go 20 minutes before the test.

Step 11: Take the test and pass!


