Hadoop Hive Web Interface

By Gareth Rushgrove · Jul. 25, 2012


I’ve been playing with Hive recently and liking what I’ve found. In theory at least it provides a very nice, simple way of getting into analysing large data sets. To make it even easier to show other people what you’re up to, Hive has a nascent web interface, with a little documentation on the wiki.

[Screenshot: the Hive web UI]

On the one hand it’s rather simple at this point, but that should be easy enough to prettify given a bit of time. The bigger problem was getting it working in the first place. What follows worked for me using the latest Cloudera packages on Debian testing. I’m assuming you already have Hive and Hadoop installed; the basic packages worked fine for me here.

Next up you’ll need the JDK (not just the JRE), as there is some compilation that will go on the first time you run the web interface.

apt-get install ant sun-java6-jdk
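If you want to make sure the JDK (rather than just a JRE) is the one that will actually be picked up, something like the following should confirm it. The alternative name below is what the Sun packages register on my Debian setup, so treat it as an assumption and adjust for yours.

# check that a Java compiler is available and see which runtime is the default
javac -version
update-java-alternatives -l

# if the Sun JDK isn't the default, switch to it (the set name may differ)
sudo update-java-alternatives -s java-6-sun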

Next up I had to modify the installed /etc/hive/conf/hive-site.xml file as follows:

I changed this:

<property>
  <name>hive.metastore.uris</name>
  <value>file:///var/lib/hivevar/metastore/metadb/</value>
  <description>Comma separated list of URIs of metastore servers. The first server that can be connected to will be used.</description>
</property>

To this. Note that the hivevar path doesn’t exist, so I’m not sure whether this was a typo in the source.

<property>
  <name>hive.metastore.uris</name>
  <value>file:///var/lib/hive/var/metastore/metadb/</value>
  <description>Comma separated list of URIs of metastore servers. The first server that can be connected to will be used.</description>
</property>
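If the metastore directory doesn’t already exist it may be worth creating it up front. The hive user and group below are an assumption based on how the Cloudera packages tend to set things up; use whatever actually owns /var/lib/hive on your machine.

# create the local metastore path and hand it to the user Hive runs as
sudo mkdir -p /var/lib/hive/var/metastore/metadb
sudo chown -R hive:hive /var/lib/hive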

I also changed the following section regarding the metastore name:

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/var/lib/hive/metastore/${user.name}_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

To this, with a fixed name. When using the above configuration the file was actually called ${user.name}, rather than my username being substituted in. Elsewhere this seems to work fine.

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/var/lib/hive/metastore/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

I’m not convinced the above two changes are needed but have left them here just in case. The main tricky part is making sure a load of environment variables are correctly set. The following worked for me:

# Ant is needed to compile the HWI JSP pages the first time it runs
export ANT_LIB=/usr/share/ant/lib
export HIVE_HOME=/usr/lib/hive
export HADOOP_HOME=/usr/lib/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
# JAVA_HOME must point at a JDK, not a JRE
export JAVA_HOME=/usr/lib/jvm/java-6-sun
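Rather than setting these by hand in every shell, I’d be tempted to drop them into a profile script. The file name below is just a suggestion.

# persist the variables for future shells (the file name is arbitrary)
sudo tee /etc/profile.d/hive-hwi.sh <<'EOF'
export ANT_LIB=/usr/share/ant/lib
export HIVE_HOME=/usr/lib/hive
export HADOOP_HOME=/usr/lib/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export JAVA_HOME=/usr/lib/jvm/java-6-sun
EOF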

All being well, that should allow you to run the hive command with the web interface like so:

hive --service hwi

That should bring up a web server on port 9999, where you should see something similar to the screenshot above.
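A quick way to check it’s actually answering is to hit it with curl; the /hwi path is where the web interface serves from, at least in the versions I’ve used. If you need it on a different address or port, my understanding is that the hive.hwi.listen.host and hive.hwi.listen.port properties in hive-site.xml control that.

# quick check that the web interface is responding
curl -s http://localhost:9999/hwi/ | head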


Published at DZone with permission of Gareth Rushgrove, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
