
Quick Start With Apache Livy

Learn how to get started with Apache Livy, a project in the process of being incubated by Apache that interacts with Apache Spark through a REST interface.

By Guglielmo Iozzia · Nov. 28, 2017 · Tutorial

Apache Livy is a project currently being incubated by the Apache Software Foundation. It is a service to interact with Apache Spark through a REST interface, and it enables the submission of both Spark jobs and snippets of Spark code. The following features are supported:

  • Jobs can be submitted as pre-compiled JARs or snippets of code, or via the Java/Scala client APIs.

  • Interactive Scala, Python, and R shells. 

  • Support for Spark 1.x and 2.x, and for Scala 2.10 and 2.11.

  • Doesn't require any change to Spark code.

  • Allows for long-running Spark Contexts that can be used for multiple Spark jobs by multiple clients.

  • Multiple Spark Contexts can be managed simultaneously — they run on the cluster instead of the Livy Server in order to have good fault tolerance and concurrency.

  • Possibility to share cached RDDs or DataFrames across multiple jobs and clients.

  • Secure authenticated communication.

A diagram on the official website shows what happens when submitting Spark jobs/code through the Livy REST APIs.

This article provides details on how to start a Livy server and submit PySpark code.

Prerequisites

The prerequisites to start a Livy server are the following:

  • The JAVA_HOME environment variable set to a JDK/JRE 8 installation.

  • A running Spark cluster.

Starting the Livy Server

Download the latest version (0.4.0-incubating at the time this article is written) from the official website and extract the archive content (it is a ZIP file). Then set the SPARK_HOME environment variable to the Spark location on the server (for simplicity, I am assuming here that the cluster runs on the same machine as the Livy server, but through the Livy configuration files the connection can be made to a remote Spark cluster, wherever it is). By default, Livy writes its logs to the $LIVY_HOME/logs location; you need to create this directory manually. Finally, you can start the server:

$LIVY_HOME/bin/livy-server

Verify that the server is running by connecting to its web UI, which uses port 8998 by default: http://<livy_host>:8998/ui.
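You can also verify it programmatically. The following is a minimal sketch, assuming the third-party requests library is installed and that Livy is listening on localhost:8998 (both assumptions, not something Livy requires):

import requests

LIVY_URL = "http://localhost:8998"  # assumption: Livy running locally on the default port

# GET /sessions lists the active sessions; an HTTP 200 response means the server is up.
response = requests.get(LIVY_URL + "/sessions")
response.raise_for_status()
print(response.json())  # on a fresh server, something like {"from": 0, "total": 0, "sessions": []}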

Using the REST APIs With Python

Livy offers REST APIs to start interactive sessions and submit Spark code the same way you would in a Spark shell or a PySpark shell. The examples in this post are in Python. Let's create an interactive session through a POST request first:

curl -X POST --data '{"kind": "pyspark"}' -H "Content-Type: application/json" localhost:8998/sessions

The kind attribute specifies which kind of language we want to use (pyspark is for Python). Other possible values for it are spark (for Scala) or sparkr (for R). If the request has been successful, the JSON response content contains the id of the open session:

{"id":0,"appId":null,"owner":null,"proxyUser":null,"state":"starting","kind":"pyspark","appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["stdout: ","\nstderr: "]}

You can double-check through the web UI.

You can check the status of a given session any time through the REST API:

curl localhost:8998/sessions/0 | python -m json.tool
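The same two calls (creating the session and then polling its status until it is ready) can also be made directly from Python. Here is a minimal sketch, assuming the requests library, a Livy server on localhost:8998, and that the session reaches the "idle" state once it is ready to accept statements:

import requests
import time

LIVY_URL = "http://localhost:8998"  # assumption: local Livy server, default port

# POST /sessions creates a new interactive session; json= also sets the Content-Type header.
session = requests.post(LIVY_URL + "/sessions", json={"kind": "pyspark"}).json()
session_url = f"{LIVY_URL}/sessions/{session['id']}"

# Poll GET /sessions/{id} until the session has finished starting.
while requests.get(session_url).json()["state"] == "starting":
    time.sleep(1)

print(requests.get(session_url).json()["state"])  # "idle" once the session is ready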

Let's execute a code statement:

curl localhost:8998/sessions/0/statements -X POST -H 'Content-Type: application/json' -d '{"code":"2 + 2"}'

The code attribute contains the Python code you want to execute. The response of this POST request contains the id of the statement and its execution status:

{"id":0,"code":"2 + 2","state":"waiting","output":null,"progress":0.0}

To check if a statement has been completed and get the result:

curl localhost:8998/sessions/0/statements/0

If a statement has been completed, the result of the execution is returned as part of the response (data attribute):

{"id":0,"code":"2 + 2","state":"available","output":{"status":"ok","execution_count":0,"data":{"text/plain":"4"}},"progress":1.0}

This information is available through the web UI as well.

In the same way, you can submit any PySpark code:

curl localhost:8998/sessions/0/statements -X POST -H 'Content-Type: application/json' -d '{"code":"sc.parallelize([1, 2, 3, 4, 5]).count()"}'


When you're done, you can close the session:

curl localhost:8998/sessions/0 -X DELETE 
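The equivalent call from Python, again assuming the requests library, is a one-liner:

import requests

# DELETE /sessions/{id} shuts the session down and releases its resources on the cluster.
requests.delete("http://localhost:8998/sessions/0")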

And that's it!

