

Let's Unblock: Spark Setup (Intellij)

This article starts a series, 'Let's Unblock,' covering the common blockers that programmers and developers face while building applications, in this case Spark with Scala.

By Dheeraj Gupta · DZone Core · May 02, 2021 · Tutorial


Before getting our hands dirty with code, let's set up our environment and prepare our IDE to understand the Scala language and the SBT plugin.

Prerequisites:

  1. Java (preferably JDK 8+).
  2. IntelliJ IDEA (Community or Ultimate edition).
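To confirm the first prerequisite, you can check which JDK (if any) is on your PATH from a terminal. This is just a sanity check; the guard is only there so the command doesn't abort on machines without Java:

```shell
# Sanity-check prerequisite 1: is a JDK on the PATH, and which version?
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1
else
  echo "java not found - install JDK 8 or newer first"
fi
```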

IDE Setup:

Let's configure the plugins required for the Scala and SBT environment with the following steps:

  1. Click on Configure in the bottom right of the IDE's welcome screen.
  2. Then click Plugins.
  3. Click on Browse Repositories, search for Scala, and install the Scala plugin.
  4. Now enter SBT in the search box and install the SBT plugin.

Woohoo, the IDE is now ready to understand the language you'll be communicating with it in.

The Project Setup:

Here we will learn to set up a Scala Spark project using two build tools: one with our favorite Maven, and the other with the Scala family's SBT.

Let's Start With SBT

  1. Click on Create New Project.
  2. Select Scala, then choose SBT from the options shown in the right pane.
  3. Add a name to your project.
  4. Click on the Finish button.
  5. Congrats, your project structure is created, and a new window opens showing it.
  6. Let's edit build.sbt to make our Scala project a Spark one.
  7. Add this Spark dependency to the project: libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.11" % "2.1.0" )
  8. Click on the refresh icon that appears at the top right of the editor to download the dependency.
  9. Create a Scala object (e.g., Master) by right-clicking on the scala folder in the project structure window.
  10. To see your code running, paste this code into your Master object:
Scala

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object Master {

  def main(args: Array[String]): Unit = {
    println("Hello Scala")
    val sparkConf = new SparkConf()
      .setAppName("dheeraj-spark-demo")
      .setMaster("local[1]")
    val sparkContext = new SparkContext(sparkConf)
    val data = Array(1, 2, 3, 4, 5)
    val rdd = sparkContext.parallelize(data)
    rdd.foreach(item => {
      println("Item from array " + item)
    })
    println("Thank you Spark")
  }
}
Click on the play button and you will be able to see your output in the Run window below.

Output in Run Window
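Putting steps 6 and 7 together, the full build.sbt might look like the sketch below. The name, version, and exact scalaVersion are illustrative assumptions; what matters is that the "_2.11" suffix of the spark-core artifact must agree with the Scala 2.11.x version you declare:

```scala
// Sketch of build.sbt after steps 6-7. name/version are illustrative;
// the "_2.11" artifact suffix must match the Scala major version below.
name := "spark-demo-sbt"

version := "0.1"

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.1.0"
)
```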

The Maven (Java Friends' Favorite) Way of Creating a Project:

  1. Click on Create New Project from the File menu.
  2. Click on Maven and check the Create from archetype checkbox.
  3. Select the archetype scala-archetype-simple:1.2.
  4. Woohoo, no need to create the main class here; it will already be freshly prepared for you.
  5. Let's make our Scala project a Spark one by adding the following dependencies to pom.xml:
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>${scala.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.0.0</version>
  <scope>compile</scope>
</dependency>
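Note that ${scala.version} must be defined in the POM's properties, and since the dependency above is spark-core_2.12, it needs to name a Scala 2.12.x release (the archetype may default to an older one). The exact patch version below is an illustrative assumption:

```xml
<!-- Illustrative fragment: scala.version must match the artifact suffix,
     so spark-core_2.12 needs a Scala 2.12.x release here. -->
<properties>
  <scala.version>2.12.10</scala.version>
</properties>
```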


Follow the remaining steps from the SBT section above, copying the same code snippet into the generated App.scala.

spark-demo pom.xml Screenshot

App.scala Screenshot

GitHub Links:

SBT project: https://github.com/dheerajgupta217/spark-demo-sbt.

Maven Project: https://github.com/dheerajgupta217/spark-demo-maven.


Opinions expressed by DZone contributors are their own.
