
Proper SBT Setup for Scala 2.10 and Spark Streaming


Learn how to set up SBT for Scala 2.10 and Spark Streaming, from the directory structure to the build properties.


The first step on your way to building Spark jobs in Scala is to set up your project. Below is a proper directory structure along with the SBT build files.

Directory

project
   build.properties
   assembly.sbt
   plugins.sbt

build.sbt

src/main/scala

src/main/resources

To assemble the JAR for Spark, run sbt assembly. The first time through, I like to run sbt clean assembly. This example assumes you have Scala 2.10.x on macOS or Linux with Spark 1.6 installed.

build.sbt

name := "MySparkScala"

version := "1.0"

scalaVersion := "2.10.5"

// Configure JAR used with the assembly plug-in
jarName in assembly := "MySparkScala.jar"

ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }

libraryDependencies  ++= Seq(

  "org.apache.ignite" % "ignite-spark_2.10" % "1.4.0",
  "org.apache.ignite" % "ignite-spring" % "1.4.0",
  "org.apache.spark" % "spark-core_2.10" % "1.4.1" % "provided",
  "org.apache.spark" % "spark-sql_2.10" % "1.4.1" % "provided",
  "org.scalanlp" %% "breeze-viz" % "0.11.2",
  "com.github.nscala-time" %% "nscala-time" % "2.2.0",
  "org.apache.commons" % "commons-math3" % "3.5" % "provided"
)

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

resolvers ++= Seq(
  "Sonatype Releases" at "https://oss.sonatype.org/content/repositories/releases/"
)

mergeStrategy in assembly := {
  case m if m.toLowerCase.endsWith("manifest.mf")          => MergeStrategy.discard
  case m if m.toLowerCase.matches("meta-inf.*\\.sf$")      => MergeStrategy.discard
  case "log4j.properties"                                  => MergeStrategy.discard
  case m if m.toLowerCase.startsWith("meta-inf/services/") => MergeStrategy.filterDistinctLines
  case "reference.conf"                                    => MergeStrategy.concat
  case _                                                   => MergeStrategy.first
}
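One thing to watch: although this guide targets Spark Streaming, the dependency list above only pulls in spark-core and spark-sql. If your job creates a StreamingContext, you will also need the streaming artifact. A hedged sketch of the extra line, with the artifact name, version, and "provided" scope mirroring the spark-core_2.10 entry above:

```scala
// Assumed addition for Spark Streaming support; version chosen to
// match the spark-core_2.10 dependency already in build.sbt.
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1" % "provided"
```

Marking it "provided" keeps the streaming classes out of the assembled JAR, since the Spark runtime supplies them on the cluster.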

build.properties

sbt.version = 0.13.8

assembly.sbt

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

That's how to start, and you're ready to build your first Scala-based Apache Spark application.
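With the build in place, a minimal word-count job shows the shape of a first Spark Streaming application. This is an illustrative sketch only: the object name, batch interval, and socket source are assumptions, not part of the original build.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical entry point for a first streaming job built with the
// SBT setup above; run it with spark-submit against the assembled JAR.
object FirstStreamingApp {
  def main(args: Array[String]): Unit = {
    // Local master and app name are placeholders for this sketch.
    val conf = new SparkConf().setAppName("MySparkScala").setMaster("local[2]")
    // One-second micro-batches; tune the interval for your workload.
    val ssc = new StreamingContext(conf, Seconds(1))

    // Read lines from a local socket (e.g. started with `nc -lk 9999`)
    // and count words in each batch.
    val lines = ssc.socketTextStream("localhost", 9999)
    val counts = lines.flatMap(_.split("\\s+"))
                      .map(word => (word, 1))
                      .reduceByKey(_ + _)
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Placed under src/main/scala, this compiles into the assembly JAR produced by sbt assembly, with Spark itself supplied at runtime because of the "provided" scopes.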



