
Overview of Spark and HTTP Testing With JUnit


In this article, a testing expert takes a brief look at how to set up HTTP testing in Spark using JUnit so you can test the quality of your integrations.


TL;DR: Spark is static, so starting it in an @BeforeClass method allows HTTP request testing to begin.

I use Spark as the embedded web server in my applications. I also run simple HTTP tests against this as part of my local Maven build. And I start Spark within the JUnit tests themselves. In this post, I’ll show how.

We all know that there are good reasons for not running integration tests during our TDD Red/Green/Refactor process. We also know that we can run subsets of tests during this process and avoid any integration tests. And, hopefully, we recognize that expedient, fast, automated integration verification can be useful.

What Is Spark?

Spark is a small embedded web server for Java that is easy to add to your project with a single Maven dependency.
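For reference, the Maven coordinates are com.sparkjava:spark-core; the version below is illustrative, so check Maven Central for the current release:

```xml
<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.9.4</version>
</dependency>
```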

I use it as the embedded web server for my own applications and test apps, e.g. the TestingApp and buggy games mentioned below.

Spark Is Easy to Configure Within Code

    get("/games/", (req, res) -> {
        res.redirect("/games/buggygames/index.html");
        return "";
    });

And it will look in a resource directory for the files:

    staticFileLocation("/public"); // e.g. files under src/main/resources/public

And it is easy to change the port (by default 4567):

    port(1234);

And I can do fairly complicated routings if I want to, for all the HTTP verbs:

        path(ApiEndPoint.HEARTBEAT.getPath(), () -> {

            before("", (request, response) -> {
                // guard clause: inspect new SparkApiRequest(request)
                // and reject unsupported requests
            });

            get("", (request, response) -> {
                return api.getHeartbeat(
                    new SparkApiRequest(request),
                    new SparkApiResponse(response)).getBody();
            });

            options("", (request, response) -> {
                response.header("Allow", "GET");
                return "";
            });
        });

I tend to use abstraction layers so I have:

  • Classes to handle Spark routing.
  • Application classes to handle functionality, e.g. api.
  • Domain objects to bridge between domains, e.g. SparkApiRequest represents the details of an HTTP request without having Spark bleed through into my application.
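As an illustration of that bridging idea (the class and method names here are mine, not the author's actual code), a minimal domain object copies only the fields the application needs from the framework request, so Spark's types do not bleed into the domain:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical bridge object: holds a snapshot of the request details
// the application cares about, with no reference to Spark's Request type.
class ApiRequest {
    private final String path;
    private final Map<String, String> headers;

    ApiRequest(String path, Map<String, String> headers) {
        this.path = path;
        this.headers = new HashMap<>(headers); // defensive copy
    }

    String getPath() { return path; }
    String getHeader(String name) { return headers.get(name); }
}
```

The defensive copy in the constructor means the domain object stays valid even after the framework recycles the underlying request.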

Running it for Testing

It is very easy when using Spark to simply call the main method to start the server and run HTTP requests against it.

    String[] args = {};
    Main.main(args); // illustrative: call whichever main method starts your app

Once Spark is running, because it is all statically accessed, the server stays running while our @Test methods are running.

I’m more likely to start my Spark using the specific Spark abstraction I have for my app:

    public void startServer() {
        server = new RestServer("");
    }

We just have to make sure we don’t keep trying to start it again, so I use a polling mechanism to do that.

Because this is fairly common code now, I have an abstraction called SparkStarter which I use.

This has a simple polling start mechanism:

public void startSparkAppIfNotRunning(int expectedPort){

    sparkport = expectedPort;

    try {
        if(!isRunning()) {
            startServer();
            waitForServerToRun();
        }
    }catch(IllegalStateException e){
        // Spark was already started, so just find out which port it is on
        sparkport = Spark.port();
    }catch(Exception e){
        System.out.println("Warning: could not get actual Spark port");
    }
}


And the wait is:

private void waitForServerToRun() {
    int tries = 10;
    while(tries > 0) {
        if(isRunning()){
            return;
        }
        try {
            Thread.sleep(1000);
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }
        tries--;
    }
}

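The wait idea generalizes beyond Spark. As a framework-free sketch (the names here are mine, not the SparkStarter API), it is just a bounded loop around a boolean check:

```java
import java.util.function.BooleanSupplier;

class PollingWaiter {
    // Retry `check` up to `maxTries` times, sleeping `millis` between
    // attempts; returns true as soon as the check passes, false otherwise.
    static boolean waitUntil(BooleanSupplier check, int maxTries, long millis) {
        for (int tries = maxTries; tries > 0; tries--) {
            if (check.getAsBoolean()) {
                return true;
            }
            try {
                Thread.sleep(millis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return false;
    }
}
```

Passing the check in as a BooleanSupplier means the same waiter works for any "is it up yet?" condition, not just an HTTP heartbeat.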
These methods are in an abstract class so I create a specific ‘starter’ for my application that knows how to:

  • Check if it is running.
  • Start the server.
    public boolean isRunning(){
        try{
            HttpURLConnection con = (HttpURLConnection)
                        new URL("http", host, sparkport, heartBeatPath).
                            openConnection();
            return con.getResponseCode() == 200;
        }catch(Exception e){
            return false;
        }
    }

    public void startServer() {
        server = CompendiumDevAppsForSpark.runLocally(expectedPort);
    }
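The same heartbeat check can be written self-contained using only the JDK; here host, port, and path are parameters rather than the starter's fields, and the timeouts are my own addition so a dead server fails fast:

```java
import java.net.HttpURLConnection;
import java.net.URL;

class HeartbeatCheck {
    // Returns true if GET http://host:port/path answers 200 OK.
    static boolean isRunning(String host, int port, String path) {
        try {
            HttpURLConnection con = (HttpURLConnection)
                    new URL("http", host, port, path).openConnection();
            con.setConnectTimeout(1000); // fail fast if nothing is listening
            con.setReadTimeout(1000);
            return con.getResponseCode() == 200;
        } catch (Exception e) {
            // connection refused, timeout, etc. all mean "not running"
            return false;
        }
    }
}
```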

You can see an example of this in the GitHub repo: CompendiumAppsAndGamesSparkStarter.java.

And in the JUnit Code

    @BeforeClass
    public static void ensureAppIsRunning(){
        // fluent usage is illustrative; see the repo class for the real code
        CompendiumAppsAndGamesSparkStarter.
                get("localhost", "/heartbeat").
                startSparkAppIfNotRunning(4567);
    }

e.g. PageRoutingsExistForAppsTest.java

You can find examples of this throughout my TestingApp.

Because it is static, this will stay running across all my tests.

Pretty simple, and I find it very useful for the simple projects that I am working on.

Bonus Video

“Spark Java Embedded WebServer And Testing Overview”


Spark is a simple embedded Java WebServer. I can also spin it up during JUnit tests to make my testing easy.

In this video I show:

  • An overview of Spark Java Embedded Web Server.
  • How to use it during JUnit execution.
  • Abstraction code separating Spark from my Application.


Published at DZone with permission of
