Overview of Spark and HTTP Testing With JUnit

In this article, a testing expert takes a brief look at how to set up HTTP testing in Spark using JUnit so you can test the quality of your integrations.

By Alan Richardson · Apr. 30, 18 · Tutorial

TLDR: Spark is static, so starting it in an @BeforeClass method lets HTTP request testing begin.

I use Spark as the embedded web server in my applications. I also run simple HTTP tests against this as part of my local Maven build. And I start Spark within the JUnit tests themselves. In this post, I’ll show how.

We all know that there are good reasons for not running integration tests during our TDD Red/Green/Refactor process. We also know that we can run subsets of tests during this process and avoid any integration tests. And, hopefully, we recognize that expedient, fast, automated integration verification can be useful.

What Is Spark?

  • http://sparkjava.com/

Spark is a small embedded web server for Java that is easy to add to your project with a single Maven dependency.
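For reference, the dependency looks something like the following (the version shown is an assumption from around the time of writing; check sparkjava.com for the current one):

```xml
<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.7.2</version>
</dependency>
```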

I use it for my:

  • Multi-user text adventure game compendiumdev.co.uk/page/restmud
  • Compendium of Testing Games, Apps, and APIs github.com/eviltester/TestingApp

Spark Is Easy to Configure Within Code

get("/games/", (req, res) -> {
    res.redirect("/games/buggygames/index.html");
    return "";
});

And it will look in a resource directory for the files:

staticFileLocation("/web");

And it is easy to change the port (4567 by default):

Spark.port(1234);

And I can do fairly complicated routings if I want to for all the HTTP verbs.

get(ApiEndPoint.HEARTBEAT.getPath(),
    (request, response) -> {
        return api.getHeartbeat(
            new SparkApiRequest(request),
            new SparkApiResponse(response)).getBody();
    });

options(ApiEndPoint.HEARTBEAT.getPath(),
    (request, response) -> {
        response.header("Allow", "GET");
        response.status(200);
        return "";
    });

path(ApiEndPoint.HEARTBEAT.getPath(), () -> {
    before("", (request, response) -> {
        if (!api.isMethodAllowed(ApiEndPoint.HEARTBEAT.getPath(),
                                 new SparkApiRequest(request))) {
            halt(405);
        }
    });
});

I tend to use abstraction layers so I have:

  • Classes to handle Spark routing.
  • Application Classes to handle functionality, e.g. api
  • Domain objects to bridge between domains, e.g. SparkApiRequest represents the details of an HTTP request without having Spark bleed through into my application.

Running It for Testing

When using Spark, it is very easy to simply call the main method to start the server and then run HTTP requests against it.

String [] args = {};
Main.main(args);

Once Spark is running, because it is all statically accessed, the server stays running while our @Test methods run.

I’m more likely to start my Spark using the specific Spark abstraction I have for my app:

    public void startServer() {
        server = new RestServer("");
    }

We just have to make sure we don't try to start it again when it is already running, so I use a polling mechanism for that.

Because this is fairly common code now, I have an abstraction called SparkStarter which I use.

  • SparkStarter.java

This has a simple polling start mechanism:

public void startSparkAppIfNotRunning(int expectedPort){

    sparkport = expectedPort;

    try {
        if(!isRunning()) {

            startServer();

        }
    }catch(IllegalStateException e){
        e.printStackTrace();
    }

    try{
        sparkport = Spark.port();
    }catch(Exception e){
        System.out.println("Warning: could not get actual Spark port");
    }

    waitForServerToRun();
}

And the wait is:

private void waitForServerToRun() {
    int tries = 10;
    while (tries > 0) {
        if (!isRunning()) {
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e1) {
                e1.printStackTrace();
            }
        } else {
            return;
        }
        tries--;
    }
}

These methods are in an abstract class so I create a specific ‘starter’ for my application that knows how to:

  • Check if it is running.
  • Start the server.

    public boolean isRunning(){

        try{
            HttpURLConnection con = (HttpURLConnection)
                        new URL("http",host, sparkport, heartBeatPath).
                                openConnection();
            return con.getResponseCode()==200;
        }catch(Exception e){
            return false;
        }

    }

    @Override
    public void startServer() {
        server = CompendiumDevAppsForSpark.runLocally(expectedPort);
    }

You can see an example of this in my GitHub repo: CompendiumAppsAndGamesSparkStarter.java.

And in the JUnit Code

    @BeforeClass
    public static void ensureAppIsRunning(){
        CompendiumAppsAndGamesSparkStarter.
                    get("localhost", "/heartbeat" ).
                    startSparkAppIfNotRunning(4567);
    }

e.g. PageRoutingsExistForAppsTest.java
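The @Test methods themselves can then be simple HTTP checks against the running server. As a sketch of the core of such a check (plain Java here; in a real JUnit test the returned status code would go into an assertEquals — HttpStatusCheck and getStatus are illustrative names, not part of my codebase):

```java
import java.net.HttpURLConnection;
import java.net.URL;

// Minimal status-code check: the core of an HTTP smoke test.
public class HttpStatusCheck {

    // Returns the HTTP status code for a GET on the given URL,
    // or -1 if the server could not be reached.
    public static int getStatus(String urlString) {
        try {
            HttpURLConnection con =
                (HttpURLConnection) new URL(urlString).openConnection();
            con.setRequestMethod("GET");
            return con.getResponseCode();
        } catch (Exception e) {
            return -1; // treat any connection failure as "not running"
        }
    }
}
```

A test method would then just assert that, say, `getStatus("http://localhost:4567/heartbeat")` returns 200.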

You can find examples of this throughout my TestingApp.

Because it is static, this will stay running across all my tests.

Pretty simple, and I find it very useful for the small projects that I am working on.

Bonus Video

“Spark Java Embedded WebServer And Testing Overview”

https://youtu.be/7b0SnEznYnk

Spark is a simple embedded Java WebServer. I can also spin it up during JUnit tests to make my testing easy.

In this video I show:

  • An overview of Spark Java Embedded Web Server.
  • How to use it during JUnit execution.
  • Abstraction code separating Spark from my Application.


Published at DZone with permission of Alan Richardson, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
