How to Test HTTP Clients Using the Spark Micro-Framework

Spinning up heavyweight web or application servers adds complexity and slows tests down, but using the Spark micro-framework can make things easier and speed things up.

By Scott Leberknight · Dec. 09, 16 · Tutorial

Testing HTTP client code can be a hassle. Your tests might need to run against a live HTTP server, or you somehow need to figure out how to send mock requests (which is generally not easy in most libraries that I have used). The tests should also be fast, meaning that you need a lightweight server that starts and stops quickly. Spinning up heavyweight web or application servers or relying on a specialized test server is generally an error-prone task that adds complexity and slows tests down.

In projects that I'm working on lately, we are using Dropwizard, which provides first-class support for testing JAX-RS resources and clients via JUnit rules. For example, it provides DropwizardClientRule, a JUnit rule that lets you implement JAX-RS resources as test doubles and that starts and stops a simple Dropwizard application containing those resources. This works great if you are already using Dropwizard, but if not, then a great alternative is Spark. Even if you are using Dropwizard, Spark can still work well as a test HTTP server.
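
Its usage looks roughly like this (a minimal sketch, not taken from this article; the PingResource stub and the assertion are illustrative assumptions):

// Sketch of DropwizardClientRule usage; assumes the dropwizard-testing module.
// PingResource is a hypothetical JAX-RS test double, not part of any library.
@Path("/ping")
public static class PingResource {
  @GET
  public String ping() {
    return "pong";
  }
}

@ClassRule
public static final DropwizardClientRule DROPWIZARD = new DropwizardClientRule(new PingResource());

@Test
public void testPing() {
  Client client = ClientBuilder.newClient();
  String body = client.target(DROPWIZARD.baseUri()).path("ping").request().get(String.class);
  assertThat(body).isEqualTo("pong");
}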

Spark is self-described as a "micro-framework for creating web applications in Java 8 with minimal effort." You can create the stereotypical "Hello World" in Spark like this (shamelessly copied from Spark's website):

import static spark.Spark.get; 

public class HelloWorld { 
  public static void main(String[] args) { 
    get("/hello", (req, res) -> "Hello World"); 
  } 
} 

You can run this code and visit http://localhost:4567/hello in a browser, or use a client tool like curl or httpie. Spark is a perfect fit for creating HTTP servers in tests (whether you call them unit tests, integration tests, or something else is up to you; I will just call them tests here). I have created a very simple library, sparkjava-testing, that contains a JUnit rule for spinning up a Spark server for functional testing of HTTP clients.

This library consists of one JUnit rule, the SparkServerRule. You can annotate this rule with @ClassRule or just @Rule. Using @ClassRule will start a Spark server one time before any test is run. Then your tests run, making requests to the HTTP server, and finally, once all tests have finished, the server is shut down. If you need true isolation between every single test, annotate the rule with @Rule and a test Spark server will be started before each test and shut down after each test, meaning that each test runs against a fresh server. (The SparkServerRule is a JUnit 4 rule mainly because JUnit 5 is still in milestone releases, and because I have not actually used JUnit 5.)
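
For example, the per-test variant is just an instance field annotated with @Rule (a minimal sketch; the route is illustrative):

// A fresh Spark server is started before each test and stopped after it.
@Rule
public final SparkServerRule sparkServer = new SparkServerRule(() -> {
  get("/ping", (request, response) -> "pong");
});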

To declare a class rule with a test Spark server with two endpoints, you can do this:

@ClassRule 
public static final SparkServerRule SPARK_SERVER = new SparkServerRule(() -> { 
  get("/ping", (request, response) -> "pong"); 
  get("/healthcheck", (request, response) -> "healthy"); 
}); 

The SparkServerRule constructor takes a Runnable that defines the routes the server should respond to. In this example there are two HTTP GET routes, /ping and /healthcheck. You can of course implement the other HTTP verbs such as POST and PUT (a sketch follows the test below). You can then write tests using whatever client library you want. Here is an example test using a JAX-RS client:

@Test 
public void testSparkServerRule_HealthcheckRequest() { 
  client = ClientBuilder.newBuilder().build(); 
  Response response = client.target(URI.create("http://localhost:4567/healthcheck")) 
    .request() 
    .get(); 
  assertThat(response.getStatus()).isEqualTo(200); 
  assertThat(response.readEntity(String.class)).isEqualTo("healthy"); 
} 
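
As mentioned above, routes for the other verbs are declared the same way. For example, a hypothetical POST route that echoes its request body (a sketch, not from this article) might look like this:

// Requires: import static spark.Spark.post;
// Hypothetical route that echoes the request body back with a 201 status.
post("/echo", (request, response) -> {
  response.status(201);
  return request.body();
});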

In the above test, client is a JAX-RS Client instance (it is an instance variable which is closed after each test). I'm using AssertJ assertions in this test. The main thing to note is that your client code should be parameterizable so that the local Spark server URI can be injected in place of the actual production URI. When using the JAX-RS client, as in this example, you need to be able to supply the test server URI to the Client#target method. Spark runs on port 4567 by default, so the client in the test uses that port.
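
To make that concrete, a client under test might take its base URI as a constructor argument (a hypothetical sketch; the HealthClient name and API are my own assumptions, not part of sparkjava-testing):

// Hypothetical client whose base URI is injected; tests pass the local Spark
// server's URI, while production wiring supplies the real service URI.
public class HealthClient {

  private final Client client;
  private final URI baseUri;

  public HealthClient(Client client, URI baseUri) {
    this.client = client;
    this.baseUri = baseUri;
  }

  public boolean isHealthy() {
    Response response = client.target(baseUri).path("healthcheck").request().get();
    return response.getStatus() == 200;
  }
}

A test would then construct it with URI.create("http://localhost:4567"), while production code supplies the real URI.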

The SparkServerRule has two other constructors: one that accepts a port in addition to the routes and another that takes a SparkInitializer. To start the test server on a different port, you can do this:

@ClassRule 
public static final SparkServerRule SPARK_SERVER = new SparkServerRule(6543, () -> { 
  get("/ping", (request, response) -> "pong"); 
  get("/healthcheck", (request, response) -> "healthy"); 
}); 

You can use the constructor that takes a SparkInitializer to customize the Spark server. For example, in addition to changing the port, you can also set the IP address and make the server secure. The SparkInitializer is an @FunctionalInterface with one method init(), so you can use a lambda expression. For example:

@ClassRule
public static final SparkServerRule SPARK_SERVER = new SparkServerRule(
  () -> {
    Spark.ipAddress("127.0.0.1");
    Spark.port(9876);
    URL resource = Resources.getResource("sample-keystore.jks");
    String file = resource.getFile();
    Spark.secure(file, "password", null, null);
  },
  () -> {
    get("/ping", (request, response) -> "pong");
    get("/healthcheck", (request, response) -> "healthy");
  });

The first argument is the initializer. It sets the IP address and port, loads a sample keystore, and calls the Spark#secure method so that the test server accepts HTTPS connections. If you run tests in parallel, you will probably want to customize the port so that parallel tests do not encounter port conflicts.
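
One common way to avoid such conflicts (a sketch of a general technique, not something provided by sparkjava-testing) is to ask the operating system for a free ephemeral port before constructing the rule:

// Ask the OS for a free ephemeral port (java.net.ServerSocket). There is a small
// race between closing the socket and Spark binding the port, but it is usually
// good enough for test suites.
private static int findFreePort() {
  try (ServerSocket socket = new ServerSocket(0)) {
    return socket.getLocalPort();
  } catch (IOException e) {
    throw new UncheckedIOException(e);
  }
}

private static final int PORT = findFreePort();

@ClassRule
public static final SparkServerRule SPARK_SERVER = new SparkServerRule(PORT, () -> {
  get("/ping", (request, response) -> "pong");
});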

The last thing to note is that SparkServerRule resets the port, IP address, and secure settings to the default values (4567, 0.0.0.0, and non-secure, respectively) when it shuts down the Spark server. If you use the SparkInitializer to customize other settings (for example the server thread pool, static file location, before and after filters, etc.), those will not be reset, as they are not currently supported by SparkServerRule.

Finally, resetting to non-secure mode required an incredibly awful hack, because I found no easy way to reset security. You cannot just pass a bunch of null values to the Spark#secure method, as it will throw an exception, and there is no unsecure method (probably because the server was not intended to set and reset things a bunch of times like we want to do in test scenarios). If you're interested, go look at the code for the SparkServerRule in the sparkjava-testing repository, but prepare thyself and get some cleaning supplies ready to wash away the dirty feeling you're sure to have after seeing it.

The ability to quickly and easily set up test HTTP servers with SparkServerRule, along with the ability to customize the port and IP address and to run securely, has worked very well for my testing needs thus far. Note that, unlike the toy examples above, you can implement more complicated logic in the routes, for example returning a 200 or a 404 for a GET request depending on a path parameter or request parameter value, as sketched below. At the same time, don't implement extremely complex logic, either.
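
Here is what such a conditional route might look like (an illustrative sketch; the /users/:id route is not from this article):

// Hypothetical route: return a body for a known id, and a 404 for anything else.
get("/users/:id", (request, response) -> {
  if ("42".equals(request.params(":id"))) {
    return "{\"id\": 42, \"name\": \"Alice\"}";
  }
  response.status(404);
  return "";
});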

Most times, I simply create separate routes when I need the test server to behave differently (for example, to test various error conditions). Or, I might even implement separate JUnit test classes for different server endpoints, so that each test focuses on only one endpoint and its various success and failure conditions. As is often the case, the context will determine the best way to implement your tests.


Published at DZone with permission of Scott Leberknight, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
