Architecting a Testable Web Service in Spark Framework


When building a web service with the Spark Java framework, keep the following approaches in mind to make the app more testable.


Background: Spark and REST Web App Testing

I’m writing a REST web app to help me manage my online training courses. I’m building it iteratively, using TDD, and trying to architect it to be easily testable at multiple layers of the architectural stack.

Previously, with RestMud, I had to pull the app apart to make it more testable after the fact, and I’m trying to avoid that now. I’m using the Spark Java Framework to build the app because it is very simple and lightweight, and I can package the whole application into a standalone jar that runs anywhere a JVM exists, with minimal installation requirements on the user, which means I can also use this for training.

TDD is pretty simple when you only have domain objects, as they are isolated and easy to build and test. With a web app, we face other complexities:

  • It needs to be running to accept HTTP requests.
  • It often needs to be deployed to a web/app server.

Spark has an embedded Jetty instance so it can start up as its own HTTP/app server, which is quite jolly. But that generally implies that I have to deploy it and run it prior to testing the REST API.

If you look at the examples on the Spark website, they use a modern Java style with lambdas, which makes it a little more difficult to unit test the code in the lambdas.
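For example, a route written in that inline style (a representative sketch in the spirit of the Spark docs, not code from my app) keeps all the logic inside the lambda, so the only way to exercise it is to start the server and make a real HTTP request:

import static spark.Spark.get;

public class InlineExample {
    public static void main(String[] args) {
        // All the logic lives in the lambda: there is no separate
        // object to instantiate and unit test in isolation.
        get("/hello", (request, response) -> {
            String name = request.queryParams("name");
            return "Hello " + (name == null ? "World" : name);
        });
    }
}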

Making It a Little More Testable

To make it a little more testable, in the lambda I can delegate off to a POJO:

get("/courses", (request, response) -> {    
  return coursesApi.getCourses(request,response);
});

This was the approach I took in RestMud, and it means, in theory, that the only layer I haven’t unit tested is a much smaller one (the routing).

But the request and response objects are from the Spark framework, and they are instantiated with an HttpServletRequest and HttpServletResponse. Therefore, if I pass the Spark objects through to my API, I create a much harder situation for my API unit testing: I probably have to mock the HttpServletRequest and HttpServletResponse to instantiate a Spark Request and Response, and I tightly couple my API processing to the Spark framework.

I prefer, where possible, to avoid mocking, and I really want simpler objects to represent the Request and Response.

Simpler Interfaces

I’m creating an interface that my API requires - this will probably end up with many methods similar to those on the Spark Request and Response, but it won’t have the complexity of dealing with the Servlet classes and won’t require error handling that is as robust (since that’s being done by Spark).

get("/courses", (request, response) -> {    
  return coursesApi.getCourses(                         
    new SparkApiRequest(request),                         
    new SparkApiResponse(response));
});

I’ve introduced a SparkApiRequest, which implements my simpler ApiRequest interface and knows how to bridge the gap between Spark and my API.
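As a rough illustration of the shape this takes (a sketch - the exact method set is still evolving, and getHeader here is an assumption):

public interface ApiRequest {
    String getBody();
    void setBody(String body);
    String getHeader(String name); // assumed method, for illustration
}

// Bridges the Spark Request to the simpler interface my API codes against.
public class SparkApiRequest implements ApiRequest {

    private final spark.Request request;

    public SparkApiRequest(spark.Request request) {
        this.request = request;
    }

    @Override
    public String getBody() {
        return request.body();
    }

    @Override
    public void setBody(String body) {
        // Real incoming requests are read-only; setBody exists for test doubles.
        throw new UnsupportedOperationException("cannot set the body of a real request");
    }

    @Override
    public String getHeader(String name) {
        return request.headers(name);
    }
}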

I’m coding my API to use ApiRequest, and have therefore created a TestApiRequest object, which implements ApiRequest, to use in my API unit @Test methods. This is ugly at the moment: it is a first-draft @Test method, and I haven’t refactored it to create the various methods that will help me make my test code more literate and readable.

@Test
public void canCreateCoursesViaApiWithACourseList() {
    Gson gson = new Gson();

    CoursesApi api = new CoursesApi();

    CourseList courses = new CourseList();
    Course course = new CourseBuilder("title", "author").build();
    courses.addCourse(course);

    ApiRequest apiRequest = new TestApiRequest();
    ApiResponse apiResponse = new TestApiResponse();

    String sentRequest = gson.toJson(courses);

    apiRequest.setBody(sentRequest);

    System.out.println(sentRequest);

    Assert.assertEquals("", api.setCourses(apiRequest, apiResponse));
    Assert.assertEquals(201, apiResponse.getStatus());

    Assert.assertEquals(1, api.courses.courseCount());
}

In the above, I create the domain objects, use Gson to serialize them into a payload, create the TestApiRequest and TestApiResponse, and pass those into my API. This has the advantage that the API is instantiated as required for testing - Spark is static, so it is a little harder to control for unit testing.
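The test double itself can be trivially simple - something along these lines (a sketch; the real class will grow as the tests do):

// A field-backed ApiRequest for unit tests: no Spark, no Servlet classes.
public class TestApiRequest implements ApiRequest {

    private String body = "";

    @Override
    public String getBody() {
        return body;
    }

    @Override
    public void setBody(String body) {
        this.body = body;
    }

    @Override
    public String getHeader(String name) {
        return null; // none of the tests need headers yet
    }
}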

I also have direct access to the running application objects, so I can check the application state in the unit test - something I can’t do with an HTTP test, where I would have to make a second request to get the list of courses. This allows me to build up a set of @Test methods that can drive the API without requiring a server instantiation.
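For context, the method under test might take a shape like this (a rough sketch of the idea, not the final implementation - getCourses() on CourseList and setStatus() on ApiResponse are assumed names):

import com.google.gson.Gson;

public class CoursesApi {

    // Public application state, so @Test methods can inspect it directly.
    public CourseList courses = new CourseList();

    public String setCourses(ApiRequest request, ApiResponse response) {
        // Deserialize the JSON payload into the domain object.
        CourseList sent = new Gson().fromJson(request.getBody(), CourseList.class);

        for (Course course : sent.getCourses()) { // assumed accessor
            courses.addCourse(course);
        }

        response.setStatus(201); // created
        return "";               // 201 with an empty body, as the @Test asserts
    }
}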

But this leaves the routing and HTTP request handling as a gap in my testing.

Routing and HTTP Request Handling Testing

With RestMud, I take a similar approach, but I’m working a level down: the API calls the Game, and I test at the Game level. Here, I haven’t introduced a Course Management App level; I’m working at the API level. I might refactor this out later.

With RestMud, I test at the API with a separate set of test data, which is generated by walkthrough unit tests at the game level (read about that here).

I wanted to take a simpler approach with this app, and since Spark has a built-in Jetty server, it is possible for me to add HTTP tests into the build.

For some of you decrying “That’s not a Unit Test,” that’s fine; I have a class called IntegrationTest, which at some point will become a package filled with these things.

To avoid deploying, I create @BeforeClass and @AfterClass methods which start and stop the Spark Jetty server:

@BeforeClass
public static void createServer() {
    RestServer server = new RestServer();
    host = "localhost:" + Spark.port();
    http = new HttpMessageSender("http://" + host);
}

@AfterClass
public static void killServer() {
    Spark.stop();
}

I pushed all my server code into a RestServer object rather than have it all reside in main, but I could just as easily have used:

String[] args = {};
Main.main(args);
// RestServer server = new RestServer();

Because Spark is statically created and managed, as soon as I define a routing, Spark starts up, creates a server, and runs my API.
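So a RestServer needs very little code to get everything going - something like this (a sketch reusing the /courses delegation from earlier; the real class defines more routes):

import static spark.Spark.get;

public class RestServer {

    public RestServer() {
        CoursesApi coursesApi = new CoursesApi();

        // Defining the first route is enough to make the static Spark
        // instance start its embedded Jetty server.
        get("/courses", (request, response) ->
                coursesApi.getCourses(
                        new SparkApiRequest(request),
                        new SparkApiResponse(response)));
    }
}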

Then it is a simple matter to write @Test methods that use HTTP:

@Test
public void serverIsRunning() {
    HttpResponse response = http.get(ApiEndPoints.HEARTBEAT);
    Assert.assertEquals(204, response.statusCode);
    Assert.assertEquals("", response.body);
}

I have an HttpMessageSender abstraction, which in turn uses an HttpRequestSender.

  • HttpMessageSender is a more ‘logical’ level that builds up a set of headers and has information about base URLs, etc.
  • HttpRequestSender is the physical level that actually makes the call (sketched below).
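In sketch form, the split looks something like this (the class shapes are my assumptions based on the description above):

import java.util.HashMap;
import java.util.Map;

// Logical level: knows the base URL and accumulates headers for each message.
public class HttpMessageSender {

    private final String baseUrl;
    private final Map<String, String> headers = new HashMap<>();
    private final HttpRequestSender sender = new HttpRequestSender();

    public HttpMessageSender(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    public void setHeader(String name, String value) {
        headers.put(name, value);
    }

    public HttpResponse get(String endpoint) {
        // Delegate the physical send to the lower level.
        return sender.get(baseUrl + endpoint, headers);
    }
}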

In my book Automating and Testing a REST API, I have a similar HTTP abstraction and it uses REST Assured as the HTTP implementation library.

For my JUnit-run integration @Test methods, I decided to drop down to a simpler library and avoid extra dependencies, so I’m experimenting with java.net.HttpURLConnection.
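A physical-level GET with HttpURLConnection needs nothing beyond the JDK. A minimal sketch (the HttpResponse DTO mirrors the fields the @Test above reads, which is an assumption about its real shape):

import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Map;
import java.util.Scanner;

public class HttpRequestSender {

    public HttpResponse get(String url, Map<String, String> headers) {
        try {
            HttpURLConnection connection =
                    (HttpURLConnection) new URL(url).openConnection();
            connection.setRequestMethod("GET");
            headers.forEach(connection::setRequestProperty);

            HttpResponse response = new HttpResponse();
            response.statusCode = connection.getResponseCode();

            // A 204 has no body; Scanner on an empty stream yields "".
            InputStream in = connection.getInputStream();
            try (Scanner scanner = new Scanner(in).useDelimiter("\\A")) {
                response.body = scanner.hasNext() ? scanner.next() : "";
            }
            return response;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}

// Simple DTO with the fields the @Test methods read (assumed shape).
class HttpResponse {
    public int statusCode;
    public String body;
}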

How Is This Working Out?

Early days, but thus far it allows me to TDD the API functionality with @Test methods that create payloads and set headers, which I can pass into the API level. I can also TDD the HTTP calls, and this helps me mitigate HTTP routing errors and errors in my transformation of Spark Requests and Responses to API Requests and Responses. This is also a lot faster than having a build process that runs the unit tests and then packages and deploys, starts up the app, runs the integration tests, and closes the app down. It also means that (much though I love using Postman) I’m not having to manually interact with the API as I build it - I can make the actual HTTP calls as I develop.

This does not mean that I will not manually interact with the application to test it, or that I will not automate a separate set of HTTP API executions. I will… but not yet.

At some point, I’ll also release the source for all of this to GitHub.
