
Spring Data and Scala - Can They Mix?


A while back I set out to build my first full-blown Scala server. I had a few decisions to make:
  1. Persistence Store
  2. Persistence Framework
  3. Application Server
I'd had my eye on Mongo for a long time - so that was an easy choice.

Using NoSQL basically took any JPA-related framework off the table (which I consider to be a good thing). Loving Spring as I do, I chose Spring Data as the persistence framework. It seemed to have decent support for Mongo.

For the application server I decided to rely on past successes and went with Jetty 9.

Setting up Mongo was rather easy. As I was using Scala, I wanted to use case classes as my model. The goal was to use the same objects both in my REST layer, via the Jackson Scala Module, and as the document type written to my persistence store.
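As a concrete sketch (the model names here are hypothetical, not the project's actual ones), a single case class can back both layers, with optional fields expressed as Scala's Option:

```scala
// Hypothetical shared model: the same type serves as the REST payload and
// the persisted Mongo document. An absent JSON field maps to None.
case class User(
  id: String,
  name: String,
  nickname: Option[String]
)

val anonymous = User("42", "jane", None)
```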

Spring-data looked promising at first. I was able to send Json in, convert to my object model transparently using the following xml snippet:

            <bean class="org.springframework.http.converter.json.MappingJackson2HttpMessageConverter">
                <property name="objectMapper" ref="mapper"/>
            </bean>


The mapper was an ObjectMapper with the jackson-module-scala module registered. So far so good.
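A sketch of building such a mapper (assuming jackson-module-scala on the classpath; the User class here is hypothetical):

```scala
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical payload type.
case class User(name: String, nickname: Option[String])

// The "mapper" bean referenced by the XML snippet: registering
// DefaultScalaModule teaches Jackson about case classes and Option.
val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)

// A None field serializes as null, e.g. {"name":"jane","nickname":null}
val json = mapper.writeValueAsString(User("jane", None))
```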

I did notice one odd thing. Spring didn't know how to serialize my case classes out-of-the-box (obviously), so it fell back to a generic class serializer when writing my objects into Mongo. This worked nicely enough for reading state back into an object - until I hit a showstopper.

Apparently, case classes that contained a Scala 'Option' were not deserialized properly. For example, consider the following code:

myOptionField match {
      case Some(value) => // do something with value
      case None => // do something else
}

Turns out, it never reached the 'case None' branch. When reading my state back from Mongo, the field - although serialized as None - caused the above to break. The only workaround was to do something like this:

myOptionField match {
      case Some(value) => // do something with value
      case _ => // do something else
}

This, of course, was less than optimal.
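My best guess at the cause (an assumption on my part - I never traced spring-data's internals): a framework that rehydrates objects reflectively can hand back a second instance of the None singleton. Matching `case None` compares against the real singleton and fails, while the wildcard still matches. The effect can be reproduced with plain reflection:

```scala
// Create a rogue None instance via reflection (simulating what a reflective
// deserializer might do). scala.None$ is the singleton's underlying class.
val ctor = Class.forName("scala.None$").getDeclaredConstructors()(0)
ctor.setAccessible(true)
val rogueNone = ctor.newInstance().asInstanceOf[Option[String]]

val matched = rogueNone match {
  case Some(v) => s"some: $v"
  case None    => "none"     // never taken: rogueNone is not THE None singleton
  case _       => "fallback" // the wildcard workaround still catches it
}
// matched == "fallback"
```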

If that wasn't enough, in some cases where we had two successive Scala Option fields, we got weird JSON serialization errors:

com.fasterxml.jackson.databind.JsonMappingException: (None,None) (of class scala.Tuple2)

Hmmm. I didn't feel comfortable with the current state...

As my project was young, I was lucky enough to be able to refactor. An isolated db-layer made it fairly easy to revisit my persistence framework decision.

Reading some more, I found that Mongo has an officially supported Scala driver, Casbah (how on earth did I miss that?). Nice. But wait. Starting to implement the change from Spring Data to the Casbah driver quickly revealed that I couldn't pass my case classes to it. It only accepts MongoDB's DBObject instances. Oh man. Do I really have to write readers and writers for all my 20+ case classes?
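For a taste of what that would have meant, here is roughly what one hand-rolled converter pair looks like with Casbah's MongoDBObject builder (a sketch with a hypothetical User class - now multiply by 20+):

```scala
import com.mongodb.DBObject
import com.mongodb.casbah.commons.MongoDBObject

// Hypothetical model class.
case class User(name: String, nickname: Option[String])

// Hand-written writer: case class -> DBObject.
def toDBObject(u: User): DBObject = {
  val builder = MongoDBObject.newBuilder
  builder += "name" -> u.name
  u.nickname.foreach(n => builder += "nickname" -> n)
  builder.result()
}

// Hand-written reader: DBObject -> case class.
def fromDBObject(dbo: DBObject): User =
  User(
    dbo.get("name").asInstanceOf[String],
    Option(dbo.get("nickname")).map(_.asInstanceOf[String])
  )
```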

Well, NO.

Salat to the rescue. Finally, a neat solution. I can easily convert my case classes to and from Mongo using concise, easy-to-read code:

  import com.novus.salat._
  import com.novus.salat.global._
  import com.mongodb.casbah.Imports._

  val mongoConnection = MongoConnection(host, port)
  val myDb = mongoConnection("mydb")
  myDb.writeConcern = WriteConcern.FsyncSafe
  val myColl = myDb("my-coll-db")

  // insert a new document: grater converts the case class to a DBObject
  myColl += grater[MyCaseClass].asDBObject(myInstance)

  // ... and converts it back when reading
  val readBack = myColl.findOne().map(grater[MyCaseClass].asObject(_))
That's it. Easy to use. Easy to read. Easy to write.

To summarize, Spring Data (great as it is) isn't the best fit for Scala. Luckily, we have good alternatives.

