The other day I was lamenting that every time I make a tiny change to the case classes I use for reading and writing Ajax requests from my web application's JavaScript client code, I have to go and manually update the JSON combinators that convert between the Scala case classes and their JSON representations.

There are some undocumented Scala 2.10 macros (Play Framework's own website makes no mention of them, not even on the page for JSON combinators!) that allow this conversion code to be auto-generated. I wish I had coined the term myself, but someone else appropriately refers to this activity as JSON inception.

The basic idea behind Play’s JSON combinators is that they let you use a natural, fluid syntax to convert to/from JSON. For example, you might have the following implicit writes that lets you “jsonify” a zombie sighting case class:

import play.api.libs.json._
import play.api.libs.functional.syntax._

// The Writes for GpsCoordinate has to be in scope before ZombieSighting's
// Writes can use it for the "location" field.
implicit val gpsCoordinateWrites = (
    ( __ \ "long").write[Double] and
    ( __ \ "lat").write[Double] and
    ( __ \ "altitude").write[Double]
)(unlift(GpsCoordinate.unapply))

implicit val zombieSightingWrites = (
    ( __ \ "name").write[String] and
    ( __ \ "timestamp").write[Int] and
    ( __ \ "location").write[GpsCoordinate]
)(unlift(ZombieSighting.unapply))
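Pulled together into a complete, compilable sketch (the case class shapes are my guesses from the JSON paths; the real definitions may differ):

```scala
import play.api.libs.json._
import play.api.libs.functional.syntax._

// Shapes inferred from the combinator paths above.
case class GpsCoordinate(long: Double, lat: Double, altitude: Double)
case class ZombieSighting(name: String, timestamp: Int, location: GpsCoordinate)

object ZombieJson {
  // GpsCoordinate's Writes comes first so it is in scope for the
  // "location" field of ZombieSighting's Writes.
  implicit val gpsCoordinateWrites: Writes[GpsCoordinate] = (
      ( __ \ "long").write[Double] and
      ( __ \ "lat").write[Double] and
      ( __ \ "altitude").write[Double]
  )(unlift(GpsCoordinate.unapply))

  implicit val zombieSightingWrites: Writes[ZombieSighting] = (
      ( __ \ "name").write[String] and
      ( __ \ "timestamp").write[Int] and
      ( __ \ "location").write[GpsCoordinate]
  )(unlift(ZombieSighting.unapply))
}
```

With those implicits imported, `Json.toJson(someSighting)` produces the nested JSON structure directly.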

It doesn’t look like all that much code to maintain, but let’s say my application deals in about 20 different case classes that can be sent to or received from Ajax/web service calls. In the middle of development, making changes to all of this is going to be annoying, and while doing it I couldn’t shake the feeling that this could be cleaner, more elegant, more kotan.

The first thing I did was wrap all my implicit reads and writes up into a single Scala object, so I can just do an import JsonReadsWrites._ and all of my JSON conversion code lives in one place. That felt a little better, but I still thought it could be easier. The sample above is also overly simplistic: my real case classes are full of values of type Option[T], and dealing with those manually in the apply/unapply combinators you normally write for Play makes maintenance even more tedious.
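For the Option[T] fields, play-json's writeNullable/readNullable combinators are what you end up writing by hand; a sketch with a hypothetical Survivor case class (not one of my real classes):

```scala
import play.api.libs.json._
import play.api.libs.functional.syntax._

// Hypothetical case class with an optional field.
case class Survivor(name: String, lastSeen: Option[String])

object JsonReadsWrites {
  // writeNullable omits the field entirely when the Option is None.
  implicit val survivorWrites: Writes[Survivor] = (
      ( __ \ "name").write[String] and
      ( __ \ "lastSeen").writeNullable[String]
  )(unlift(Survivor.unapply))

  // readNullable yields None when the field is missing or null.
  implicit val survivorReads: Reads[Survivor] = (
      ( __ \ "name").read[String] and
      ( __ \ "lastSeen").readNullable[String]
  )(Survivor.apply _)
}
```

Multiply that ceremony by a couple of optional fields per class across 20 classes and the tedium adds up quickly.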

Enter Scala 2.10 macros…

As of Scala 2.10, macros are supported as an (experimental) language feature. A macro is essentially a compile-time code generation pass: if you flag a method as a macro, that method is executed at compile time, and its return value is an AST (abstract syntax tree) that the compiler splices in at the call site. What Play Framework provides are macro methods called writes and reads. These are executed at compile time, and they replace the writes and reads calls that you see in the IDE with a syntax tree that constructs your JSON combinators for case class conversion automatically.

To be honest, when I first looked at how this is done, it looked like some form of black magic, as if there was no way it could be possible without some magic fairy dust. I read and re-read the documentation on Scala macros, and after a while it started to sink in. Reflection is available to the code inside your macro, so at compile time your code can introspect the type passed to your macro via generics, and it can use that information to figure out how to construct a JSON reader or writer.
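This isn't Play's actual macro code, but you can get a feel for the kind of introspection involved with a runtime analogue: plain Java reflection recovering a case class's field names. The macro does the compile-time equivalent against the type, which is exactly the information needed to build the path combinators.

```scala
case class GpsCoordinate(long: Double, lat: Double, altitude: Double)

object FieldInspector {
  // For a Scala case class, the declared fields mirror the constructor
  // parameters: precisely what a generated Reads/Writes needs to know.
  def fieldNames(clazz: Class[_]): List[String] =
    clazz.getDeclaredFields.map(_.getName).toList
}
```

Given the names and types of those fields, generating `( __ \ "long").write[Double] and ...` mechanically is a straightforward (if fiddly) tree-construction exercise.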

So now, the code I wrote above can be re-written as:

implicit val gpsCoordinateWrites = Json.writes[GpsCoordinate]
implicit val zombieSightingWrites = Json.writes[ZombieSighting]
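Play also offers a Json.format macro, which as I understand it generates the Reads and Writes in one shot; a sketch (case class shapes assumed from the earlier example):

```scala
import play.api.libs.json._

case class GpsCoordinate(long: Double, lat: Double, altitude: Double)
case class ZombieSighting(name: String, timestamp: Int, location: GpsCoordinate)

object Formats {
  // Json.format expands at compile time into both a Reads and a Writes.
  // The nested type's Format comes first so it is in scope for the outer one.
  implicit val gpsCoordinateFormat: Format[GpsCoordinate] = Json.format[GpsCoordinate]
  implicit val zombieSightingFormat: Format[ZombieSighting] = Json.format[ZombieSighting]
}
```

With a Format in scope, both `Json.toJson(sighting)` and `json.as[ZombieSighting]` work, so a round trip comes for free.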

Now I can make changes to the case classes and the macro-generated code automatically compensates for those changes. It even handles arrays and nested case classes (something Salat doesn’t do for its case class conversion for MongoDB…).

I would be hard pressed to find a better use of Scala macros than this.