Ever had a discussion that follows this pattern?
BOSS: Do me this obviously ludicrous network-based thing, so data-heavy as to be next to impossible to make work at all.
YOU: Uh … ok. <do next to impossible thing>
BOSS: It’s too slow! And too expensive! Waaaaaah!
Here are a couple of options to be aware of when you’re trying to squeeze every last bit of juice out of your network traffic:
For over a year now, Twitter has supported the SPDY protocol and today it accounts for a significant percentage of our web traffic. SPDY aims to improve upon a number of HTTP’s shortcomings and one client segment in particular that has a lot of potential to benefit is mobile devices. Cellular networks still suffer from high latency, so reducing client-server roundtrips can have a pronounced impact on a user’s experience…
One of our primary goals with our SPDY implementation was to make integration with an existing application as easy, transparent, and seamless as possible. With that in mind, we created two integration mechanisms—one for NSURLConnection and one for NSURLSession—each of which could begin handling an application’s HTTP calls with as little as a one-line change to the code…
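Twitter later open-sourced this work as CocoaSPDY, and the “one-line change” presumably refers to its protocol registration hooks. As a hedged sketch only — class and method names here are taken from the CocoaSPDY README rather than this post, so verify against whatever release you actually pull in — the two integration mechanisms look roughly like this:

```objc
#import <SPDY/SPDYProtocol.h>

// NSURLConnection integration: register an origin, and any
// NSURLConnection request to that origin is carried over SPDY.
[SPDYURLConnectionProtocol registerOrigin:@"https://api.example.com"];

// NSURLSession integration: install the SPDY protocol class on a
// session configuration, then use that session as you normally would.
NSURLSessionConfiguration *configuration =
    [NSURLSessionConfiguration defaultSessionConfiguration];
configuration.protocolClasses = @[ [SPDYURLSessionProtocol class] ];
NSURLSession *session =
    [NSURLSession sessionWithConfiguration:configuration];
```

The nice part of the NSURLProtocol-based approach is that your existing request-building and response-handling code doesn’t change at all; SPDY is slotted in underneath the loading system.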
We’re still actively experimenting with and tuning our SPDY implementation in order to improve the user’s experience in our app as much as possible. However, we have measured as much as a 30% decrease in latency in the wild for API requests carried over SPDY relative to those carried over HTTP.
In particular, we’ve observed SPDY helping more as a user’s network conditions get worse…
If you’re slow because you’ve got a whack of back-and-forth traffic with a SPDY-enabled data source, this could be a pretty big win — as noted above, especially given the absolutely horrible latencies seen on cell networks, which we developers tend to overlook since we’re always developing over Wi-Fi.
More likely, though, your big wins are going to come from optimizing the data representation itself: tightening up your JSON, switching to binary plists, and so forth. Here’s an interesting-looking new option to consider for that step:
FastCoder is a high-performance binary serialization format for Cocoa objects and object graphs. It is intended as a replacement for NSPropertyList, NSJSONSerializer, NSKeyedArchiver/Unarchiver and Core Data.
The design goals of the FastCoder library are to be fast, flexible and secure.
FastCoder is already faster (on average) for reading than any of the built-in serialization mechanisms in Cocoa, and is faster for writing than any mechanism except for JSON (which doesn’t support arbitrary object types). File size is smaller than NSKeyedArchiver, and comparable to the other methods.
FastCoder supports more data types than either JSON or Plist coding (including NSURL, NSValue, NSSet and NSOrderedSet), and allows all supported object types to be used as the keys in a dictionary, not just strings.
FastCoder can also serialize your custom classes automatically using property inspection. For cases where this doesn’t work automatically, you can easily implement your own serialization using the FastCoding Protocol…
Looks like a pretty nice set of advantages for your local serialization needs, and it would likely be applicable to network transmissions as well; the format is simple and chunk-based, so it should be easy to create/parse as applicable in your network service development environment of choice.
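For a sense of how little ceremony is involved, here’s a minimal sketch of round-tripping an object graph — the two class methods shown are the ones the FastCoder README documents, but double-check against the header in the version you grab:

```objc
#import "FastCoder.h"

NSDictionary *response = @{ @"user"   : @"anonymouse",
                            @"visits" : @42,
                            @"tags"   : [NSSet setWithObjects:@"a", @"b", nil] };

// Serialize the object graph to a compact binary blob —
// note the NSSet, which plain JSON coding couldn't carry.
NSData *payload = [FastCoder dataWithRootObject:response];

// ... ship it over the wire or write it to disk, then
// reconstruct the graph on the other end.
NSDictionary *decoded = [FastCoder objectWithData:payload];
```

Swap those two calls in where you’re currently using NSJSONSerialization or NSKeyedArchiver and you’ve got a ready-made A/B size and speed comparison for your own payloads.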
As always, if you’ve got any more unconventional or obscure tricks you use to speed up and/or cut down the size of your network traffic, let us know!