This post from Chris Stucchio's blog takes a critical look at the use of Hadoop and Big Data as buzzwords by asking an interesting question: what if your data isn't as big as you think it is? He offers a very concise summary of Hadoop's purpose (a straightforward MapReduce system) and its limitation (it is nothing but a straightforward MapReduce system). Most interesting, though, is the list of alternatives he offers for readers whose data, he argues, isn't really big. Solutions are covered for datasets of a variety of sizes:
- Hundreds of megabytes
- Ten-ish gigabytes
- A couple of terabytes
- Five terabytes and larger
The five-terabytes-and-up range is where Hadoop becomes the best choice. So, if you suspect you've been wasting your energy using Hadoop on datasets that are actually "kind of large," or "a little significant," or just generally not big enough to justify it, check out Chris Stucchio's full post for some new ideas.
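As a rough illustration of the smaller end of that list (the hundreds-of-megabytes to ten-ish-gigabytes range), a single machine streaming a file in chunks often does the job without any cluster at all. This is only a hedged sketch of that general idea, not code from the post; the file path, column name, and chunk size here are made up:

```python
# Sketch: in the "ten-ish gigabytes" regime, a single machine can
# stream a CSV in fixed-size chunks instead of running a MapReduce job.
# (Illustrative only: path, column name, and chunksize are hypothetical.)
import pandas as pd


def chunked_sum(path: str, column: str, chunksize: int = 100_000) -> float:
    """Sum one column of a CSV without loading the whole file into memory."""
    total = 0.0
    # read_csv with chunksize yields DataFrames of at most `chunksize` rows
    for chunk in pd.read_csv(path, chunksize=chunksize):
        total += float(chunk[column].sum())
    return total
```

The same chunk-at-a-time pattern covers many aggregate queries (sums, counts, group-bys accumulated in a dictionary) that people sometimes reach for Hadoop to compute.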