Why do so many organizations archive gigs and gigs of log data to Amazon’s S3?
Over the past few years I have spoken to hundreds (if not thousands… who’s counting anyhow?) of our new users as they have onboarded with Logentries. One of the first things that strikes me is that, even if they do not have a sophisticated log management solution in place, so many organizations simply dump their logs to S3.
As roll-your-own logging solutions go, archiving to S3 is by far the most popular. In fact, it is usually the first step an organization takes toward implementing a log management solution. Why do they do this, you ask? Here are some good reasons:
- It’s dead cheap! And it just got cheaper: up to 22% cheaper, and about 10% cheaper for average users, according to Copper.io. At less than $0.10 per GB per month, who can’t afford to archive away GBs of log data for safekeeping? And it’s only going to get cheaper. The question I ask is: can you afford not to?
- Your data is pretty safe. S3 stores your data redundantly across multiple facilities and is designed for 99.999999999% durability and 99.99% availability. AWS also recently obtained a SOC 2 compliance assertion, so you can be pretty sure it maintains robust controls for security and data protection.
- PCI DSS or HIPAA compliance requirements can mandate that you keep log data for a given period so you can prove that the sensitive data you handle has not been tampered with.
- You may need to maintain records in case a legal challenge to your service rears its ugly head months or years down the road. An example I came across recently was a media company that needed to keep log data because the local police would often require evidence when following up on “hate spam” (i.e. where it came from, etc.).
- Sometimes you need to hang on to that data for the long term, just in case! A little like the old adage “nobody ever got fired for buying IBM,” and I’m quoting one of our customers here: “nobody ever got fired for logging too much.”
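For the curious, the “dump your logs to S3” approach usually amounts to compressing each log file and writing it under a date-partitioned key so it is easy to find (and expire) later. Here is a minimal sketch in Python; the key scheme, bucket name, and hostname are my own illustrative choices, not a standard:

```python
import gzip
from datetime import datetime, timezone

def archive_key(prefix: str, hostname: str, ts: datetime) -> str:
    """Build a date-partitioned S3 object key for a log file,
    e.g. logs/2024/01/15/web-1-20240115T120000Z.log.gz
    (this naming scheme is just one common convention)."""
    return f"{prefix}/{ts:%Y/%m/%d}/{hostname}-{ts:%Y%m%dT%H%M%SZ}.log.gz"

def compress_logs(log_bytes: bytes) -> bytes:
    """Gzip the raw log data before upload to cut storage costs further."""
    return gzip.compress(log_bytes)

# The actual upload would use an S3 client such as boto3 (assumed
# installed and configured with credentials):
#
#   import boto3
#   s3 = boto3.client("s3")
#   s3.put_object(Bucket="my-log-archive",      # hypothetical bucket
#                 Key=archive_key("logs", "web-1",
#                                 datetime.now(timezone.utc)),
#                 Body=compress_logs(raw_logs))
```

Partitioning keys by date also makes it trivial to attach an S3 lifecycle rule that expires or transitions old partitions once your retention period is up.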