Enabling DataOps with Easy Log Analytics
DataOps is becoming an important consideration for organizations. Why? DataOps is about making sure data is collected, analyzed, and available across the company – in other words, Ops insight for your decision-making systems like HubSpot, Tableau, Salesforce, and more. Such systems are key to day-to-day operations and in many cases are as important as keeping your customer-facing systems up and running.
If you think about it, today every online business is a data-driven business! Everyone is accountable for having up-to-the-minute answers on what is happening across their systems. You can’t do this reliably without having DataOps in place.
We have seen this trend across our own customer base at Logentries, where more and more customers are using log data to implement DataOps across their organizations. Using log data for DataOps allows you to:
1. Troubleshoot the systems managing your data by identifying errors and correlating data sources
2. Get notified when one of these systems is experiencing issues, via real-time alerts or anomaly detection
3. Analyze how these systems are used across the organization
Logentries has always been great at 1 and 2 above, and this week we have enhanced Logentries so you can perform easier and more powerful analytics with our new easy-to-use SQL-like query language – Logentries QL (LEQL).
LEQL is designed to make analyzing your log data dead simple.
Too many log management tools are built around complex query languages and require a data scientist to operate.
Logentries is all about making log data accessible to anyone. With LEQL, you can use analytical functions like CountUnique, Min, Max, GroupBy, and Sort. A number of our users have already been testing these out via our beta program. One great example is how Pluralsight has been using Logentries to manage and understand the usage of their Tableau environment. For example:
- Calculating the rate of errors over the past 24 hours, e.g. using the LEQL Count function
- Understanding user usage patterns, e.g. using GroupBy to see which queries are performed, grouped by user
- Sorting the data to find the most popular queries and how long they are taking, as sketched below
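To give a feel for the syntax, here is a rough sketch of what queries like these might look like in LEQL. This is illustrative only: field names such as user and query_time are hypothetical, and the exact syntax and fields available will depend on how your Tableau logs are structured.

where(error) calculate(count)
where(tableau) groupby(user) calculate(count)
groupby(query) calculate(average:query_time) sort(desc)

The first counts error events over the selected time range, the second breaks query activity down per user, and the third ranks queries by average duration.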
Being able to answer these types of questions enables DataOps teams to understand where they need to invest time going forward. For example, do I need to add capacity to improve query performance? Are internal teams having a good user experience or are they getting a lot of errors when they try to access data?
At Logentries we are all about making the power of log data accessible to everyone, and as we do this we are constantly seeing cool new use cases for logs. If you have some cool use cases of your own, do let us know!