
Navigating Information Disclosure Requests With SpectX


In this article, we discuss how to effectively analyze logs for information disclosure requests with SpectX.


In a world of compliance and disclosure requests, the ability to investigate raw log files while shutting out noise is not only a time-saving maneuver in your process; it also reduces the risk of mistakes. Being able to quickly analyze large volumes of log files, whether in the cloud or hidden away in on-prem archives, will significantly impact how your tech team operates.

Using higher education as an example: every year, new students join a university, and for IT teams, this means new logs. It also means new devices on the network. In Europe, this includes Eduroam, a third-party network access point whose logs may not be as easily accessible. On average, a student will bring a mobile phone and a laptop, but in this ever-growing IoT world, students can be expected to bring additional smart devices, such as tablets. This increases each student's footprint in any SIEM solution.

A problem universities often face is that every department is run differently (and often works in silos). Systems running under a lecturer's desk on a machine that was decommissioned four audits ago are a common find.

This can make compliance audits difficult, but the real trouble begins with data disclosure requests from authorities (the Department for Work and Pensions, local councils, HMRC, the Police, and Visas and Immigration). Police disclosure requests in particular come wrapped in red tape and are often time-critical, so by the time a request has cleared the relevant procedures and policies, the IT department is left racing against the clock to produce the data.

Most universities, and even other local government organizations, do not have 24/7 SOC environments where they can dedicate resources to these requests, and SIEM pricing models often mean that certain systems are not included in monitoring. Having the skills, and a tool, to tackle this challenge is key.


With legacy Linux systems, for example, an analyst will spend more time crafting the perfect grep script than working with the actual data, and then more time again formatting that data to make it readable and understandable. For local governments and higher education establishments in the current financial climate, ensuring resources are well utilized is key to growth and sustainability.

This includes staff on the IT team being able to use their time effectively. But logs are often stored across different data sources, and depending on the SIEM, not all of them may be visible. For example, a request universities often get is: "Did student X attend this university during a given period of time?"

A simple check of academic records will confirm that. But then it becomes more complex, for example: "When student X attended your institution, do you have logs confirming the activities they carried out online? We want to see whether they browsed www.badstuff.org/NotARealSite/, as we think they are involved in a malicious phishing campaign that originated from a university IP."

As this request filters down through departments and procedures, the analyst in whose lap it eventually lands will need to accurately combine data, brush out noise, and query it in a logical manner. Alongside the request will often come a data dump of the requested day from various systems. This scenario plays out across IT teams in many institutions.
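In SpectX terms, the browsing question above might translate into a query along the lines of the sketch below. Treat it as a sketch only: the log path, the $pattern, and the username and uri field names are all hypothetical, and the LIKE-style substring match is an assumption to verify against the SpectX documentation.

LIST(['file://Proxy_Logs.csv'])
| parse(pattern:$pattern)
| select(*)
| filter(username = 'studentX' AND uri LIKE '%badstuff.org%')

The point is less the exact syntax than the shape of the work: point the query at the raw dump, parse it once, and express the authority's question as a filter.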

An analyst will receive a dump of logs covering a particular date and will need to query it for a certain date or user. Being able to produce a report in a readable format, without having to brush up on Excel skills, will also help during a time-critical stage.

Log dump

Here's an example of investigating whether a user has been on a system during a certain time period. We've used the free edition of SpectX to combine various system logs and then parse and search through them. 

Simultaneously querying log files stored locally as well as in Amazon S3 is as easy as:

LIST(['file://Sample_Logs.csv', 's3s://spectx-docs/logs/custom/archive/tracking_access-2018.log.pi.gz'])
| parse(pattern:$pattern)
| select(*)
| filter(username = 'vladimir')

What the query result tells us is that the user "vladimir" did access the system at that timestamp from that geolocation.

Query result
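If the request also pins down a time window, the same query can in principle be narrowed with an extra filter condition. In the sketch below, the timestamp field name and the T('...') time-literal constructor are assumptions, not confirmed SpectX syntax; check the documentation for the field name your pattern actually produces and the product's real time-literal form.

LIST(['file://Sample_Logs.csv', 's3s://spectx-docs/logs/custom/archive/tracking_access-2018.log.pi.gz'])
| parse(pattern:$pattern)
| select(*)
| filter(username = 'vladimir' AND timestamp >= T('2018-09-01 00:00:00') AND timestamp < T('2018-10-01 00:00:00'))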

To get a better understanding of query input commands, check the documentation section of SpectX.
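Coming back to the reporting point from earlier: producing something readable for the requesting authority is largely a matter of selecting only the fields they asked for, rather than everything the pattern parses. A sketch, assuming the parsed pattern exposes timestamp, username, and geolocation fields (the names here are hypothetical):

LIST(['file://Sample_Logs.csv'])
| parse(pattern:$pattern)
| select(timestamp, username, geolocation)
| filter(username = 'vladimir')

A narrow select like this doubles as noise reduction: whatever lands in the report is exactly what the disclosure request covered, nothing more.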

Conclusion

Ease of use, especially when it comes to dealing with logs, makes the process less tedious for analysts (with SpectX, it even becomes fun; check out the post on Analysing Git Logs). Being able to experiment with different queries, views, and commands on raw logs while working on a request will not only increase the accuracy of investigations but also add a new dynamic to handling these requests. For more information about SpectX, see the documentation.

