Cybersecurity Architecture: All About Sensors
A cybersecurity architecture built with big data, along with definitions of the pieces of the puzzle.
Welcome back to my blogging adventure. In my Cybersecurity Architecture series, we’ve spent some time discussing the value of an analytic approach to the incident response process. In the last article, Conceptual Cybersecurity Architecture for Analytic Response, we started to drill into the solution space with a high-level architecture to drive our discussion. Let’s spend a moment reviewing that diagram so we understand the context behind today’s discussion: sensor networks.
Conceptual Cybersecurity Architecture
Looks simple, right? A cybersecurity sensor collects data and sends it to the analytic grid for processing. Yes, but the details matter, so let’s start with the basics and move forward.
Before we start: I often encounter confusion when I talk with folks about conceptual cybersecurity architecture. A conceptual architecture is a way to visualize the roles and responsibilities each component plays within the overall whole. A typical technical implementation consolidates multiple conceptual components into a single piece of software.
So Why Do a Conceptual Cybersecurity Architecture?
- We decompose complex problems into simple parts.
- We determine what is required of each part to be successful.
- We then apply architectural principles such as “loose coupling” and “separation of concerns” to the technical design.
What’s a Sensor?
The role of the sensor in our conceptual model is to detect things. To get to the next step, requirements, we need to determine:
- Where to detect
- What to detect
- How to detect
- How to forward
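These four questions map naturally onto a minimal sensor interface. The sketch below (Python; all class and field names are hypothetical, not from the article) keeps the “detect” concern loosely coupled from the “forward” concern, in the spirit of the architectural principles above:

```python
from dataclasses import dataclass, field
import time


@dataclass
class Observation:
    """A single raw activity record captured by a sensor."""
    source: str    # where it was detected (host, interface, process)
    kind: str      # what was detected (netflow, file access, ...)
    payload: dict  # raw, unfiltered detail
    captured_at: float = field(default_factory=time.time)


class Sensor:
    """Passive sensor: captures activity and hands it to a forwarder.

    The forwarder is injected, so the same sensor works whether the
    transport is a local spool file or a remote data bus.
    """
    def __init__(self, source, forwarder):
        self.source = source
        self.forwarder = forwarder  # decoupled transport concern

    def observe(self, kind, payload):
        obs = Observation(self.source, kind, payload)
        self.forwarder(obs)  # no rules, no filtering: just forward
        return obs
```

Because the sensor applies no signatures or filtering, the same simple component can feed every downstream analytic need.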
Where to Detect: Where the Data Exists
Let’s face reality: a perimeter-based approach to cybersecurity, in which we actually believe we have a digital perimeter, is invalid. The idea that a digital perimeter ever existed was, at best, an illusion once our networks first connected to the Internet. Every host that talks to the Internet, regardless of where it physically resides, is an Internet gateway into your network.
So if we can’t focus on watching the digital perimeter, where should we monitor? We need to monitor wherever our data resides or wherever we’ve automated our business processes – the places where we can suffer a material loss. This means the meta-trends impacting IT also impact how we implement our security sensors. Let’s spend some time looking at those trends and how they influence our design requirements.
What to Detect: Activity
Security analytics, at its core, is the process of detecting the unknown and making it known. This invalidates the entire event/signature-based concept that our log-based paradigm has been built on. We need raw, unfiltered visibility into what is actually happening. The good news is that this doesn’t require expensive intelligence to implement; simple passive sensors that just collect data and forward it to an analytic grid let us move from multiple agents to a single, simple sensor that can be leveraged for all our needs.
Now, does this mean we don’t want to collect all the existing event-based data feeds from our existing toolchain? Of course not; however, we want to enrich the raw activity data with the event feed instead of believing that the event feed provides complete visibility.
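The enrichment idea can be sketched in a few lines. In this hypothetical example (the record shapes and the `host` join key are my assumptions, not the article’s), raw activity stays primary and tool-generated events only add context:

```python
def enrich(raw_records, event_feed):
    """Attach matching tool-generated events to raw activity records.

    raw_records: list of dicts, each with a 'host' key (raw activity)
    event_feed:  dict mapping host -> list of alerts from existing tools

    Raw records are never dropped: a record with no matching events is
    kept with an empty 'related_events' list, preserving full visibility.
    """
    enriched = []
    for record in raw_records:
        alerts = event_feed.get(record["host"], [])
        enriched.append({**record, "related_events": alerts})
    return enriched
```

Note the asymmetry: the event feed can only annotate, never gate, what the analytic grid sees.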
How to Detect: Evidence Collection
Repeat after me: we aren’t monitoring; we are collecting evidence. Every bit of data may be reviewed by a jury of our peers. The first few seconds of a compromise may make or break our ability to recover losses through the courts. We need to collect the data – with full chain of custody – before well-meaning IT folks, attempting to restore services or troubleshoot issues, destroy the evidence.
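One common building block for tamper-evident evidence collection is a hash chain: each record commits to the previous record’s hash, so any later modification breaks the chain. The sketch below is only an illustration of the idea (a real chain of custody also involves people, process, and legal requirements, none of which this code addresses):

```python
import hashlib
import json
import time


def make_chained_record(payload, prev_hash):
    """Create an append-only evidence record linked to its predecessor."""
    body = {
        "payload": payload,
        "collected_at": time.time(),
        "prev_hash": prev_hash,
    }
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return {**body, "hash": digest}


def verify_chain(records, genesis_hash="0" * 64):
    """Re-derive every hash and check that the back-links are intact."""
    prev = genesis_hash
    for rec in records:
        body = {k: rec[k] for k in ("payload", "collected_at", "prev_hash")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if rec["prev_hash"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True
```

Verification fails the moment any payload, timestamp, or link is altered, which is exactly the property we want when the data may end up in front of a jury.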
How to Forward: A Secure Data Bus
Since our sensors must reside wherever our data and business processes are, we need to move that data across potentially untrusted networks. That calls for a robust, secure data bus with the following properties:
- Micro-batching: Network connections are never fully reliable, so the ability to store data rather than drop it is a core requirement.
- Priority based forwarding: Not all activity is equal; sending the fact that the fire alarm just went off is more important than updating the last three hours of temperature data.
- Secure transfer: Since the application may run anywhere, including cloud platforms, moving the data securely across untrusted networks is critical to maintaining its confidentiality and integrity.
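The first two requirements can be combined in a small store-and-forward buffer. This sketch (Python standard library only; the priority scheme and class name are my assumptions for illustration) holds records locally while the link is down and releases the most urgent ones first when a micro-batch is drained:

```python
import heapq
import itertools


class StoreAndForwardQueue:
    """Buffers records locally and releases them in priority order.

    Nothing is dropped while the link is down; when it comes back,
    urgent records (lower priority number) leave in the first batch.
    """
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-break per priority

    def put(self, record, priority):
        # priority 0 = fire alarm, priority 9 = bulk telemetry
        heapq.heappush(self._heap, (priority, next(self._counter), record))

    def drain_batch(self, max_size):
        """Emit one micro-batch, most urgent records first."""
        batch = []
        while self._heap and len(batch) < max_size:
            _, _, record = heapq.heappop(self._heap)
            batch.append(record)
        return batch
```

Secure transfer is deliberately out of scope here; in practice the drained batch would travel over an encrypted, mutually authenticated channel such as TLS.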
TLDR (Too Long, Didn’t Read)
Okay, that’s a lot of text; let’s summarize. A sensor senses – it’s really that simple. No rules, no signatures, just passive collection of raw activity in an evidentiarily sound manner. This data is protected via store-and-forward because the network may be unstable and untrusted. We need to realize that installing a massive appliance at the perimeter of our network to monitor traffic no longer works, because our applications are cloud-enabled and mobile. Our controls have to follow the data as a distributed mesh of sensors communicating across a secure data bus.
Fortunately, Apache NiFi exists to help us create this data bus – the topic of my next article. And as Apache NiFi’s MiNiFi matures, we will be able to embed this core protective data collection right into our sensors and applications. Join me next time as we dive deep into the conceptual design of our security data bus.
Published at DZone with permission of Michael Schiebel, DZone MVB. See the original article here.