Cybersecurity Architecture: All About Sensors

A cybersecurity architecture built with big data, along with definitions of the pieces of the puzzle.


Welcome back to my blogging adventure. In my Cybersecurity Architecture series, we’ve spent some time discussing the value of an analytic approach to the incident response process. In the last article, Conceptual Cybersecurity Architecture for analytic response, we started to drill into the solution space by giving a high-level architecture to drive our discussion. Let’s spend a moment reviewing that diagram so we understand the context behind today’s discussion: sensor networks.

Conceptual Cybersecurity Architecture

Looks simple, right? A cybersecurity sensor collects data and sends it to the analytic grid for processing. Yes, but the details matter, so let’s start with the basics and move forward.

Before we start: I often run into confusion when I talk with folks about conceptual cybersecurity architecture. A conceptual architecture is a way to visualize the roles and responsibilities each component plays in the overall whole. A typical technical implementation consolidates multiple components into a single piece of software.

So Why Do a Conceptual Cybersecurity Architecture?

  • We decompose complex problems into simple parts.
  • We determine what is required of each part to be successful.
  • We then apply architectural principles such as “loose coupling” and “separation of concerns” to the technical design.

What’s a Sensor?

The role of the sensor in our conceptual model is to detect things. To get to the next step, requirements, we need to determine:

  • Where to detect
  • What to detect
  • How to detect
  • How to forward
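
These four concerns can be sketched as a minimal sensor interface. This is purely an illustrative Python sketch of the conceptual model, not any product’s API; every name in it is hypothetical.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
import time


@dataclass
class Observation:
    """A single piece of raw, unfiltered activity captured by a sensor."""
    source: str       # where it was detected (host, VM, container)
    kind: str         # what kind of activity (network, process, user)
    payload: bytes    # the raw evidence, untouched
    captured_at: float = field(default_factory=time.time)


class Sensor(ABC):
    """The four sensor concerns: where, what, how to detect, how to forward."""

    @abstractmethod
    def where(self) -> str:
        """Identify the location this sensor watches."""

    @abstractmethod
    def detect(self) -> list[Observation]:
        """Passively collect raw activity: no rules, no signatures."""

    @abstractmethod
    def forward(self, observations: list[Observation]) -> None:
        """Hand observations off toward the analytic grid."""


class HostSensor(Sensor):
    """A toy concrete sensor, purely for illustration."""

    def where(self) -> str:
        return "host-1"

    def detect(self) -> list[Observation]:
        return [Observation("host-1", "process", b"exec /bin/ls")]

    def forward(self, observations: list[Observation]) -> None:
        self.forwarded = observations  # a real sensor hands off to the data bus
```

The point of the abstraction is that the sensor itself stays dumb: all intelligence lives in the analytic grid behind the `forward` call.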

Where to Detect: Where the Data Exists

Let’s face reality: a perimeter-based approach to cybersecurity, where we actually believe we have a digital perimeter, is invalid. The idea that a digital perimeter ever existed was, at best, an illusion once our networks first connected to the Internet. Every host that talks to the Internet, regardless of where it physically resides, is an Internet gateway into your network.

So if we can’t focus on watching the digital perimeter, where should we monitor? We need to monitor wherever our data resides or wherever we’ve automated our business processes: the places where we can suffer a material loss. This means the meta-trends impacting IT also impact how we implement our security sensors. Let’s spend some time looking at those trends and how they influence our design requirements.

  • Zero Trust/Software-Defined Networks: We’ve learned our lesson that the weakest link in our security will be targeted as the point of exploitation, and that having a large, flat network means the skunkworks marketing site with no security will be used to compromise the financial application sitting next to it. We have a choice: either hold every application to the highest rigor or isolate applications from one another. This technology allows every application to run in its own isolated network. It shifts the design requirement from a network-tap-centric monitoring design to an endpoint-based monitoring design, ensuring our monitoring doesn’t defeat the isolation we’ve strived to achieve.
  • Virtualization/Containerization: The convergence of the physical to the virtual has created holes in our monitoring, as a virtual host can talk to another virtual host without ever touching the physical network. Containerization just adds additional layers of blindness to the issue. This again drives the design requirement toward an endpoint-based approach that can reside within each VM host or container, and it also drives our sensor software to not require elevated/privileged access rights.
  • Service Orchestration/DevOps/Elastic Compute: Not only are our applications becoming virtual; they are now ephemeral and mobile. Instead of long-running applications that are patched and maintained for months, we are moving to a build-from-scratch-on-application-start model, with average lifetimes measured in hours. This drives a design where our sensor needs to follow the application wherever and whenever it resides, and be able to automatically reconfigure itself on demand.
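
One way to meet that last requirement is to have the sensor derive its configuration from the environment the orchestrator injects at launch, and re-read it on demand. A minimal sketch, assuming hypothetical environment variable names (`APP_ID`, `DATA_BUS_ENDPOINT`, `SENSOR_FEEDS`) that your own orchestration would define:

```python
import json
import os


class SensorConfig:
    """Configuration a sensor derives from its environment at start-up,
    so it can follow an ephemeral workload wherever it is scheduled."""

    def __init__(self, env=None):
        env = os.environ if env is None else env
        # Hypothetical variables injected by the orchestrator at launch.
        self.app_id = env.get("APP_ID", "unknown")
        self.bus_endpoint = env.get("DATA_BUS_ENDPOINT", "bus.internal:9090")
        self.feeds = json.loads(env.get("SENSOR_FEEDS", '["process", "network"]'))

    def reload(self, env=None):
        """Re-read the environment on demand (e.g. on SIGHUP or reschedule)."""
        self.__init__(env)
        return self
```

Because the sensor carries no baked-in identity, the same image can run beside any application instance, however short-lived.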
What to Detect: Activity

Security analytics, at its core, is the process of detecting the unknown and making it known. This invalidates the entire event/signature-based concept our log-based paradigm has been built on. We need raw, unfiltered visibility into what is actually happening. The good news is that this doesn’t require expensive intelligence to implement; simple passive sensors that just collect data and forward it to an analytic grid will take us from multiple agents to a single, simple sensor that can be leveraged for all our needs.

  • Non-event-based raw activity: We need a passive feed of activity as it happens: network, system process and storage, application state, and user activity.

Now, does this mean we don’t want to collect all the existing event-based data feeds from our existing toolchain? Of course not; however, we want to enrich the raw activity data with the event feed rather than believing the event feed provides complete visibility.
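
That enrichment can be as simple as joining tool events onto the raw activity stream. A hedged sketch, assuming a hypothetical record schema where both feeds carry a `host` field:

```python
def enrich(raw_activity, event_feed):
    """Attach any matching toolchain events to raw activity records.

    The raw activity stream stays the source of truth; events from the
    existing toolchain (IDS alerts, syslog, etc.) only add context.
    """
    # Index events by the host they refer to (hypothetical schema).
    events_by_host = {}
    for event in event_feed:
        events_by_host.setdefault(event["host"], []).append(event)

    for record in raw_activity:
        record["events"] = events_by_host.get(record["host"], [])
        yield record


raw = [{"host": "web-1", "activity": "outbound tcp 443"}]
events = [{"host": "web-1", "alert": "IDS alert: suspicious beacon"}]
enriched = list(enrich(raw, events))
```

Note the direction of the join: every raw record survives, whether or not any tool happened to fire an event for it.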

How to Detect: Evidence Collection

Repeat after me: we aren’t monitoring; we are collecting evidence. Every bit of data may be reviewed by a jury of our peers. The first few seconds of a compromise may make or break our ability to recover losses through the courts. We need to collect the data, with full chain of custody, before our well-meaning IT folks destroy the evidence while attempting to restore services or troubleshoot issues.

  • Chain of custody: Since any data may need to be used in a court of law, collecting the data in a manner that meets the requirements of evidence protection is key.

How to Forward: The Data Bus

Since our sensors need to reside wherever our data and business processes are, and we need to get that data across potentially untrusted networks, we need a robust and secure data bus to move it.

  • Micro-batching: Network connections are always unreliable, and the ability to store rather than drop data is a core requirement.
  • Priority-based forwarding: Not all activity is equal; the fact that the fire alarm just went off is more important than the last three hours of temperature data.
  • Secure transfer: Since the application may run anywhere, including cloud platforms, moving the data securely across untrusted networks is critical to maintaining its confidentiality and integrity.
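
The first two requirements combine naturally into a priority-ordered, store-and-forward buffer. A minimal sketch (the `send` callable stands in for whatever secure transport the bus provides; all names are hypothetical):

```python
import heapq
import itertools


class ForwardingBuffer:
    """Store-and-forward buffer: holds micro-batches while the network is
    down and drains the highest-priority data first when it returns."""

    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tie-breaker preserves arrival order

    def store(self, priority, batch):
        # Lower number = more urgent (0 = fire alarm, 9 = temperature history).
        heapq.heappush(self._heap, (priority, next(self._seq), batch))

    def drain(self, send):
        """Forward everything via `send`, most urgent batches first."""
        while self._heap:
            priority, _, batch = heapq.heappop(self._heap)
            send(priority, batch)
```

In practice, `send` would write over a mutually authenticated TLS channel (the third requirement), and a batch that fails to send would be stored again rather than dropped.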

TL;DR (Too Long, Didn’t Read)

Okay, that’s a lot of text; let’s summarize. A sensor senses; it’s really that simple. No rules, no signatures, just passive collection of the raw activity in an evidentiarily sound manner. This data is protected in a store-and-forward manner because the network may be unstable and untrusted. We need to realize that installing a massive appliance at the perimeter of our network to monitor traffic no longer works, because our applications are cloud-enabled and mobile. Our controls have to follow the data as a distributed mesh of sensors communicating across a secure data bus.

Fortunately, Apache NiFi exists to help us create this data bus, which is the topic of my next article. And as Apache NiFi’s MiNiFi matures, we will be able to embed this core protective data collection right in our sensors and applications. Join me next time as we dive deep into the conceptual design of our security data bus.


    Published at DZone with permission of
