This post was originally written by Mat Mathews at the Plexxi blog.
As SDN gains traction within the private sector, we are also seeing federal agencies adopt it as they identify the need for network infrastructure changes. While it may be instinctual to say ‘yes’ to upgrading the network to include every new feature offered, especially when managing large amounts of traffic, focusing on simplicity and intuitiveness is often the better option. We revisit this in more depth below – let us know if you agree with our points. Enjoy!
In this week’s PlexxiTube of the week, Dan Bachman explains how Plexxi’s big data fabric solution is managed in comparison to more traditional tiered architectures.
In an article for Information Week, Elena Malykhina cites network complexity as a challenge amongst federal agencies. The biggest reason that complexity grows unchecked in situations like this is that people typically add to their IT infrastructure far more often than they subtract from it. When something doesn’t work quite right, you add a workaround rather than digging into the actual problem and identifying the best solution. Alternatively, when you’re in need of a new capability, it’s instinctive just to add a new feature. This incremental growth leads to IT sprawl. We need to be removing things as frequently as we add them (and arguably more frequently, since we have accumulated architectural debt).
With additional complexity, interoperability ends up suffering. For example, if you deploy 600 features and two solutions each support 599 of them, the one feature they disagree on can prevent seamless interoperability. Part of the hope of SDN is that it levels the architectural playing field. It removes the reliance on these features. SDN, however, does require a rethinking of architecture and procurement practices.
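To make the point concrete, here is a minimal sketch (the feature names are made up) showing how a single mismatch out of hundreds of supported features can block a deployment, because interoperability hinges on the intersection of feature sets, not their size:

```python
# Two vendors each support 600 features, overlapping on all but one.
vendor_a = {f"feature_{i}" for i in range(600)}                    # features 0-599
vendor_b = {f"feature_{i}" for i in range(599)} | {"feature_600"}  # missing feature_599

required = {"feature_599"}  # the deployment happens to depend on this one feature

common = vendor_a & vendor_b
print(len(common))          # 599 features in common
print(required <= common)   # False: the single gap blocks interoperability
```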
So, what’s a good first step? Consider starting from scratch rather than just adding SDN as a new section to an already-overloaded document.
In an article this week for SearchNetworking, Jessica Scarpati reported that nearly half of existing network devices are aging or obsolete. Jessica’s point isn’t terribly surprising. In my opinion, failures happen when things change, not necessarily just because of age. When people find a configuration or architecture that works, they don’t touch it. Interestingly, this is why the most stable time of year is Christmas – when employees go home and don’t touch anything.
Some of the refresh cycle delays that Jessica comments on are tied at least in part to SDN, just not in the way that people think. In my opinion, these decisions get delayed when there are more options. This is why fast food restaurants typically go with more limited menus; people move faster when they have fewer choices. SDN brought a lot of new players into the space, which actually increases choice. This will necessarily lengthen the evaluation cycle, even if people are only considering one or two additional players beyond their incumbent.
In an opinion piece for Network World, Pete Bartolik referenced SDN as a vehicle to generate openness in the datacenter. While I agree that SDN can generate openness, I think there are two other major things that SDN provides: Improved intelligence through central control and automated workflows.
The former is actually only interesting in cases where the underlying shuffling of packets is different than today. Having more intelligence but using the same basic forwarding constructs does not result in significant change. Part of the appeal of fabrics (and why Brocade sponsored this post no doubt) is that you can do intelligent things within the fabric to better deal with how traffic is shunted. Legacy networks are built around protocols that are more than 50 years old (literally!). The future cannot be built entirely of the current set of building blocks.
Secondly, the question for users will be what to automate. Too many people view automation as scripts and keystroke removal. Automation really ought to be about smoothing out the boundaries between systems. The biggest gains will not happen if it is confined to just the network. Automation needs to reach out into compute and storage and applications in the fullness of time.
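A small sketch of what boundary-smoothing automation looks like in practice. The function names below are illustrative stand-ins, not a real orchestration API; the point is that one workflow crosses the compute, network, and storage boundaries instead of scripting keystrokes inside a single silo:

```python
def provision_vm(name: str) -> dict:
    """Pretend compute API: returns the new VM's network attachment point."""
    return {"name": name, "switch_port": "eth1/7"}

def configure_port(port: str, vlan: int) -> str:
    """Pretend network API: puts the VM's port on the right segment."""
    return f"{port} -> vlan {vlan}"

def attach_storage(name: str, volume: str) -> str:
    """Pretend storage API: maps a volume to the workload."""
    return f"{volume} attached to {name}"

def deploy_workload(name: str, vlan: int, volume: str) -> list:
    # The gains come from the hand-offs between systems, not any single step:
    # the compute system's output (the switch port) feeds the network system's input.
    vm = provision_vm(name)
    return [configure_port(vm["switch_port"], vlan),
            attach_storage(vm["name"], volume)]

print(deploy_workload("app01", 42, "vol-data"))
```

Automating only the middle step would remove some keystrokes; automating the hand-offs is what removes the waiting between teams.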
John Moore notes that government agencies are experimenting with SDN and converged infrastructure systems in a recent article for GCN. The software part of software-defined anything is interesting, but less so than the capabilities that come as a result. It is these capabilities that are driving interest among federal agencies.
Specifically, software-defined networking provides superior intelligence. To offer a metaphor, imagine that you are driving across a crowded metro area. You see brake lights and you roll to a stop. You might try a surface road, but the reality is that you don’t actually know if it will be any faster. You face a decision with unknown variables, and maybe you play the odds. Now imagine that your best friend is in a helicopter and can tell you where to go. That’s what an SDN controller can do. It provides a global view of the network, and that’s what these agencies are looking for. That intelligence will be the difference-maker.
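The helicopter metaphor can be sketched in a few lines. The topology and delay numbers below are invented for illustration, but the mechanism is the real one: with global link-state knowledge, a controller can run a shortest-path computation over the whole network, something no single hop can do by looking only at its local queue:

```python
import heapq

# Hypothetical topology: link -> current delay. The "highway" (A-B-D) is
# congested ahead; the "surface road" (A-C-D) happens to be clear today.
topology = {
    ("A", "B"): 10, ("B", "D"): 50,
    ("A", "C"): 15, ("C", "D"): 5,
}

def shortest_path(graph, src, dst):
    """Dijkstra over the full topology -- only possible with a global view."""
    adj = {}
    for (u, v), w in graph.items():
        adj.setdefault(u, []).append((v, w))
    heap, seen = [(0, src, [src])], set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in adj.get(node, []):
            heapq.heappush(heap, (cost + w, nxt, path + [nxt]))
    return None

print(shortest_path(topology, "A", "D"))  # (20, ['A', 'C', 'D'])
```

A driver (or a traditional hop-by-hop protocol) seeing only the first link would pick the 10-delay highway and end up with a 60-delay trip; the global view picks the surface road at total delay 20.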