Peer-to-Peer Content Distribution and Why It Matters Today
The transition from on-premises to the cloud has been a major topic of discussion over the past several months, as enterprises accelerate plans to transition to the cloud. The business case for cloud services is clear: greater flexibility and reduced costs. These are significant advantages in the current climate, in which every dollar counts, and remote workforces are being tasked with evolving and innovating faster than ever before. But there have been some significant technical sticking points.
Notably, until recently there was no scalable, reliable, and cost-effective way to distribute content to endpoints from the cloud, which was a deal-breaker for many organizations. Thanks to some very recent developments, that is no longer the case. To grasp why these advancements are so noteworthy and what they mean for the future of network architecture, we need to step back and examine how content distribution works and the role peer-to-peer technology plays.
Why Efficient Content Distribution Is So Important
To be clear, what I mean by content is all of the patches, applications, and other updates that need to reach each of an organization's endpoints: Office 365 updates, third-party applications, internally developed applications, and even the operating systems themselves. All of this "stuff" is essential to maintaining network security and system performance.
With cyberattacks on the rise, any system or application vulnerability is ripe for exploitation. Unfortunately, as the workforce has shifted to remote, scheduled maintenance and regular content distribution are no longer happening at the rate they should. This leaves organizations in a precarious position. Bad actors are well aware that system hygiene has been lax, and they are more than willing to take advantage of the tiniest unpatched hole.
Delays in updates, patches, and applications happen because content distribution has traditionally taken a long time to deploy and manage. It is often performed during off-hours to avoid impeding network performance, because it eats up bandwidth, and that is under normal circumstances. With a remote workforce connecting via VPN, the task gets exponentially harder: content distribution takes longer and dramatically reduces network performance, if updates go through at all, which can be difficult to ascertain. As a result, content distribution often gets scuttled in favor of completing day-to-day work.
A Better Model for Content Distribution
Until the past few years, there was no viable option to distribute content to endpoints at scale in a timely and efficient manner. Then organizations turned to peer-to-peer computing models.
Peer-to-peer networks have come a long way since their early days, and they have tremendous value for modern enterprises - namely the ability to perform specific tasks remarkably quickly and at an enormous scale. When leveraged for content distribution, peer-to-peer technology takes unused, available compute and storage capacity that exists on endpoint devices, and it stores and shares content without the need for local server infrastructure. It also minimizes traffic over the wide-area network (WAN) and rapidly distributes content to the endpoints.
Modern peer-to-peer content distribution systems are very smart. They identify the best source from which content can be fetched via a single download, which then serves all of the devices in the same subnet. Because of the intelligence built into the most advanced systems, they can recognize traffic running across the network and change course in advance to avoid congestion and conflicts with other enterprise traffic. Additionally, they can maintain a connection in the most adverse conditions long enough to complete the download, regardless of network quality or how long the transfer takes. These factors make content distribution via peer-to-peer far more reliable than other methods.
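The "single download per subnet" idea can be sketched in a few lines. The function below is a hypothetical illustration, not any vendor's actual API: every peer applies the same deterministic rule (here, the lexicographically smallest peer ID per subnet wins), so all peers agree on which one machine fetches the content over the WAN, and the rest pull it from the local network.

```python
def elect_downloader(peers):
    """Deterministically pick one peer per subnet to fetch over the WAN.

    `peers` maps peer_id -> subnet. Every peer running the same rule
    reaches the same answer with no coordination traffic at all.
    """
    by_subnet = {}
    for peer_id, subnet in peers.items():
        by_subnet.setdefault(subnet, []).append(peer_id)
    # One WAN download per subnet: the smallest peer ID wins the election.
    return {subnet: min(ids) for subnet, ids in by_subnet.items()}

# Illustrative inventory: two subnets, three endpoints.
peers = {
    "pc-101": "10.1.0.0/24",
    "pc-102": "10.1.0.0/24",
    "pc-201": "10.2.0.0/24",
}
downloaders = elect_downloader(peers)
```

Real products use far richer signals (link quality, disk space, uptime) to pick the downloader, but the principle is the same: agree locally, download once.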
These systems can also manage the storage of content locally and oversee the content cache in a manner that doesn't impact the end-user experience. All of this goes on seamlessly in the background, in real time, across hundreds or thousands of locations, but it doesn't happen haphazardly. Administrators generally have complete visibility and control, viewing content as it is being distributed by location or content type. If an administrator notices a problem or the system indicates an issue, he or she can pause, resume, or reprioritize the distribution flow.
Peer-to-Peer Content Distribution in Action
So, what does this look like? Consider a multinational corporation with thousands of employees. IT needs to get new or updated software to all of these endpoints, which are now in numerous remote locations. With a single download, all endpoint devices can receive a copy of the content without going over the WAN to the main content server or having to purchase, install, and maintain local servers in the locations that need to serve up content. This is important because while LAN-based bandwidth is "free," bandwidth over the internet or from the content delivery network (CDN) to the WAN is typically metered.
When you're talking about highly distributed, global networks, the cost to distribute sizable content, such as an OS update, can be significant. With a large number of homogeneous endpoints, it is ideal to download an update across the metered link only once and then have it distributed across the network. Top peer-to-peer systems have this capability.
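To see why one metered download matters, here is a rough back-of-the-envelope calculation. The update size, endpoint count, and per-gigabyte price below are illustrative assumptions, not figures from any vendor:

```python
def metered_cost(update_size_gb, endpoints, cost_per_gb, p2p=False):
    """Cost of pushing one update across a metered WAN/CDN link.

    With p2p=True only one copy crosses the metered link for the site,
    and the remaining endpoints share it over the free LAN.
    """
    copies = 1 if p2p else endpoints
    return update_size_gb * copies * cost_per_gb

# Assumed numbers: a 4 GB OS update, 500 endpoints at one site,
# $0.08 per GB of metered egress.
naive = metered_cost(4, 500, 0.08)             # every endpoint pulls its own copy
smart = metered_cost(4, 500, 0.08, p2p=True)   # one copy, then LAN sharing
```

Under these assumptions the naive approach costs $160 for a single site and a single update; the peer-to-peer approach costs about $0.32. Multiplied across hundreds of sites and a year of patch cycles, the gap becomes the "significant" cost the paragraph above describes.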
Let's drill down a bit further. Suppose you're an international retailer. Traditionally speaking, you would have had to deploy and maintain thousands of servers all over the world for the sole purpose of staging content for local distribution and providing PXE services for OS deployment. A server would be needed in every store and every distribution center. This is a massive expense and waste of resources. Couple it with the fact that some locations' network connectivity might be poor, making it difficult to maintain a connection long enough to download content reliably, and this very common setup seems outdated at best.
Now, consider what this scenario looks like with an intelligent peer-to-peer solution at work. All of those servers would be gone, and your software would be distributed and machines imaged across the globe in record time, without reliability concerns or performance issues. The difference between approaches is staggering. Using peer-to-peer, your organization would save millions of dollars each year by eliminating hardware and network costs. You'd regain all of the time employees currently devote to managing servers, which could be redirected to other priorities, all while ensuring that endpoints stay properly configured and up to date for a more secure, highly optimized network.
Additionally, with the right peer-to-peer content distribution solution, bandwidth is no longer an issue, and users have content immediately available at each location for new installs or updates because once the content is downloaded, it is always available to serve to any device that requires it. This can be a difference of hours, days, or sometimes even weeks saved in getting a new piece of software delivered and installed for the user.
If the peer-to-peer solution is properly architected and sufficiently intelligent, you eliminate the single point of failure as well by having the content available from multiple sources in the local subnet, whereas the loss of a single distribution server in a traditional client-server model can bring all software distribution to a halt. An intelligent peer-to-peer solution will always make sure that content storage is sufficiently distributed locally so that loss or removal of any endpoint or multiple endpoints will not preclude content from being available to other users.
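The redundancy guarantee described above boils down to a simple invariant: enough distinct peers hold each piece of content that losing any given number of them still leaves a copy available. A minimal sketch of that check (the function names are hypothetical, chosen for illustration):

```python
def survives_loss(holders, max_losses):
    """True if the content stays available after losing up to
    `max_losses` of the peers currently holding a copy."""
    return len(set(holders)) > max_losses

def replicas_needed(holders, max_losses):
    """How many additional peers must cache the content to
    satisfy the invariant."""
    return max(0, max_losses + 1 - len(set(holders)))

# Two copies survive one simultaneous failure, but not two.
ok = survives_loss(["pc-101", "pc-102"], max_losses=1)
shortfall = replicas_needed(["pc-101", "pc-102"], max_losses=2)
```

An intelligent solution would run a check like this continuously and re-replicate content whenever laptops leave the subnet, which is exactly the behavior the paragraph above describes.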
Modern Peer-to-Peer Security
This is a strong value proposition, but inevitably one of the first questions about peer-to-peer relates to security, which should be of paramount importance in any content delivery system that touches the complete landscape of your network. As peer-to-peer solutions have matured, they have become incredibly secure. In fact, they are used by many of the world's largest financial institutions, government agencies, and healthcare providers.
They just need to incorporate precautions, of which there are many, to secure the distribution process. Role-based security and access controls are widely used, but many offerings incorporate more advanced mechanisms as well. For instance, a secure hash might be attached to content so that the integrity of distributed content can be validated. In this case, if there is any mismatch between what was sent and what was received, the file is discarded. This way, administrators can ensure that even if a source file has somehow been compromised by malware or a malicious actor, the compromised client has no ability to inject malicious content into the peer-to-peer system. Any modification should be automatically detected and rejected by the receiving client. And this is just one example of how peer-to-peer distribution can prevent content from becoming corrupted or harming the network.
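The integrity check just described is, at its core, ordinary cryptographic hashing. A minimal sketch using Python's standard library (the surrounding protocol details, such as how the expected digest is signed and published, vary by product):

```python
import hashlib

def verify_and_accept(payload: bytes, expected_sha256: str) -> bool:
    """Accept a downloaded chunk only if its SHA-256 digest matches
    the digest published with the content; otherwise discard it."""
    return hashlib.sha256(payload).hexdigest() == expected_sha256

# The publisher computes the digest once, from the trusted source file.
original = b"patch-v2.bin contents"
digest = hashlib.sha256(original).hexdigest()

accepted = verify_and_accept(original, digest)       # untampered copy
rejected = verify_and_accept(b"tampered!", digest)   # modified in transit
```

Because any single-bit change produces a completely different digest, a compromised peer cannot substitute its own payload without the mismatch being detected on receipt.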
Taking It to the Cloud
Now that we've talked through how peer-to-peer content distribution solutions work and why they are beneficial in traditional environments, let's think about the cloud.
Put simply, full migration to the cloud is not possible without a viable peer-to-peer content distribution system. Imagine 30,000 employees all trying to download a piece of content at once. The time and the disruption to processes required to accomplish this would be staggering, not to mention that it would be difficult to ascertain whether every endpoint completed its download. This scenario is prohibitive, and it's a large reason why hybrid environments exist today: organizations want to keep their efficient peer-to-peer distribution systems running on their highly reliable on-premises servers.
But very recently, the game changed. Intelligent cloud-based peer-to-peer content distribution solutions are coming to market. A single download is finally possible in a cloud environment, and bandwidth issues are removed altogether. Next-generation content distribution systems can handle hybrid or full cloud environments, which gives enterprises the green light to move forward with their digital transformation as they see fit.
Shifting to a peer-to-peer content distribution solution is a smart move in any infrastructure environment, but it also makes digital transformation feasible by eliminating network strain and delivering content securely at massive speed and scale. Now enterprises are free to reap all of the benefits of the cloud so that they can operate safely and efficiently.
Published at DZone with permission of Doug Kennedy. See the original article here.