From Storage to Security: How to Keep Your Edge Sites Operating Smoothly
This article reviews the benefits of edge computing for data storage and processing compared with a cloud-only configuration or a corporate data center.
More and more companies are turning to edge computing for their data storage and processing needs, as its benefits clearly outweigh those of a cloud-only configuration or a corporate data center. Not only does edge computing allow for remote management, which is more critical than ever as more people work from home, but it also requires minimal space, is significantly less expensive, and, most importantly, still allows applications to run on-site to support local operations.
Running applications on-site at the edge is a critical component of many business operations, especially for supermarkets and convenience stores, manufacturers, and IoT device developers. The most important and obvious benefit of running applications on-site through edge computing is performance. An application can access and process data from on-site compute and storage much more quickly than it can from the cloud. A point-of-sale system, for example, may have a slight delay when processing a payment, but if all the purchase and payment information had to travel to the cloud and back, that delay would be much longer. Additionally, the cost of transmission to the cloud or data center is significantly higher, and the security risks are greater. Data traveling to and from the cloud is more susceptible to breaches, but if an edge computing site is properly secured with encryption and key management, the risk of data loss is much lower than with constant transmission.
While running applications locally is important to businesses and offers many advantages, it can still be difficult to manage all of the data from the edge, since companies are typically running hundreds, if not thousands, of edge sites. Let’s take a look at how that data can be managed at the edge now and what’s in store for the future of “edge management.”
Where Edge Management Is Now
Today, the most common way to manage applications on an edge server is through a hypervisor. A hypervisor is software that allows a physical server to share its storage, processing, and networking resources among multiple applications, eliminating the need for hundreds of servers to operate one portion of a business. Deploying a hypervisor on a server has been mainstream practice for nearly 20 years, and the technology’s origins go back some 50 years to IBM’s mainframes.
Since hypervisors have such a long history and were initially developed for large data centers and clouds, where the majority of equipment sits in one central location, the tools that work alongside them were also designed solely for those environments. One of the most popular hypervisors on the market has a management tool that supports a maximum of 1,000 servers. Edge computing environments can easily surpass this limit because organizations can have hundreds or thousands of locations, each with two or more servers. The tool’s limitation then forces IT professionals to run multiple instances, which makes the environment more complicated and significantly more expensive to manage.
Server providers also offer management tools that allow administrators to monitor the health of their servers and ensure everything is working properly. Server management can refer to managing physical server hardware, virtual machines, or, in many cases, application and database servers, but the tools typically focus heavily on the hardware. In edge computing settings, however, the hardware itself matters far less than the applications running on it, so management tools need to shift their focus accordingly.
The Future of Edge Management
IT professionals building large-scale edge deployments are in dire need of a much simpler tool that allows them to deploy at thousands of locations. Because today’s management frameworks weren’t designed for this many sites, it can be very difficult to monitor 5,000 sites on one screen. There is a huge opportunity here for innovation in the software and management spaces.
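One way to make thousands of sites fit on one screen is to roll per-site health up into a summary instead of listing every site individually. The sketch below is purely illustrative (the site names and status values are invented, not from any real management product), but it shows the aggregation idea in a few lines of Python.

```python
# Illustrative sketch: collapse per-site health into counts so that
# 5,000 edge sites can be summarized on a single screen.
from collections import Counter

def summarize(site_statuses: dict) -> Counter:
    """Roll up a {site_name: status} mapping into status counts."""
    return Counter(site_statuses.values())

# Hypothetical fleet: 5,000 stores, a handful reporting problems.
sites = {f"store-{i:04d}": ("degraded" if i % 500 == 0 else "healthy")
         for i in range(5000)}

print(summarize(sites))  # → Counter({'healthy': 4990, 'degraded': 10})
```

An operator would then drill down only into the sites counted as degraded, rather than paging through thousands of healthy ones.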
One innovation that is beginning to garner attention from data storage professionals and businesses alike is containers. Containers use far fewer resources than hypervisors and package an application together with its libraries and dependencies into a single unit. In fact, they eliminate the need for a hypervisor altogether, lowering operational costs and even improving performance, since there is no extra layer (i.e., the hypervisor) slowing down operations.
DevOps teams are already jumping on the container bandwagon because it simply makes their job easier. With fewer moving pieces to configure and maintain, developing and testing applications becomes easier than ever before. Edge computing, however, is only now being introduced to containers, as many end users aren’t completely familiar with them yet and still aren’t asking for them extensively. As soon as they begin to realize the benefits of introducing containers into their large-scale operations, we will likely see the market boom.
The storage industry has been focused on delivering highly available, high-performance storage for virtualized environments for the last 20 years, and as the industry moves toward containers, particularly at the edge, storage technology will have to keep up. Today’s storage area networks (SANs), both virtual and physical, and hyper-converged infrastructure (HCI) solutions weren’t developed with containers in mind, but that is starting to change.
What About Security?
Data management is a critical component of an edge deployment, but data security is just as, if not more, important to ensure operations run smoothly at all times. Edge computing environments minimize data distribution time, but data is spread across a wider range of endpoints, which can create security gaps if not properly managed. It is well known at this point that encryption is the best way to secure a company’s data, but one component of encryption that is not often talked about is key management. A key management system (KMS) assists with the generation, storage, exchange, and use of encryption keys, ensuring a proper “lock and key” security operation.
In the past, key management has been carried out on large, expensive hardware security modules (HSMs), but there are new ways for IT teams to deploy it that are low-cost, easy to use, and no longer require a dedicated hardware appliance. A centralized key manager that can support encryption for thousands of sites is critical for edge deployments.
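To make the key-management idea concrete, here is a minimal sketch using the Fernet recipe from Python’s widely used `cryptography` package. The `SiteKeyManager` class is a hypothetical name for illustration, not a real product API; it shows two core KMS duties mentioned above — generating keys and managing their use — including rotating to a new key while still being able to decrypt data protected by an older one.

```python
# Minimal key-management sketch using the "cryptography" package's Fernet
# recipe. SiteKeyManager is an illustrative name, not a real KMS product.
from cryptography.fernet import Fernet, MultiFernet

class SiteKeyManager:
    """Keeps the active key plus retired keys so data encrypted
    before a rotation can still be decrypted afterward."""

    def __init__(self):
        self._keys = [Fernet(Fernet.generate_key())]

    def rotate(self):
        # New writes use the newest key; older keys remain for decryption.
        self._keys.insert(0, Fernet(Fernet.generate_key()))

    def encrypt(self, data: bytes) -> bytes:
        # MultiFernet encrypts with the first (newest) key.
        return MultiFernet(self._keys).encrypt(data)

    def decrypt(self, token: bytes) -> bytes:
        # MultiFernet tries every known key until one succeeds.
        return MultiFernet(self._keys).decrypt(token)

kms = SiteKeyManager()
token = kms.encrypt(b"transaction: $12.50")
kms.rotate()  # rotate keys after the data was written
assert kms.decrypt(token) == b"transaction: $12.50"
```

A centralized key manager for edge deployments does essentially this at fleet scale: each site encrypts locally, while key generation, rotation, and escrow are coordinated from one place.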
Moral of the Story
When it comes to company data at the edge, there are plenty of considerations to keep in mind to ensure it is all managed and secured properly. Well-managed, well-secured data lets the business run as smoothly as possible, with little opportunity for roadblocks. Today’s frameworks for data management may be evolving as containers and more advanced encryption technologies emerge, but there is no doubt that edge computing offers more flexibility and simpler operation than cloud computing or traditional data centers. How are you managing data at the edge? Leave a comment below!
Opinions expressed by DZone contributors are their own.