Delivering a Competitive IIoT Edge
This in-depth primer for Industrial Internet setups will walk you through what's possible and how to go about gathering and using data in an industrial environment.
Modern machines tend to be smart. You’ve got a piece of industrial equipment, possibly with an OS, a sensor bus, and drivers talking to I/O systems that enable local monitoring and operation. Now you want to monitor, control, and update that equipment from the cloud. This can be accomplished, but only after you’ve deployed data management components to what many people, somewhat ironically, call “the edge.” In reality, this is the center of your universe and of your customers’ operations: your machines, PLCs, and gateways. Before selecting a specific approach or vendor technology, you’ll want to set clear criteria for your goals at the edge and have a solid understanding of your constraints. Many enterprises aren’t aware of all the possibilities and pitfalls at the IoT edge, so let’s start with a quick list and next steps for collecting data and generating new value from your industrial equipment, both with and without connectivity to the cloud.
The Art of the Possible
To bring “intelligence to the IoT edge” is to enable more autonomous equipment operation. Latency and bandwidth constraints (and costs) can be removed as limiting factors when everything isn’t managed by the cloud. Sadly, many enterprises that succeed in freeing their operations from the requirement of cloud connectivity then shackle themselves to a hardware provider or contract manufacturer at the edge. This is your business, and no HMI vendor, gateway vendor, or contract manufacturer should be able to dictate what software or embedded operating system you use for serving your customers, nor lock you into their offerings whether or not they continue to meet your needs.
For new equipment, you may have your pick of which industrial protocols to use. You’ve likely already got machines in the field that you want to connect as well, and if they’re already running on Modbus, EtherNet/IP, OPC UA, or other protocols, then those are what you’ll need to translate into a normalized data format, such as JSON, before doing anything with the data being produced. How are you going to do this? Similarly, you’ll need to build the pipeline between your equipment at the IoT edge and the public cloud you choose for ingest, storage, analytics, and other integrations. This not only enables secure and seamless processing of your data, but also makes possible the device management needed to maintain and update your connected machinery. There’s a lot going on in the “real world” that must be handled before you can make use of services in the cloud.
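The translation step can be sketched as a small normalization function. This is a minimal illustration, not a production driver: the register map, field names, and scale factors below are hypothetical, standing in for whatever your real Modbus (or other protocol) read returns.

```python
import json
import time

# Hypothetical register map for one machine: holding-register offsets,
# human-readable names, and scale factors are illustrative only.
REGISTER_MAP = {
    0: ("spindle_rpm", 1.0),
    1: ("coolant_temp_c", 0.1),   # raw value arrives in tenths of a degree
    2: ("vibration_mm_s", 0.01),  # raw value arrives in hundredths of mm/s
}

def normalize_registers(raw, device_id):
    """Translate raw 16-bit register values into a normalized JSON record."""
    record = {"device_id": device_id, "ts": time.time()}
    for offset, (name, scale) in REGISTER_MAP.items():
        record[name] = raw[offset] * scale
    return json.dumps(record)

# Raw holding-register values as they might arrive from a protocol read.
payload = normalize_registers([1500, 315, 42], "press-07")
```

Once every protocol speaks the same JSON shape, everything downstream (local rules, buffering, cloud ingest) can be written once instead of once per protocol.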
Suiting Up for Success at the IoT Edge
Many teams are eager to jump straight to adding intelligence at the edge and managing machine data. The ability to execute rules, enforce security policies, and run logic for local operations, as well as the capacity to normalize, clean, buffer, and filter data, are all critical to successful IoT edge deployments. But there are system-level prerequisites that must be addressed before starting this work.
One of the issues you’ll quickly find is that your current equipment software and drivers aren’t compatible with the latest cloud connectivity clients. You’ve got library incompatibilities between the two, and your drivers can’t talk to their cloud clients. There’s a wall at the edge between your machines and services from Microsoft Azure, AWS, and Google Cloud Platform. Fortunately, container technologies like Docker provide a way to sidestep the incompatibilities and enable end-to-end remote monitoring and control, machine learning, and cloud enterprise integration. Proper use of containers and other best practices can help your team bring intelligence to the edge, speed up delivery, and increase the value of your products in the market, while leveraging your investment in your existing device drivers and control logic.
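As a sketch of how a container isolates a legacy driver from the rest of the system, consider a Dockerfile that pins the older base image and library versions the driver was built against (the image name, package, and paths below are placeholders, not recommendations):

```dockerfile
# Sketch only: base image, package, and paths are illustrative placeholders.
# Pin the older userland your existing drivers were built against, so the
# cloud client in its own container can use current libraries independently.
FROM debian:buster
RUN apt-get update && apt-get install -y --no-install-recommends libmodbus5 \
    && rm -rf /var/lib/apt/lists/*
COPY drivers/ /opt/drivers/
CMD ["/opt/drivers/io-service"]
```

The cloud client container can then track the latest SDK releases without ever sharing a dependency tree with the driver.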
The Map Is not the Territory
While containers provide a way forward for resolving incompatibilities and enabling complex solutions spanning multiple systems, they’re just that – a way forward, requiring orchestration and execution. How should the containers be organized? What should be included in each? How will you balance the resource needs of each component inside each container when running on resource-constrained edge devices and gateways?
Follow the Leader
As a best practice, we recommend isolating your I/O and any control loops inside an individual container. Use a separate container for normalizing data and providing access to history for local usage. Your cloud client, which communicates with AWS IoT, Azure IoT Hub, Google Cloud IoT Core, or other cloud IoT services, should be in a separate container as well. If you need a local HMI, this too should be in its own container. Why all the isolation? By separating IoT edge components by major functions, your team gains agility and control that is reflected in the speed of delivery and flexibility of the software produced.
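This per-function layout can be expressed as a Compose file. The service and image names here are placeholders for your own builds; the point is the one-function-per-container shape:

```yaml
# Sketch of the four-container layout; image names are placeholders.
services:
  io:             # I/O and control loops, isolated from everything else
    image: acme/io-controller:1.4
    devices:
      - "/dev/ttyUSB0:/dev/ttyUSB0"
  normalizer:     # protocol translation and local history
    image: acme/normalizer:2.1
    depends_on: [io]
  cloud-client:   # AWS IoT / Azure IoT Hub / Google Cloud IoT Core client
    image: acme/cloud-client:3.0
    depends_on: [normalizer]
  hmi:            # optional local operator interface
    image: acme/hmi:1.0
    ports:
      - "8080:80"
```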
Division of Labor/Separation of Concerns
Each container can be owned by a different team, decoupling dependencies at both human and technology levels. Versioning and driver incompatibilities are removed from the equation, and project managers can focus on challenges specific to their domain without impacting and being impacted by others with each change. Much like microservices in the cloud, the containers interact through well-defined interfaces, each with its own contract. These interfaces enable independent evolution and maintenance.
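A contract between containers can be as simple as a validated message schema at the boundary. The field names and types below are assumptions for illustration, not a standard; the point is that each side can evolve freely as long as it honors the contract:

```python
# Illustrative contract between, say, the normalizer and cloud-client
# containers. Field names and required types are assumptions.
REQUIRED_FIELDS = {"device_id": str, "ts": float, "metrics": dict}

def validate_message(msg):
    """Enforce the inter-container contract before accepting a message."""
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in msg:
            raise ValueError(f"missing field: {field}")
        if not isinstance(msg[field], ftype):
            raise TypeError(f"{field} must be {ftype.__name__}")
    return True
```

Validating at the boundary turns a cross-team integration bug into an immediate, attributable error instead of silent data corruption downstream.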
When the Going Gets Tough
Depending on your BOM constraints and available computing power, your system may only support a limited number of containers. Resource management across Docker containers can be difficult. How will you handle sensor history rotation and memory management? What trade-offs should you make when you can only use two containers? If you don’t have a lot of experience in this area, find someone who does early in the project design phase to avoid costly architecture changes down the road.
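One common answer to the history-rotation question is a fixed-capacity ring buffer, so memory use is bounded no matter how long the container runs. A minimal sketch (the capacity is an illustrative number; size it to your BOM):

```python
from collections import deque

class SensorHistory:
    """Fixed-memory sensor history: old samples rotate out automatically,
    so a long-running edge container cannot exhaust RAM. The default
    capacity is illustrative; size it to your device's memory budget."""

    def __init__(self, max_samples=10_000):
        self._buf = deque(maxlen=max_samples)  # deque drops oldest on overflow

    def append(self, ts, value):
        self._buf.append((ts, value))

    def latest(self, n=1):
        """Return up to the n most recent (timestamp, value) samples."""
        return list(self._buf)[-n:]

history = SensorHistory(max_samples=3)
for i in range(5):
    history.append(float(i), i * 10)
```

The trade-off is explicit: capacity is chosen up front against your RAM budget, rather than discovered as an out-of-memory crash in the field.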
Rates of Maturation at the IoT Edge
Containerization also enables each component to move forward over the life of your products at different rates. Your I/O container will iterate at the speed of your internal engineering team, whereas your HTML5 user interfaces can improve at the speed of the web. Meanwhile, your cloud client components can zip ahead at the speed of the cloud providers themselves, enabling your system to take advantage of their latest offerings regardless of how much effort your team puts into the other areas.
Never Cross the Streams
As discussed previously, one of the worst things you can do when building your IoT edge solution is to adopt a vendor’s software products as a condition of using their hardware. An established best practice is to buy your hardware from hardware vendors, your manufacturing services from manufacturing vendors, and your software from software vendors. Most hardware vendors provide software tools along with their components, often at low or no cost, with the simple requirement that you continue to purchase their equipment, now and forever. Some contract manufacturers will provide “free” engineering services to get your product working in order to land the manufacturing contract for your device for a period of time.
The problem with these “free” offerings is that the incentives are misaligned over the lifespan of your product. If the vendor’s hardware products don’t continue to meet your needs or another vendor releases a superior or lower-cost offering, you don’t want to be stuck with an inferior or overpriced component because it’s the only brand your software tools are compatible with. Similarly, even if their hardware line is a competitive market leader, the software tools that come in the package tend to be like the “free” earbuds that come with new music players: they’re unlikely to be good, and they’re not going to last.
It’s not just your team who’ll have to put up with poor-quality software; it shapes how your customers view your entire offering. The same problems (and worse) apply to contract-manufacturer-provided software, as their only goal is to pass acceptance tests to start manufacturing. To create and maintain a competitive edge (pun intended), you need the freedom to select the hardware and manufacturing partners that best serve your connected product goals as they change over time, and the software that delivers the most value to you and your customers.
The Not So Final Word
A last piece of advice for teams developing edge components for their connected product system is to expect new challenges after deployment. At a minimum, you’ll want to keep your web components updated for security and browser compatibility reasons, and these updates may call for more resources than the original code required. Design your system, both software and hardware, for needs at least 18-24 months out. Adding a little more to your initial BOM cost to give your platform some headroom will pay off down the road with the flexibility to solve future problems without forcing updates to your hardware. Certification is hard, and the longer you can prolong the lifespan of your equipment in the field through software updates, the more profitable your business is likely to be.
Getting into Gear
Once you’ve solved the challenges of component isolation, Docker resource management, protocol translation, BOM constraints, device management, and hardware independence at the edge, you’re ready to start generating value and business outcomes from your deployed solutions. Starting with data management, you’ll need to normalize and clean the information coming from your equipment and PLCs over Modbus, J1939, EtherNet/IP, OPC UA, or other industrial protocols. Then you need to store it, apply local rules and logic functions, run AI and ML models locally, and buffer the data for later secure transmission to the cloud. Security is critical at the edge, and the ability to apply and enforce enterprise IT policies out in “the real world” is a must for any production system. Finally, you’ll need to decide which data you want to send to the cloud, and have a secure method for doing so.
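The filter-and-buffer step of that pipeline can be sketched as a store-and-forward queue with a deadband filter, so only meaningful changes are queued and nothing is lost during a connectivity outage. The deadband percentage and queue depth below are illustrative assumptions, not recommendations:

```python
from collections import deque

class StoreAndForward:
    """Buffer readings locally and filter by deadband before queuing for
    the cloud. The 2% default deadband and queue depth are illustrative."""

    def __init__(self, deadband=0.02, max_queued=1000):
        self.deadband = deadband
        self.queue = deque(maxlen=max_queued)  # bounded: oldest drops if offline too long
        self._last = {}

    def ingest(self, name, value):
        last = self._last.get(name)
        # Queue only when the value moved more than the deadband fraction.
        if last is None or abs(value - last) > abs(last) * self.deadband:
            self._last[name] = value
            self.queue.append((name, value))

    def drain(self):
        """Called when connectivity is available; returns and clears the queue."""
        out = list(self.queue)
        self.queue.clear()
        return out

saf = StoreAndForward(deadband=0.05)
saf.ingest("temp", 100.0)   # first reading, always queued
saf.ingest("temp", 101.0)   # within the 5% deadband, filtered out
saf.ingest("temp", 110.0)   # 10% change, queued
```

A real deployment would persist the queue to flash and push `drain()` output over TLS to the cloud client container, but the shape of the logic is the same.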
Depending on your choice of public cloud provider, IoT edge offerings such as AWS Greengrass or Azure IoT Edge can be included in your edge solution to provide elements of the functionality mentioned above. With proper container isolation, your solutions can always take advantage of the latest innovations from Microsoft and AWS without requiring updates to other components.
Published at DZone with permission of Marc Phillips, DZone MVB. See the original article here.